PageRank performs fairly poorly on the web as a whole these days, because the nature of linking has changed since 1998. If you constrain it to the parts of the web that are still basically hypertext documents, it's as good as ever.
That's largely what I've been doing with my search engine.
Have you considered widening the definition of 'hyperlink'? For example, using references in text as well, or even lightweight sentiment analysis. Many people refer to Wikipedia all the time without giving a link, or to Hacker News. That seems like it should represent a "weak link" between those pages. It would probably be worthwhile to use some kind of per-page or per-domain normalization so people can't game the system by spamming mentions.
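Just to make the "weak link" idea concrete, here's a rough sketch -- the mention regex, the per-source cap, and the 0.1 weight are all made-up assumptions, not anything the parent comment specified: scan page text for bare domain mentions and count them as low-weight edges, capping each source's contribution so mention-spam can't dominate.

    import re
    from collections import defaultdict

    # Hypothetical: count unlinked domain mentions in page text as "weak links".
    MENTION = re.compile(r"\b([a-z0-9-]+\.(?:com|org|net))\b")
    PER_SOURCE_CAP = 3       # arbitrary cap so one page can't spam mentions of a target
    WEAK_LINK_WEIGHT = 0.1   # much weaker than a real hyperlink

    def weak_links(source_domain, page_text):
        counts = defaultdict(int)
        for target in MENTION.findall(page_text.lower()):
            if target != source_domain:
                counts[target] += 1
        return {target: WEAK_LINK_WEIGHT * min(count, PER_SOURCE_CAP)
                for target, count in counts.items()}

    # weak_links("example.com", "I read about this on wikipedia.org the other day")
    # -> {"wikipedia.org": 0.1}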
I also think it's worth exploring a simple reputation system. Why not have reputable users evaluate results (approve/disapprove buttons)? I think almost any reputation-free system will eventually fall to bots, or cost a huge amount to win the tug of war against SEO. Reputation mostly solves the issue.
Speaking of reputation, I think the great insight would be to apply a ranking algorithm to reputations themselves -- if you can't trust your users, you're back to the same problem.
To rank reputations, plain PageRank doesn't work because it values all users equally, which is unfortunately not sybil-resistant. One approach is an "invite system", where your reputation is tied to whoever invited you, with earlier users carrying greater weight somehow (the administrator can also manually designate trustworthy users).
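A minimal sketch of what invite-based weighting could look like -- the decay factor and the invite-tree representation are my own assumptions, just to show "earlier users have greater weight":

    # Hypothetical: weight a user's vote by their depth in the invite tree.
    # Roots (invited_by = None) are admin-designated seed users.
    DECAY = 0.5  # arbitrary: each invite generation carries half the weight

    invited_by = {
        "alice": None,      # admin-approved seed
        "bob": "alice",
        "carol": "bob",
    }

    def invite_depth(user):
        depth = 0
        while invited_by[user] is not None:
            user = invited_by[user]
            depth += 1
        return depth

    def vote_weight(user):
        return DECAY ** invite_depth(user)

    # vote_weight("alice") == 1.0, vote_weight("bob") == 0.5, vote_weight("carol") == 0.25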
This also suggests a way to formulate distributed trust. You join a "trust network" by trusting one or more users, and then you import the users they trust as well. (I believe this is the rough idea behind Web of Trust, although as far as I know WoT is not algorithmic -- it should have been!) The problem with this approach for a practical search engine is that you can't aggregate results: you would need to store each user's vote and compute a personal ranking on every query. So I think in practice a useful compromise is to give the user a choice of a few "trust sets". You trust <Public User A>? Join this trust set. You trust <Public User B>? Join this other trust set. (Combining a small number of trust sets should be trivial.)
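To show why combining a few trust sets is trivial, here's a sketch under the assumption (mine, not stated above) that each trust set is just a precomputed page-to-score mapping; combining is then a weighted sum:

    # Hypothetical: each trust set is a precomputed {page: score} mapping.
    trust_set_a = {"example.org/page1": 0.9, "example.org/page2": 0.2}
    trust_set_b = {"example.org/page1": 0.4, "example.org/page3": 0.7}

    def combine(sets_with_weights):
        combined = {}
        for scores, weight in sets_with_weights:
            for page, score in scores.items():
                combined[page] = combined.get(page, 0.0) + weight * score
        return combined

    # A user who trusts A twice as much as B:
    ranking = combine([(trust_set_a, 2.0), (trust_set_b, 1.0)])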
As for an algorithm, something like:
-- By trusting N other users, up to 100*(1 - sqrt(N)/N)% [note] of your trust points get redistributed to them (diminishing the impact of your own choices).
[note] The formula is arbitrary, chosen so that the redistributed share approaches 100% as N grows.
-- The total trust is conserved, so phenomena like cyclic trust are not a problem.
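A minimal sketch of how that conservative propagation could be iterated -- the graph representation and the fixed iteration count are my own assumptions; the redistribution fraction is the 1 - sqrt(N)/N from above:

    import math

    # Hypothetical trust graph: user -> list of users they trust.
    trusts = {
        "admin": ["alice"],
        "alice": ["bob", "carol"],
        "bob":   ["alice"],        # cycles are fine: trust is conserved
        "carol": [],
    }

    trust = {u: 1.0 for u in trusts}   # everyone starts with equal trust

    for _ in range(50):                # fixed iteration count for simplicity
        nxt = {u: 0.0 for u in trusts}
        for user, outgoing in trusts.items():
            n = len(outgoing)
            if n == 0:
                nxt[user] += trust[user]            # nothing to redistribute
                continue
            redistributed = (1 - math.sqrt(n) / n) * trust[user]
            nxt[user] += trust[user] - redistributed
            for other in outgoing:
                nxt[other] += redistributed / n
        trust = nxt

    # sum(trust.values()) stays equal to the number of users at every step,
    # so cyclic trust only moves points around rather than inflating them.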
Does PageRank perform poorly because the nature of linking has changed, or has the nature of linking changed because PageRank is no longer the main arbiter of search results?