Hacker News

Let everyone mark sites as high or low quality, and then count other people's rankings in proportion to how well they correlate with your ranking.

(A vague idea I've wanted to implement for a while now, for both search engines and for HN/Reddit-like sites... but the amount of effort to do it well would be really high)
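The core of the idea is a form of collaborative filtering: score a site for *you* by averaging other users' ratings, weighted by how well each user's past ratings correlate with yours. A minimal sketch, with made-up users and ratings (+1 = high quality, -1 = low quality); all names and data here are illustrative:

```python
# Weight each other user's site ratings by the Pearson correlation between
# their past ratings and mine. Toy data; a real system would need far more
# care (sparsity, cold start, gaming resistance).
from statistics import mean

def pearson(a, b):
    """Pearson correlation over the sites both users have rated."""
    shared = sorted(set(a) & set(b))
    if len(shared) < 2:
        return 0.0
    xs = [a[s] for s in shared]
    ys = [b[s] for s in shared]
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    if vx == 0 or vy == 0:
        return 0.0
    return cov / (vx * vy) ** 0.5

def personalised_score(site, me, others):
    """Average others' ratings of `site`, weighted by how well each
    rater correlates with `me`. Anti-correlated raters are ignored."""
    num = den = 0.0
    for ratings in others:
        if site not in ratings:
            continue
        w = max(0.0, pearson(me, ratings))
        num += w * ratings[site]
        den += w
    return num / den if den else None

me = {"a.com": 1, "b.com": -1, "c.com": 1}
like_me = {"a.com": 1, "b.com": -1, "d.com": 1}      # agrees with me
unlike_me = {"a.com": -1, "b.com": 1, "d.com": -1}   # disagrees with me
print(personalised_score("d.com", me, [like_me, unlike_me]))  # → 1.0
```

Here the user who disagrees with me on every shared site contributes nothing, so `d.com` inherits only the like-minded user's positive rating.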




I would guess that, apart from the immense effort of building it, delivering personalised search results like this would be enormously expensive in storage for the search engine. Much more than sorting people into a few cohorts/buckets.

But FFS, it's 2021, we deserve some decent search engine results.

I doubt Google would do it unless they absolutely had to, so I hope you or someone else forces their hand and shows them that it's time.


> I would guess that, apart from the immense effort of building it, delivering personalised search results like this would be enormously expensive in storage for the search engine.

How expensive would it really be?

You have O(the_internet) in pages and metadata, and you have O(world_population) in user preferences. So long as your index structure allows those to be mostly decoupled (if I had to take a first crack at it I'd probably try to embed preferences and pages into a vector space and build a projection index -- exact matches are hard in that system, but decent personalized results are easy), I don't think it'd be all that much more space than a non-personalized search engine, especially given that the world population is kind of small compared to the size of the internet.
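The decoupling above can be sketched concretely: keep one embedding per page (the O(the_internet) side, shared by everyone) and fold each user's votes into a single small preference vector (the O(world_population) side), so per-user storage is a handful of floats rather than anything proportional to the index. A toy version, with random vectors standing in for learned embeddings; everything here is an assumption for illustration:

```python
# Pages and user preferences embedded in one shared vector space.
# Random "embeddings" stand in for learned ones; a real system would
# learn them and use an approximate-nearest-neighbour index instead
# of the brute-force scan below.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # per-user storage is just DIM floats

# O(the_internet) side: one vector per page, stored once for all users.
page_vecs = {f"page{i}": rng.normal(size=DIM) for i in range(1000)}

def user_vector(liked, disliked):
    """Fold a user's up/down votes into a single preference vector."""
    v = np.zeros(DIM)
    for p in liked:
        v += page_vecs[p]
    for p in disliked:
        v -= page_vecs[p]
    n = np.linalg.norm(v)
    return v / n if n else v

def top_k(user_vec, k=5):
    """Personalised results: pages scoring highest against the user's
    preference vector (exact scan here; ANN in practice)."""
    scores = {p: float(v @ user_vec) for p, v in page_vecs.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

u = user_vector(liked=["page1", "page2"], disliked=["page3"])
print(top_k(u))
```

As the comment notes, exact matching is awkward in this setup, but "pages roughly like what this user likes" falls out of a dot product, and the page index never has to change when a user updates their preferences.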

For that matter, the web isn't thaaat big (ignoring images and video). The entire common crawl can fit on a single $3k-$5k disk uncompressed.




