Hacker News

Major parts of that aren't true. I think something like 30% of queries have never been seen before, and Google provides low latency on those as well.



Cut him some slack; it was clearly a simplification. And you're wrong to declare him inaccurate.

Even if 30% of queries are truly unique (a figure quoted years ago that may no longer hold), caching the other 70% is a big win. Also, instant search relies heavily on Google Suggest, which by its very nature proposes queries that have already been performed, so they are trivially cacheable.
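To make the arithmetic concrete, here's a minimal sketch (names and the 70/30 split are illustrative, not Google's actual system) of why caching repeated queries pays off: only distinct queries ever reach the expensive backend.

```python
from functools import lru_cache

backend_calls = 0

@lru_cache(maxsize=None)
def search(query: str) -> str:
    global backend_calls
    backend_calls += 1          # stands in for the expensive backend work
    return f"results for {query!r}"

# 10 queries, 7 of which repeat an earlier one (the 70/30 split above):
queries = ["cats", "dogs", "cats", "cats", "dogs", "weather", "cats",
           "dogs", "weather", "quantum chromodynamics"]
for q in queries:
    search(q)

print(backend_calls)  # only the 4 distinct queries hit the backend
```

With a 70% repeat rate the cache absorbs 70% of traffic, which is exactly why suggest-driven queries (already seen by definition) are the easy case.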


I've always used lengthy, quoted search phrases, but what's really interesting is that Google's autocomplete has slowly trained monkey-me to follow the well-worn path and use the search terms it suggests.

It's a great example of using the UI to encourage a desired behavior.


So, let's just turn off caching and triple or quadruple the size of the Google cluster, eh? No big deal. I completely fail to see how you think that disproves my point.


I should probably mention that I work in search quality at Google. The part I took issue with is "Google can take a couple of seconds if you throw a truly novel query at it." which is definitely not true.

Google can't cache nearly as much as you would expect, not because the queries change, but because the results change.
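The point about results changing is the classic cache-invalidation problem: a repeated query can't be served from cache forever, because the cached results go stale. A minimal sketch (a hypothetical TTL cache, not Google's actual mechanism) of bounding that staleness:

```python
import time

class TTLCache:
    """Cache query results, but expire them after a time-to-live,
    since the underlying results may have changed."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self.store = {}  # query -> (timestamp, results)

    def get(self, query):
        entry = self.store.get(query)
        if entry is None:
            return None
        ts, results = entry
        if time.monotonic() - ts > self.ttl:  # presumed stale: drop it
            del self.store[query]
            return None
        return results

    def put(self, query, results):
        self.store[query] = (time.monotonic(), results)

cache = TTLCache(ttl_seconds=0.05)
cache.put("news", ["headline v1"])
assert cache.get("news") == ["headline v1"]   # fresh hit
time.sleep(0.06)
assert cache.get("news") is None              # expired: must re-query
```

The shorter the TTL (i.e., the faster results change), the lower the effective hit rate, which is the GP's point: query repetition alone doesn't make everything cacheable.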



