
>one of the primary reasons for AV's demise -- we didn't update our primary index for several months just as Google [...] I'd say that our failure to maintain a high quality index was directly caused by our loss of focus,

Do you believe Google's strategic decision to use commodity computers and hard drives gave them any competitive advantage (lower cost, easier scaling, etc.) compared to DEC Alpha servers?

As an outsider, it seems like Google could iterate on its data centers faster and more cheaply, and therefore its web crawlers were cheaper to run (and could run more frequently), terabytes of data were cheaper to store, and search queries were cheaper to serve.



Nah. We weren't using that many servers. Today, that absolutely would make a difference. Back in the day we could run a top ten web site on well under 500 servers, and it's not like we were paying list price for Alphas anyhow.


Years ago, I was told by a drunk* ex-Digital engineer at a lisp meetup that one of the big reasons that Alpha died was that y'all were getting yields of something like 6 wafers/chip, vs. Intel's 97% for the Pentium. Given that, those 500 alphas still must have cost a pretty penny to produce.

* I was also not exactly sober at the time, so these numbers may be a bit off. The number of wafers per chip being greater than 1, though, I am absolutely certain about.


Not the OP, but I remember the early 2000s. Just spitballing here, but IMO that made no difference whatsoever from a consumer's standpoint -- though it presumably did from an operational standpoint, given how Google introduced an actual business model to search. The only things that mattered to you as a consumer then were how good the results were and how convenient it was to get them. Google had a clear edge by the end of 2000, as far as I can recollect.


>but IMO that made no difference whatsoever from a consumer's standpoint.

With cheaper techniques, the idea is that a more capital-efficient approach to indexing the ever-expanding web would in turn provide better results and an improved consumer experience. It's the old adage of "do more with less".

For example, see the old Danny Sullivan graphs[0] showing how Google's index was growing faster than AltaVista's. Having a bigger index lets one return more relevant search hits.

AltaVista wasn't just falling behind in the "staleness" of its index; the aggregate size of its index was smaller than Google's as well.

[0] https://searchenginewatch.com/sew/study/2068075/search-engin...


I'm not so sure it applied back then. Before Google, the core issue was getting any good results out of the random garbage the search engines returned. You'd use quotes and plus/minus or AND/OR operators, maybe strip out words like xxx and porn and warez, and hope for the best. Staleness was, frankly, of little concern if you got a few relevant results. That the AV index was stale was news to me until I read this thread, and I'm not sure I'd buy into the idea that it made much difference. Search engine toolbars made getting results more convenient. But the core of the problem then was getting any relevant results to begin with. For that, Google just rocked.


> I'm not so sure it applied back then

It did apply, to a point. Before Google, I had switched to AllTheWeb as my search engine of choice since a lot of sites just wouldn't show up in AltaVista no matter what you searched for, and ATW had a bigger index (I guess staleness could have had the same effect).

But of course eventually I switched to Google for the better search results.


True, but that was after the period I'm thinking of. By 2001, when the layoffs started hitting hard, Google already had a massive advantage.



