Are Google using neural nets as an integral part of search indexing yet?
It's well known that a bunch of metrics go into which results to return — things like PageRank, (probably) historical value (number of clicks when the page appears in results), and social media popularity.
I wouldn't be surprised if Google has experimented with training models to predict most of those metrics given only the content of the page itself, and with using those models as a filter for what to index in the first place. If the NN is accurate enough, it can act as a gate at the indexing stage ("should I index this at all?") rather than at the ranking stage, where real data, rather than model output, answers the question "should I show this page close enough to the top of the results that someone will see it?" A sketch of that idea follows.
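To make that concrete, here's a minimal sketch in Python with scikit-learn of what an index-time filter could look like. Everything here (the training data, features, model, and threshold) is invented for illustration; nothing about it is Google's actual pipeline. The point is just the shape of the idea: learn to predict a metric that's normally only observable after the fact (clicks when shown in results) from page content alone, then gate indexing on the prediction.

```python
# Toy version of "filter at indexing stage": predict a post-hoc quality
# metric (did the page earn clicks when shown?) from content alone, then
# use the prediction as an index-time yes/no gate. All data, features,
# and thresholds below are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training set: raw page text, labeled 1 if the page
# historically attracted clicks when surfaced in results, else 0.
pages = [
    "deep dive into rust borrow checker internals ...",
    "buy cheap pills online best price click here ...",
    "official python documentation for asyncio ...",
    "lorem ipsum keyword keyword keyword stuffing ...",
]
earned_clicks = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(pages, earned_clicks)

INDEX_THRESHOLD = 0.5  # tuned against crawl/index budget vs. recall

def should_index(page_text: str) -> bool:
    """Index-time gate: predict the metric from content alone, since
    no real click data exists yet for an un-indexed page."""
    p_worth_indexing = model.predict_proba([page_text])[0][1]
    return p_worth_indexing >= INDEX_THRESHOLD

print(should_index("a thorough tutorial on writing async rust services"))
```

The appeal of gating at index time rather than ranking time is that the model's mistakes cost you crawl and storage budget rather than result quality: a page that sneaks past the gate still has to earn its ranking from real signals later.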
From bits and pieces they've posted, it sounds like they use some all-encompassing glob of statistical inference that they call RankBrain, which almost certainly includes some deep-learning components. They've said that the old PageRank algorithm is now one input into RankBrain.
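As a hedged sketch of what "PageRank is one input into RankBrain" might mean structurally: a learned model in which the classic PageRank score is just one feature column alongside newer signals. The feature names, labels, and model choice below are made up for the example; RankBrain's real inputs and architecture aren't public.

```python
# Illustrative only: shows the shape of "old PageRank score as one feature
# among many feeding a learned (here, small neural-net) ranker". None of
# these features or labels reflect Google's actual system.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical per-(query, page) feature rows:
# [pagerank, query_term_match, click_history, social_shares]
X = np.array([
    [0.85, 0.9, 0.7, 0.3],
    [0.10, 0.8, 0.1, 0.9],
    [0.60, 0.2, 0.4, 0.1],
])
y = np.array([1.0, 0.6, 0.2])  # hypothetical relevance labels

ranker = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
ranker.fit(X, y)
print(ranker.predict([[0.5, 0.95, 0.8, 0.2]]))  # blended relevance score
```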