
Looks cool, but care to share how you manage to process the entire web, which is around 200 million domains and billions of web pages? Sounds like a herculean task for a startup your size.



Yes, it is challenging to index and process billions of web pages on a regular basis. Our current architecture can be scaled to handle 500+ million web pages. However, to increase crawl frequency and introduce more services, we are building a distributed computing network, called Newton Network (newton.network). We hope this will give us enough processing power to support our ambitions ;)
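(The comment doesn't say how crawl work is actually divided across the Newton Network, so the following is only an illustrative sketch, not their design: one common way a distributed crawler splits work is to shard domains across worker nodes with a stable hash, so each node owns a fixed slice of the frontier. The worker count and function names here are assumptions.)

    # Hypothetical sketch: shard crawl domains across worker nodes by a
    # stable hash so every process agrees on which node owns which domain.
    import hashlib

    NUM_WORKERS = 64  # assumed cluster size, purely illustrative


    def worker_for_domain(domain: str, num_workers: int = NUM_WORKERS) -> int:
        """Map a domain to a worker index with a stable hash.

        A cryptographic hash (rather than Python's built-in hash()) keeps
        the assignment identical across processes and restarts.
        """
        digest = hashlib.sha256(domain.encode("utf-8")).digest()
        return int.from_bytes(digest[:8], "big") % num_workers


    def partition_frontier(domains: list[str]) -> dict[int, list[str]]:
        """Group a crawl frontier by the worker responsible for each domain."""
        shards: dict[int, list[str]] = {}
        for domain in domains:
            shards.setdefault(worker_for_domain(domain), []).append(domain)
        return shards


    if __name__ == "__main__":
        frontier = ["example.com", "news.ycombinator.com", "wikipedia.org"]
        for worker_id, assigned in sorted(partition_frontier(frontier).items()):
            print(f"worker {worker_id:02d} -> {assigned}")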



