
“Because the REP was only a de-facto standard for the past 25 years, different implementers implement parsing of robots.txt slightly differently, leading to confusion. This project aims to fix that by releasing the parser that Google uses.”

The amount of arrogance in this sentence is insane.

Because Google's way is the one true way?




In terms of “what should a robots.txt file look like to be parsed correctly,” yes, because they’re the ones who are going to be doing most of that parsing. Ideally it would be an entirely independent standardization process, but it’s not arrogant of them.
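
As a concrete aside on the quoted claim that implementers parse robots.txt slightly differently: here is a minimal sketch using Python's standard-library urllib.robotparser rather than Google's open-sourced C++ parser. The rules, crawler name, and URLs are made up for illustration. Python's parser resolves overlapping rules by first match, while Google's published REP draft resolves them by the most specific (longest) match, so the two can disagree on the same file.

    # Minimal sketch (not Google's parser): how the same robots.txt rules can be
    # read differently by different parsers. The crawler name, rules, and URLs
    # below are hypothetical; the parser is Python's standard-library one.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.parse([
        "User-agent: *",
        "Disallow: /private/",
        "Allow: /private/public-page.html",
    ])

    # urllib.robotparser applies the first rule whose path is a prefix of the
    # URL path, so the broader Disallow wins and the page is treated as blocked:
    print(parser.can_fetch("MyCrawler", "https://example.com/private/public-page.html"))  # False

    # Google's published REP draft instead picks the most specific (longest)
    # matching rule, so its parser would treat that same URL as allowed.
    print(parser.can_fetch("MyCrawler", "https://example.com/private/secret.html"))  # False
    print(parser.can_fetch("MyCrawler", "https://example.com/index.html"))            # True

That kind of precedence difference is exactly the divergence the announcement says releasing a reference parser is meant to fix.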


Because Google is the only search engine most people care about...



