Hacker News

Crawlers ignoring robots.txt is abusive. That they then start scanning all docs for commented URLs just adds to the pile of scummy behaviour.
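For contrast, here is a minimal sketch of what a well-behaved crawler is supposed to do: parse the site's robots.txt and check each URL against it before fetching. The robots.txt content and the `ExampleBot` agent name are hypothetical, and in practice the file would be fetched from the site rather than hard-coded.

```python
from urllib import robotparser

# Hypothetical robots.txt content; a compliant crawler fetches and
# parses this before requesting anything else on the site.
ROBOTS_TXT = """\
User-agent: *
Disallow: /docs/
Allow: /public/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

def may_fetch(url: str, agent: str = "ExampleBot") -> bool:
    """Return True only if robots.txt permits this agent to fetch url."""
    return rp.can_fetch(agent, url)

print(may_fetch("https://example.com/public/page.html"))   # True
print(may_fetch("https://example.com/docs/internal.html"))  # False
```

The abusive pattern the comment describes skips this check entirely, and then goes further by mining pages for URLs the site never even linked publicly.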


Human behavior is interesting - me, me, me…



