Hacker News

This is the case with plenty of commenters here too, and why I push back so much against the people anthropomorphizing and attributing thought and reasoning to LLMs. Even highly technical people operating in a context where they should know better simply can’t—or won’t, here—keep themselves from doing so.

Ed Zitron is right. Furthermore, I consider that LLMs must be destroyed.



