I find it a bit surprising that I'm being called an "LLM fanboy" for writing an article with the title "Hallucinations in code are the least dangerous form of LLM mistakes" where the bulk of the article is about how you can't trust LLMs not to make far more serious and hard-to-spot logic errors.