Hacker News

Wait, this can actually have consequences! Think about all the SEO articles about ChatGPT hallucinating… At some point it may start to “think” that, being ChatGPT, it should hallucinate and give nonsensical answers often.



I wouldn’t draw that conclusion yet, but I suppose it is possible.



