
People should doubt everything an LLM outputs, so why use one in the first place when the desired output is objective fact? LLMs hallucinate; that's what they do. When one is wrong, you likely won't notice, and over time your world view will become more and more distorted.



> why use it in the first place if the desired output is objective fact

rewriting facts is like 90% of all writing jobs



