The fact that the paper never mentions the word "hallucinations" anywhere in its body text makes me think the authors aren't fully familiar with the state of LLMs as of 2024.


