How are you dealing with it just manufacturing answers it does not readily have? Or is that pretty much the same as SEO spam and easy to filter out?


GPT-4 is much better at it; so far, I haven't seen it hallucinate. GPT-3 hallucinates badly when it does, though not that often, and it's fairly predictable about which kinds of questions are more likely to make it hallucinate.



