
I don't know if I buy this. It feels like the confidence with which you say something is closely tied to "knowing" it. I'm sure there is more research to do here, but I'm not sure there is a need to "tie" it to some other system. As it stands today, there are definitely things ChatGPT doesn't know and will tell you so. For example, I asked it why Donald Trump spanked his kids, and it said, "I do not have information about the parenting practices of Donald Trump."

That said, there are a lot of things it does get wrong, and it would be nice for it to be better at those. But I do think that, much like humans, there will always be statements it makes that are not true.



