
The deeper issue is that ChatGPT cannot accurately determine whether it "knows" something or not.

If its training data includes rants by flat-earthers, then it may "know" that the earth is flat (in addition to "knowing" that it is round).

ChatGPT does not have a single, consistent model of the world. It has a body of training data that may be ample in one area, deficient in another, and strongly self-contradictory in a third.
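
To make the point concrete, here is a minimal sketch using the Hugging Face transformers library and the small open GPT-2 model as a stand-in for ChatGPT (whose weights are not public). It probes the probability the model assigns to contradictory continuations of the same prompt; the prompt wording and the "round"/"flat" comparison are illustrative assumptions, not anything the original comment specifies.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def next_token_prob(prompt: str, word: str) -> float:
    """Probability the model assigns to `word` as the next token after `prompt`."""
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0, -1]      # logits for the next token position
    probs = torch.softmax(logits, dim=-1)
    token_id = tokenizer.encode(" " + word)[0]      # leading space matches GPT-2's BPE tokenization
    return probs[token_id].item()

# Mutually exclusive answers both receive nonzero probability; which one dominates
# depends on how the prompt is framed, not on a stored, consistent fact.
prompt = "The shape of the Earth is"
for word in ["round", "flat"]:
    print(word, next_token_prob(prompt, word))
```

The model never checks these continuations against each other for consistency; it only reports which one is statistically more likely given the context, which is the sense in which it has no single world model.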


