
Does it lie? Or just get things wrong sometimes?

Lying requires knowledge that what you are saying is not the truth, and usually there's a motive for doing so.

I don't think ChatGPT is there yet... or is it?



Technically, what ChatGPT is doing is bullshitting because it doesn't have any knowledge of or concern for truthfulness.

https://en.m.wikipedia.org/wiki/On_Bullshit


Sure, it's not lying, you're right; there's no will there, and I'm anthropomorphizing. But it is producing entirely wrong facts / pseudo-opinions (as it can't actually have an opinion).


I was about to suggest "pathologically dishonest", but then I looked up the term, and it seems to require both a bias in the speaker's own favour and knowledge that you're saying falsehoods.

"Confabulate" however, appears to be a good description. Confabulation is, I'm told, associated with Alzheimer's, and GPT's output does sometimes remind me of a few things my mum said while she was ill.



