Sure, it's not lying, you're right: there's no will there, I'm anthropomorphizing. It is producing entirely wrong facts / pseudo-opinions (as it can't actually have an opinion).
I was about to suggest "pathologically dishonest", but then I looked up the term and that seems to require being biased in favour of the speaker and knowing that you're saying falsehoods.
"Confabulate" however, appears to be a good description. Confabulation is, I'm told, associated with Alzheimer's, and GPT's output does sometimes remind me of a few things my mum said while she was ill.
Lying requires knowing that what you're saying isn't true, and usually a motive for saying it anyway.
I don't think ChatGPT is there yet... or is it?