People should doubt everything an LLM outputs; ergo, why use one in the first place if the desired output is objective fact? LLMs hallucinate, that's what they do. When it's wrong, you likely won't notice, and over time your worldview is going to become more and more distorted.