Say, they should be 100% confident that "0.3" follows "0.2 + 0.1 =", but the many floating point examples on the internet make them less confident. Conversely, "0.30000000000000004" picks up more and more confidence as a completion.
Is this what makes them "hallucinate", or did I get it wrong? (In other words, am I hallucinating myself? :) )
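For reference, the competing answer isn't noise in the training data: it's exactly what IEEE 754 double arithmetic produces, which is why it shows up all over the internet. A quick check in any Python interpreter:

    # 0.1 and 0.2 have no exact binary (IEEE 754 double) representation,
    # so the sum lands on the nearest representable double, not 0.3.
    print(0.2 + 0.1)         # 0.30000000000000004
    print(0.2 + 0.1 == 0.3)  # False

    # the exact value actually stored in the double:
    from decimal import Decimal
    print(Decimal(0.2 + 0.1))
    # 0.3000000000000000444089209850062616169452667236328125

So the model is being pulled between "0.3" (the arithmetic answer) and "0.30000000000000004" (the answer most programming contexts on the web actually show).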