When we give information we don't know to be true (which is what the model does when it builds low-confidence tokens on top of each other), we call it guessing. Why would this be any different?
Hallucination, on the other hand, evokes a less reasonable or rational source.
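
To make the compounding point concrete, here's a minimal, purely illustrative sketch in Python (the 0.8 per-token confidence and the ten-token length are assumptions for illustration, not figures from any real model):

    import math

    # Hypothetical per-token confidences for a ten-token continuation.
    # Each token looks individually plausible at 80% confidence...
    token_probs = [0.8] * 10

    # ...but confidence in the statement as a whole is the product of
    # the per-token probabilities, which decays multiplicatively.
    sequence_prob = math.prod(token_probs)
    print(f"per token: 0.80, whole sequence: {sequence_prob:.3f}")  # ~0.107

Even a chain of individually confident guesses ends up as a low-confidence guess overall, which is the sense in which the model "builds low-confidence tokens on each other".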