I think this is interesting in the context of ChatGPT: a model trained on language, which is itself a limited model of reality.
If the purpose of language is reason, and the purpose of reason is convincing and persuading other people - winning arguments, defending and justifying actions and decisions to others - then what is ChatGPT? Essentially a reason engine? An engine designed to convince people that it knows best, regardless of the truth? Is it actually a tool of control?