You're attempting to set goal posts for a logical argument, like we're talking about religion or politics, and you've skipped the part where we mutually agree on definitions. Define what an LLM is, in technical terms, and you will have your answer about why it is not intelligent and not capable of reasoning. It is a statistical language model that predicts the next token of a plausible response, one token at a time. No matter how you dress it up, that's all it can ever do, by definition. The evidence or data that would change my mind is if, instead of talking about LLMs, we were talking about some other technology that does not yet exist, but that is fundamentally different from an LLM.
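The mechanism being described, greedy autoregressive next-token prediction, can be sketched in a few lines. This is a toy illustration, not any real LLM: a hand-written bigram probability table stands in for the neural network, and the vocabulary here is invented.

```python
# Toy sketch of autoregressive, greedy next-token decoding.
# A real LLM computes these probabilities with a neural network over a
# huge vocabulary; here a tiny hand-written bigram table stands in for it.

BIGRAM_PROBS = {
    "<s>":    {"the": 0.6, "a": 0.4},
    "the":    {"cure": 0.6, "cat": 0.4},
    "cure":   {"for": 0.9, "</s>": 0.1},
    "for":    {"cancer": 0.7, "colds": 0.3},
    "cancer": {"</s>": 1.0},
    "cat":    {"</s>": 1.0},
    "a":      {"cat": 1.0},
    "colds":  {"</s>": 1.0},
}

def generate(max_tokens=10):
    tokens = ["<s>"]
    for _ in range(max_tokens):
        dist = BIGRAM_PROBS[tokens[-1]]
        # Greedy decoding: always pick the single most probable next token.
        next_token = max(dist, key=dist.get)
        if next_token == "</s>":
            break
        tokens.append(next_token)
    return " ".join(tokens[1:])

print(generate())  # → "the cure for cancer"
```

The whole debate is about whether this loop, scaled up with a vastly better probability model, can constitute reasoning; the loop itself is this simple either way. (Real systems often sample from the distribution rather than always taking the maximum, but the one-token-at-a-time structure is the same.)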
If we defined "LLM" as "any deep learning model which uses the GPT transformer architecture and is trained using autoregressive next-token prediction", and then we empirically observed that such a model proved the Riemann Hypothesis before any human mathematician, it would seem very silly to say that it was "not intelligent and not capable of reasoning" because of an a priori logical argument. To be clear, I think that probably won't happen! But I think it's ultimately an empirical question, not a logical or philosophical one. (Unless there's some sort of actual mathematical proof that sets upper bounds on the capabilities of such a model, which would be extremely interesting if true, but I haven't seen one.)
Let's talk when we've got LLMs proving the Riemann Hypothesis (or any mathematical hypothesis) without any proofs in the training data. I'm confident in my belief that an LLM can't do that, and will never be able to. LLMs can barely solve elementary school math problems reliably.
If the cure for cancer arrived to us in the form of the most probable token being predicted one at a time, would your view on the matter change in any way?
In other words, do you have proof that this medium of information output is doomed to forever be useless in producing information that adds value to the world?
These are of course rhetorical questions that neither you nor anyone else can answer today, but you seem to have a weirdly absolute position on this matter, as if a lot depended on your sentiment being correct.