This is pseudophilosophical mumbo-jumbo. It does not really address the comment you replied to, because it does not contradict either of the following statements (from which my original point trivially follows):
1. Chomsky claimed syntax can't be modeled statistically.
2. GPT is a nearly perfect statistical model of syntax.
The point is very basic: These "models" don't tell you anything about the human language faculty. They can be useful tools but don't serve science.
Chomsky's point is that there is a lot of evidence that humans don't use a statistical process to produce language, and these statistical "models" don't tell you anything about the human language faculty.
Whether your 1 & 2 are meaningful depends on how you define "model", which is the real issue at hand: do you want to understand something (science), in which case the model should explain something, or do you want a useful tool (engineering), in which case it can essentially be a black box?
I don't know why you care to argue about this, though; my impression is that you don't really care about how humans do language, so why does it matter to you?
Re: meaningfulness. Your scientific-vs-engineering distinction is not how "scientific model" is defined; the term covers both. The existence of the model itself does explain something, namely that statistics can model language. That alone is explanatory power, so the claim that it doesn't explain anything is a lie. Therefore it is both an "engineering" model (because it can predict syntax) and a scientific one (because it demonstrates that a statistical approach to language has predictive power in the scientific sense).
Science is about understanding the natural world; if you want to redefine it to mean something else, fine, but the point still stands: LLMs do not explain anything about the natural world, specifically nothing about the human language faculty. Again, it's clear you do not care about this! Instead you want to spend time arguing to make sure labels you like are applied to things you like.
> the fact that you don't understand how GPT models language does not make it less of a model.
E.g. the fact that the Pythagorean theorem does not explain anything about the natural world to a slug does not make the Pythagorean theorem any less sciency.
Science is not about explanatory power; if it were, the Pythagorean theorem would not be science by the argument above, which is obviously nonsense.
> E.g. the fact that the Pythagorean theorem does not explain anything about the natural world to a slug does not make the Pythagorean theorem any less sciency.
In fact it does! Math is not science! There is a reason it is STEM and not just S.