I mean, have you seen the whole "encode a poem in made-up emoji" followed by "decode the poem from emoji" stuff? I think it's not unreasonable to think that, with the right prompts, LLMs could do this.
The LLMs that are doing this are trained on trillions of tokens of human language. This is not remotely a counterexample. Now, if an LLM could invent a full new language from scratch, without any training data from existing languages (the way AlphaZero learned to play Go without any human games), that would be impressive, and a difference in kind.
Um, but you have the example of English. Modern English was based on Middle English, which in turn was based on Old English, but greatly influenced by Norman French on account of the invasion, as well as by Norse.
"In the days when Sussman was a novice, Minsky once came to him as he sat hacking at the PDP-6.
“What are you doing?”, asked Minsky.
“I am training a randomly wired neural net to play Tic-Tac-Toe” Sussman replied.
“Why is the net wired randomly?”, asked Minsky.
“I do not want it to have any preconceptions of how to play”, Sussman said.
Minsky then shut his eyes.
“Why do you close your eyes?”, Sussman asked his teacher.
“So that the room will be empty.”
At that moment, Sussman was enlightened."
LLMs are not doing this; they start from vast corpora of human language, not from random wiring with no preconceptions.