More like animal codes. Humans are the only species known to be biologically capable of language (except for one controversial case of a chimp maaaaybe doing it). To be a language you need things like syntax and recursion.
Why do you need syntax and recursion? From a very mathematical point of view, languages are just a set of words. And human language rules are not really grammatical rules for the most part - there are an endless number of exceptions.
Because that is the definition of a natural language. All human languages exhibit these features (except debatably Pirahã, but that's a weird case and is literally the only known example out of ~7000 languages).
It's like the difference between a finite state machine and a Turing machine. In simple cases they may appear equivalent, and with enough pattern matching and brute force you can always get by, but one is fundamentally more powerful than the other. The same goes for human language versus animal communication. Animal brains are fundamentally incapable of grasping these concepts.
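To make the power gap concrete, here's a minimal sketch (names like `fsm_like_match` and `balanced` are mine, not from the thread): a finite state machine, here stood in for by a plain regular expression, can only match flat patterns, while recognizing arbitrarily deep balanced structure like aⁿbⁿ requires counting beyond any fixed number of states. This is the classic textbook example of a language no FSM can recognize.

```python
import re

def fsm_like_match(s: str) -> bool:
    # A plain regex (equivalent in power to a finite state machine)
    # can match "some a's then some b's", but it cannot enforce that
    # the two runs have EQUAL length -- that needs unbounded memory.
    return re.fullmatch(r"a*b*", s) is not None

def balanced(s: str) -> bool:
    # One unbounded counter (more than finite state) is enough to
    # recognize a^n b^n exactly.
    half = len(s) // 2
    return len(s) % 2 == 0 and s[:half] == "a" * half and s[half:] == "b" * half

print(fsm_like_match("aaabb"))  # True -- the FSM happily accepts unbalanced input
print(balanced("aaabb"))        # False -- the counts don't match
print(balanced("aaabbb"))       # True
```

The regex accepts strings the counting recognizer rejects, which is the "appears equivalent for simple cases but isn't" point in miniature.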
> From a very mathematical point of view, languages are just a set of words. And human language rules are not really grammatical rules for the most part - there are an endless number of exceptions.
I would say they're more like a complex, partly logical, partly statistical model than a set of words. Grammar is much, much more complicated than adjective-noun-verb. Once you create a good, comprehensive model of it for a language, true exceptions are very rare. Even accepting that there are exceptions, it doesn't make sense to say that just because humans don't use language with 100% predictability and rigor, it's equivalent to a random bag of words. This is true of programming and logical languages too. You need syntax which must be adhered to at least the vast majority of the time. Deviations have to be limited and understandable. Any sentence somebody says will be at least 95% known grammar with only a tiny bit of true innovation.
> Because that is the definition of a natural language. All human languages exhibit these features (except debatably Pirahã, but that's a weird case and is literally the only known example out of ~7000 languages).
I am no linguist; my question was genuine curiosity. Sorry if I sounded argumentative.
> This is the same with human language and animal communication. Animal brains are fundamentally incapable of grasping these concepts.
I agree with it on a fundamental level (with the possible exception of chimpanzees? I would not be surprised if they could get closer to us. It probably depends on whether language is the fundamental distinguishing feature of intelligence or not).
Regarding bag of words, I meant “word” in the formal sense of a string of characters (including whitespace), so an English sentence would be a “word” in the English language under this definition. And yeah, just because it is not a formal language doesn’t mean it doesn’t have other sorts of structure.
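The formal-language reading above can be sketched in a few lines (the toy set and the names `SIGMA`, `TOY_ENGLISH`, and `in_language` are hypothetical, just for illustration): a language is a set of strings over an alphabet, and if the alphabet includes whitespace, whole sentences are single "words" of the set.

```python
# Alphabet Sigma: lowercase letters plus the space character.
SIGMA = set("abcdefghijklmnopqrstuvwxyz ")

# A toy "language" in the set-of-strings sense: the finite set of
# sentences we count as members. (Real English would be an infinite set.)
TOY_ENGLISH = {
    "dogs bark",
    "the cat sleeps",
}

def in_language(sentence: str) -> bool:
    # Membership test: the string uses only symbols from Sigma
    # and belongs to the set.
    return set(sentence) <= SIGMA and sentence in TOY_ENGLISH

print(in_language("dogs bark"))  # True -- this "word" is in the set
print(in_language("bark dogs"))  # False -- same symbols, not a member
```

Under this definition the set itself carries no grammar; any structure (like the word-order difference above) has to come from how the set is specified, which is the other commenter's point about needing a generative model rather than a bare set.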
> I am no linguist; my question was genuine curiosity. Sorry if I sounded argumentative.
Oh sorry, I didn't mean to come across as argumentative either.
> I would not be surprised if they could get closer to us. It probably depends on whether language is the fundamental distinctive feature of intelligence or not.
Yeah, I'd really like to see more research done on this. The chimp I referenced is actually a bonobo called Kanzi; I misremembered the species. He seemed to be a prodigy, able to do things no other ape could. It would be interesting to know how far this could go if we found even more talented apes, or selectively bred them.
That is certainly not the "mathematical point of view" of what language is. Not sure if I completely agree with the other commenter, but languages are not just "a set of words".