
Do children really lack input data though? Human sensory input is quite a lot of data. Our languages may have common structure because that structure reflects causality and physics encountered by directly sampling the real world.



Your hypothesis is plausible, but less likely than Chomsky’s, IMO. One basic version of the typical Chomskyan response to this point: many animals get the same input data, so how come we’re the only ones who have evolved any capacity for language at all? The very best animals at language are apes, and we have yet to successfully teach one concepts that humans learn while still wearing diapers.


Why can't a small neural network do what LLMs do? Same reason animals can't learn human language. They don't have the capacity.

LLMs started getting interesting "all of a sudden" when they hit a certain scale, just like biological brains.


I mean, that's kinda exactly Chomsky's point, minus the implication (perhaps misinferred on my part) that human-like linguistic ability is an inevitable or universal characteristic of intelligent minds.

There are lots of animals that have very complex brains and engage in very complex behaviors - the fact that none of them show even a hint of linguistic ability seems like a strong indicator that these faculties are a human-specific evolution, and not just... well, IDK how to even sum up an anti-UG/GG view. "Kids just sorta figure it out," I guess?

Although who knows! As Chomsky likes to say, this whole field is in a pre-Galilean state due to the impossibility of conducting comparative studies.

EDIT: Oh, just realized you were the parent comment. Well, I'd say the small NN vs. LLM example still doesn't convince me of the likelihood of your statement as I understand it; it goes without saying that lots of animals have much better intuitive understandings of physics (well, kinetics at least) than humans. You ever seen those snakes that jump from tree to tree? Craziest shit you'll ever see.


You have Helen Keller, who went blind and deaf at 19 months.


And how exactly is the structure of the world transformed into a structure of language? Chomsky would say through a universal grammar: a framework for encoding the structure of the world as words.


This implies that the combinatorial space of possible grammars is large. If it isn't, then the structure of the universe would imply a narrow set of possible workable grammars.

The combinatorial space of languages is obviously infinite, but is this so for grammars? If not, you would expect many languages to share the same grammars.
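
To make the grammar-vs-language distinction concrete, here's a toy Python sketch (my own illustrative example, not anything from the linguistics literature): two syntactically different context-free grammars that generate exactly the same language, a^n b^n. Distinct grammars collapsing onto one language is the easy direction; the open question is how constrained the space of humanly workable grammars is.

    from itertools import product

    def derive(grammar, symbol, depth):
        # Enumerate terminal strings derivable from `symbol` in at most
        # `depth` nested expansions (lowercase = terminal, uppercase = nonterminal).
        if symbol.islower():
            return {symbol}
        if depth == 0:
            return set()
        results = set()
        for production in grammar[symbol]:
            # Expand each symbol of the production, then concatenate all combinations.
            parts = [derive(grammar, s, depth - 1) for s in production]
            for combo in product(*parts):
                results.add("".join(combo))
        return results

    # Two different grammars for the same language {a^n b^n : n >= 1}.
    G1 = {"S": [["a", "S", "b"], ["a", "b"]]}
    G2 = {"S": [["a", "T"]], "T": [["S", "b"], ["b"]]}

    # Compare the short strings each grammar can generate.
    L1 = {s for s in derive(G1, "S", 12) if len(s) <= 8}
    L2 = {s for s in derive(G2, "S", 12) if len(s) <= 8}
    print(L1 == L2)  # True: distinct grammars, one language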

This seems analogous to the same question about mathematics. Are there many different possible arithmetics? No. There are infinitely many ways to express arithmetic symbolically, but there is only one arithmetic. 2 + 2 never equals 5.
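
That last point can even be made executable (again a toy sketch of my own; the three encodings are arbitrary illustrative choices): wildly different symbolic systems for the naturals, each with its own addition, all forced to agree that 2 + 2 is 4.

    # Unary / Peano-style: a number is a string of tally marks; addition is concatenation.
    unary_add = lambda a, b: a + b
    assert unary_add("||", "||") == "||||"          # 2 + 2 = 4

    # Binary strings: a different symbolic surface over the same arithmetic.
    bin_add = lambda a, b: format(int(a, 2) + int(b, 2), "b")
    assert bin_add("10", "10") == "100"             # 2 + 2 = 4

    # Church numerals: n means "apply f n times"; addition composes applications.
    church = lambda n: lambda f: lambda x: x if n == 0 else f(church(n - 1)(f)(x))
    church_add = lambda m, n: lambda f: lambda x: m(f)(n(f)(x))
    to_int = lambda c: c(lambda k: k + 1)(0)
    assert to_int(church_add(church(2), church(2))) == 4

    print("Many notations, one arithmetic.")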


Keep this as a secret! You are over-scientific and over-intellegent against humanities folks gethered here!


That’s not in the Hacker News spirit :( It’s especially ironic to shit-talk “humanities folks” with a basic misspelling in your comment. We’re all just trying to reach the truth!



