
And the renaissance man has tons of structure encoded in his brain at birth already, just like GPT-3 does before you give it a prompt. I'm not saying this is fully equivalent (clearly a baby can't spout correct LaTeX just by seeing three samples), but you simply cannot handwave away thousands of years of human evolution, and millions of years of general evolution before that.

The renaissance man is very obviously not working solely from a few years of reading books (or of learning to speak and write).



A person who is never taught to read will never be able to respond to written text. So the renaissance man is working "solely" based on his lived experience with text, which compared to GPT(n)'s training corpus is tiny.

"Ah!" you cry. "Don't humans have some sort of hard-wiring for speech and language?" Perhaps. But that hard-wiring is clearly incapable, on its own, of enabling an untrained human to deal with written text. Does it give the human a head start in learning to do so? Perhaps (maybe even probably). Either way, the human demonstrably needs far less training than GPT(n) does.

But that is sort of the point of the comment at the top of this chain.



