Hacker News

The brain has about 1,000T synapses and GPT-3 has 175B parameters, and a parameter is much simpler than a synapse, so the scale of the brain is at least 5,700x that of GPT-3. It seems normal that GPT-3 would have to compensate by training on roughly 200x more data.
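A quick back-of-envelope check of these ratios in Python. The synapse and parameter counts are from the comment; the training-data figures (GPT-3's ~300B tokens vs. roughly 1.5B words of language a human hears by adulthood) are my own assumed common estimates, which happen to reproduce the 200x figure:

```python
# Rough scale comparison from the comment, plus assumed training-data figures.
brain_synapses = 1e15   # ~1,000 trillion synapses (common upper estimate)
gpt3_params = 175e9     # GPT-3: 175B parameters

scale_ratio = brain_synapses / gpt3_params
print(f"brain / GPT-3 scale: ~{scale_ratio:,.0f}x")        # ~5,714x

gpt3_tokens = 300e9     # assumed: GPT-3 training corpus, ~300B tokens
human_words = 1.5e9     # assumed: lifetime human language exposure

data_ratio = gpt3_tokens / human_words
print(f"GPT-3 / human training data: ~{data_ratio:.0f}x")  # ~200x
```

Note the ratio only reaches ~5,700x with the high-end synapse estimate of 10^15; the figure of "1T" in the comment would give only ~5.7x.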



This does make me wonder: what would happen if we could feed the brain data at the same rate GPT-3 was fed?



