Hacker News
visarga on Oct 7, 2022 | on: The AI Scaling Hypothesis
The brain has about 1,000T synapses, while GPT-3 has 175B parameters, and a parameter is much simpler than a synapse. So the brain's scale is at least 5,700x that of GPT-3 (1,000T / 175B ≈ 5,700). It seems natural that GPT-3 has to compensate by training on roughly 200x more data.
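A quick back-of-the-envelope check of both ratios. This is only a sketch: the ~1,000T synapse count is a high-end estimate (published figures run from ~100T to ~1,000T), GPT-3's ~300B training tokens comes from the GPT-3 paper (Brown et al., 2020), and the ~1.5B-word lifetime human language exposure is a rough assumption, not a figure from the comment.

    # Back-of-the-envelope scale comparison (all figures are rough estimates).
    brain_synapses = 1_000e12   # ~1,000 trillion synapses (high-end estimate)
    gpt3_params    = 175e9      # GPT-3 parameter count

    scale_ratio = brain_synapses / gpt3_params
    print(f"brain/GPT-3 scale ratio: {scale_ratio:,.0f}x")        # ~5,714x

    # Data side: GPT-3 trained on ~300B tokens (Brown et al., 2020);
    # a human might hear/read ~1.5B words over a lifetime (assumption).
    gpt3_tokens = 300e9
    human_words = 1.5e9

    data_ratio = gpt3_tokens / human_words
    print(f"GPT-3/human training-data ratio: {data_ratio:,.0f}x") # ~200x

Under those assumptions, both ratios land close to the 5,700x and 200x figures cited above.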
Filligree on Oct 8, 2022
This does make me wonder: what would happen if we could feed the brain data at the same rate GPT-3 was fed?