
I'm well aware of the distinction between training a model and running it. Look, GPT-3 has 175 billion parameters. Modern low-power CPUs will get you about 2 GFLOPS per watt [1]. So even if all GPT-3 did was add its parameters together, it would take multiple seconds on an equivalently powered CPU to do something our brains do easily in real time. It's not an issue of processing power; an 8086 from 40 years ago easily runs circles around us in terms of raw computational power. Rather, it's that our brains are wired in a fundamentally different way than all existing neural networks, and because of that, this line of research will never lead to AGI, not even if you threw unlimited computing power at it.

[1] http://web.eece.maine.edu/~vweaver/group/green_machines.html
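To make the arithmetic explicit, here's a rough back-of-envelope sketch. The ~20 W brain-equivalent power budget and the one-FLOP-per-parameter pass are assumptions I'm adding for illustration, not claims from the comment above:

    # Back-of-envelope: time for one pass over GPT-3's parameters
    # on a low-power CPU running at a brain-like power budget.
    params = 175e9          # GPT-3 parameter count
    flops_per_watt = 2e9    # ~2 GFLOPS/watt for low-power CPUs [1]
    watts = 20              # assumed brain-equivalent power budget
    flops_available = flops_per_watt * watts   # 40 GFLOPS
    seconds = params / flops_available         # one FLOP per parameter
    print(f"~{seconds:.1f} seconds per pass")  # ~4.4 seconds

Even under those generous assumptions (one operation per parameter, no memory bottlenecks), a single pass lands in the multi-second range, which is the point being made.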



Birds are wired in a fundamentally different way than all our existing computers; thus we will never have fly-by-wire, not even if we throw unlimited computing power at it.


Actually that's a great example. For centuries men labored (and died) trying to build ornithopters--machines that flap their wings like birds--under the mistaken impression that this was the secret to flight. Finally, after hundreds of years of progressively larger and more powerful, but ultimately failed, designs, the Wright brothers came along and showed us that flight is the result of wing shape and pressure differentials, and has nothing whatsoever to do with flapping.

GPT-3 and whatever succeeds it are like late-stage ornithopters: very impressive feats of engineering, but not ultimately destined to lead us to where their creators hoped. We need the Wright brothers of AI to come and show us the way.



