> In particular, our peripheral nervous system is capable of absorbing information from the environment at much higher rates, on the order of gigabits/s. This defines a paradox: The vast gulf between the tiny information throughput of human behavior, and the huge information inputs on which the behavior is based. This enormous ratio – about
> 100,000,000 – remains largely unexplained.
A GPU is capable of performing billions of operations per second, yet Cyberpunk barely runs at 60 fps. And there is no paradox at all.
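To put rough numbers on both sides of that comparison (every figure below is a back-of-envelope assumption, not a measurement): dividing ~10^9 bits/s of sensory input by ~10 bits/s of behavioral output gives the paper's ~10^8, and a GPU doing on the order of ten billion simple operations per second while rendering 60 frames per second has a similarly huge ratio of internal work to output.

```python
# Back-of-envelope ratios; every number here is a rough, illustrative assumption.

sensory_input_bps = 1e9        # ~1 gigabit/s of peripheral sensory input (the paper's figure)
behavioral_output_bps = 10     # ~10 bits/s of behavioral throughput (the paper's figure)
brain_ratio = sensory_input_bps / behavioral_output_bps
print(f"brain input/output ratio: ~{brain_ratio:.0e}")             # ~1e+08

gpu_ops_per_second = 1e10      # assume ~10 billion simple operations per second
frames_per_second = 60         # Cyberpunk at 60 fps
ops_per_frame = gpu_ops_per_second / frames_per_second
print(f"GPU operations per rendered frame: ~{ops_per_frame:.0e}")  # ~2e+08
```

In both cases the large ratio just reflects how much internal processing goes into each unit of output, not anything mysterious.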
By the way, the brain seems to perform better than a GPU at tasks like image recognition, probably because it does even more operations per second than the GPU does.
Here is another comparison. Imagine your goal is to calculate an integral over a 100-dimensional space (or solve a quantum system) and answer whether it is greater than or less than zero. This takes an enormous amount of time but produces a single bit of information.
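As a sketch of that point (the integrand is hypothetical, the sample count is kept small for illustration, and plain Monte Carlo is just one possible method), the snippet below spends all of its work evaluating a function at many points in 100 dimensions and reports exactly one bit at the end: whether the estimate is above or below zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical integrand over the 100-dimensional unit cube (purely illustrative).
def f(x):
    # x has shape (n_samples, dim); returns one value per sample point.
    return np.cos(x.sum(axis=1)) * np.exp(-(x**2).sum(axis=1))

dim = 100
n_samples = 100_000                        # many function evaluations (far more in practice)...
samples = rng.random((n_samples, dim))     # uniform points in [0, 1]^100
estimate = f(samples).mean()               # Monte Carlo estimate; the unit cube has volume 1

print("integral > 0:", estimate > 0)       # ...all collapsed into a single bit of output
```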
Is the brain better than a GPU at image recognition nowadays? Actually, I’m not sure how that’s measured. Certainly a GPU can be tied to a database with far more things in it; you can get some pretty impressive facial recognition demos where it recognizes a ton of random people.
But humans can see objects they’ve never seen before and sometimes guess what they might be used for, which is sort of like object recognition but better. (Or sometimes I see an object I’m technically familiar with, like an old tool of my grandpa’s, and remembering what he used it for feels more like imagining… maybe it is).