The mammalian brain has had a few hundred million years to evolve neural plasticity [1], which is the key capability missing from AI. The brain's structure isn't set in stone: it develops over one's lifetime and can even carry out major restructuring on a short time scale in some cases of massive brain damage.
Neural plasticity is the algorithm running on top of our neural networks that optimizes their structure as we learn, so not only do we get more data, but our brains get better tailored to handle that kind of data. This process continues from birth to death; physical experimentation in youth is a key part of that development, as is social experimentation in social animals.
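To make the "optimizes their structure, not just the weights" point concrete, here is a minimal toy sketch (my own illustration, not anything from neuroscience or from an existing library): a connectivity mask that is itself rewired during learning, pruning the weakest synapses and growing new ones elsewhere. All sizes, rates, and names here are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden = 8, 16
weights = rng.normal(0.0, 0.5, size=(n_hidden, n_in))
mask = rng.random((n_hidden, n_in)) < 0.5          # which "synapses" currently exist

def plasticity_step(weights, mask, turnover=0.1):
    """Prune the weakest existing synapses, then grow the same number at random empty sites."""
    mask = mask.copy()
    existing = np.argwhere(mask)
    n_turn = max(1, int(turnover * len(existing)))

    # prune: drop the n_turn smallest-magnitude existing connections
    mags = np.abs(weights[mask])
    for idx in existing[np.argsort(mags)[:n_turn]]:
        mask[tuple(idx)] = False

    # grow: re-enable n_turn currently-absent connections with small random weights
    empty = np.argwhere(~mask)
    for idx in empty[rng.choice(len(empty), size=n_turn, replace=False)]:
        mask[tuple(idx)] = True
        weights[tuple(idx)] = rng.normal(0.0, 0.1)

    return weights, mask

for step in range(5):
    weights, mask = plasticity_step(weights, mask)
    print(f"step {step}: {int(mask.sum())} active synapses")
```

The point of the toy is only that the wiring diagram is a moving target, which is a different kind of change than gradient updates to fixed connections.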
I think it "remains unclear" only to the ML field. From the perspective of neuroscientists, current neural networks don't even superficially approach the complexity of axon-dendrite connections with ion channels and threshold potentials, let alone the whole system.
A family member's doctoral thesis was on the potentiation of signals, and based on my understanding of it, every neuron takes part in the process with its own "memory" of sorts, and the potentiation she studied was just one tiny piece of the neural plasticity story. We'd need to turn every component in the hidden layers of a neural network into its own massive NN with its own memory to even begin to approach that kind of complexity.
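A crude sketch of what that "every unit is its own network" idea might look like, purely as an illustration of the structural point and not of the biology: each "neuron" is a tiny net that carries its own persistent state between calls. The class names, sizes, and update rules are all made up for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

class NeuronUnit:
    """A single 'neuron' that is itself a tiny two-layer net with private, persistent state."""
    def __init__(self, n_inputs, state_size=4):
        self.w1 = rng.normal(0, 0.5, size=(state_size, n_inputs + state_size))
        self.w2 = rng.normal(0, 0.5, size=state_size)
        self.state = np.zeros(state_size)               # the unit's own "memory"

    def fire(self, inputs):
        hidden = np.tanh(self.w1 @ np.concatenate([inputs, self.state]))
        self.state = 0.9 * self.state + 0.1 * hidden    # memory decays and is updated locally
        return np.tanh(self.w2 @ hidden)                # one scalar output, like one activation

class Layer:
    """A 'hidden layer' whose components are NeuronUnits rather than scalar activations."""
    def __init__(self, n_units, n_inputs):
        self.units = [NeuronUnit(n_inputs) for _ in range(n_units)]

    def forward(self, inputs):
        return np.array([u.fire(inputs) for u in self.units])

layer = Layer(n_units=16, n_inputs=8)
x = rng.normal(size=8)
print(layer.forward(x))   # differs on repeated calls because each unit carries its own state
print(layer.forward(x))
```

Even this toy blows up parameter count per unit, and it still captures nothing of ion channels, threshold potentials, or the potentiation machinery itself.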
[1] https://en.m.wikipedia.org/wiki/Neuroplasticity