
> It may also prompt some computer scientists to reappraise strategies for artificial neural networks, which have traditionally been built based on a view of neurons as simple, unintelligent switches.

It has been suggested that artificial neural networks have diverged from biological principles to the point where biological research no longer provides any useful improvement to current machine learning techniques. Backpropagation of errors was a major advance first worked out in the 1970s, and it was developed purely from mathematical principles (calculus, in particular the chain rule, plus gradient-based optimization) rather than from neuroscience. As far as I have read, error backpropagation is not how biological NNs learn, yet it appears to be a much more efficient learning strategy. Biology seems much more brute force in comparison.
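To make the "purely mathematical" point concrete: backprop is just the chain rule applied to a loss function. Here is a minimal sketch in plain Python, training a two-parameter network on a single data point; all weights and numbers are illustrative choices of mine, not anything from the thread.

```python
import math

# Toy network: y = w2 * tanh(w1 * x), trained to fit one target value
# by gradient descent on a squared-error loss.
w1, w2 = 0.5, 0.5
lr = 0.1
x, target = 1.0, 0.8

for step in range(200):
    # forward pass
    h = math.tanh(w1 * x)
    y = w2 * h
    loss = 0.5 * (y - target) ** 2

    # backward pass: pure chain rule, layer by layer
    dy = y - target              # dL/dy
    dw2 = dy * h                 # dL/dw2
    dh = dy * w2                 # dL/dh
    dw1 = dh * (1 - h * h) * x   # dL/dw1, using tanh'(z) = 1 - tanh(z)^2

    w1 -= lr * dw1
    w2 -= lr * dw2

# after training, loss has shrunk to near zero
```

Nothing in the update rule refers to anything biological; it is the derivative of the loss with respect to each weight, propagated backward through the network.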




> It may also prompt some computer scientists to reappraise strategies for artificial neural networks, which have traditionally been built based on a view of neurons as simple, unintelligent switches.

I find this a conveniently simplistic view of the current state of machine learning, which they wiggle out of by saying "traditionally built". Anyone working on ML and neural networks these days knows that modern networks are built out of building blocks: standard architecture archetypes (MLPs vs. CNNs, autoencoders, GANs, etc.), resnet blocks, training regimes (layer pre-training, progressive training, etc.), normalization methods, and so on. Sure, the term "neural networks" originally implied that each linear/non-linear operation pair represented a single neuron, but it's not a stretch to say that if neurons are more powerful than we thought, then each neuron corresponds to, say, a small MLP block or a resnet block. The biological analogy still stands. Citing ideas about ANNs from two decades ago to promise that new results in biology may change "current" thinking in ML (while calling those decades-old ideas "traditional", no less!) is disingenuous.
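The classic test case here is XOR: a single linear-threshold unit cannot compute it, but a tiny two-layer block can. So if a biological neuron turns out to do XOR-like computation (as some dendrite studies suggest), the "one neuron = one small MLP block" reading fits naturally. A hand-wired sketch, where all weights and names are my own illustrative choices:

```python
import math

def mlp_neuron(inputs, W_hidden, b_hidden, w_out, b_out):
    """One 'neuron' modeled as a tiny two-layer MLP block."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
              for row, b in zip(W_hidden, b_hidden)]
    return math.tanh(sum(w * h for w, h in zip(w_out, hidden)) + b_out)

# Hand-picked weights so the block computes XOR on {0,1} inputs:
# hidden unit 1 fires when at least one input is on (OR-like),
# hidden unit 2 fires only when both are on (AND-like),
# and the output subtracts the second from the first.
W_hidden = [[10, 10], [10, 10]]
b_hidden = [-5, -15]
w_out = [5, -5]
b_out = -5

# output is positive exactly when the inputs differ (XOR)
outputs = {(a, b): mlp_neuron([a, b], W_hidden, b_hidden, w_out, b_out)
           for a in (0, 1) for b in (0, 1)}
```

A single linear unit followed by a threshold can never produce this truth table, which is why "neuron as a simple switch" and "neuron as a small network" are genuinely different modeling assumptions.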


> Biology seems much more brute force in comparison.

Biology is more decentralized for sure, but I wouldn't be so quick to call it "brute force". One interesting feature of biological neural networks is the variety of neurotransmitters. In very simple terms, they not only excite neurons or inhibit them from firing; they also regulate how neurons adapt to the stimuli they have been exposed to. In other words, the biological network not only learns, it runs the learning algorithm itself. It appears to do this by creating various layers of communication with different effects on the computing nodes.
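One way to picture "runs the learning algorithm itself" is a so-called three-factor plasticity rule, where a global modulatory signal (think of a dopamine-like reward) gates an otherwise local Hebbian update. This is a toy sketch of that general idea, not a model of any specific neurotransmitter:

```python
def hebbian_update(w, pre, post, modulator, lr=0.1):
    # Plain Hebbian learning ("fire together, wire together"),
    # but scaled by a third, network-wide modulatory factor.
    return w + lr * modulator * pre * post

# With the modulator on, correlated activity strengthens the synapse.
w = 0.0
for _ in range(50):
    w = hebbian_update(w, pre=1.0, post=1.0, modulator=1.0)

# With the modulator off, identical activity changes nothing:
# the network has effectively switched its own learning off.
w_frozen = 0.0
for _ in range(50):
    w_frozen = hebbian_update(w_frozen, pre=1.0, post=1.0, modulator=0.0)
```

The point of the sketch is that the update rule itself is an input to the system, which is a different design from a fixed, externally imposed optimizer like SGD.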

In my view, we have not been smart enough yet to figure out nature's algorithm.


>Biology seems much more brute force in comparison.

A single honeybee, with its billion or so synapses, can adapt to a variety of situations and environments, learn on the fly, perform complex sequences of tasks, and cooperate and communicate with other bees. All of these capabilities are emergent and packed into its tiny head.

State-of-the-art artificial neural networks (now ranging into the billions of parameters and beyond) only do the thing they're specifically built for, only after training on a bazillion specific examples, and they consume tons of energy while doing so.

Which one of these sounds like the brute force approach?


The majority of behavior in organisms like bees is instinctual, not learned in their lifetime. That training required millions of billions of trials over the course of hundreds of millions of years.


>The majority of behavior in organisms like bees is instinctual, not learned in their lifetime.

It doesn't matter whether their behavior is considered "instinctual". What matters is that they can quickly adapt their behavior to entirely novel scenarios:

https://science.sciencemag.org/content/355/6327/833



