They are not. The current consensus in computational neuroscience is that a single neuron can be equivalent to a 2-layer or 3-layer ANN with point nonlinearities.

Edit: I am blocked from posting, so here are the refs:

https://www.ncbi.nlm.nih.gov/pubmed/20800473

https://www.ncbi.nlm.nih.gov/pubmed/25554708

https://sci-hub.tw/https://www.sciencedirect.com/science/art...
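
To make the claim concrete, here is a toy sketch (not the actual model from those papers; the sizes and weights are made-up placeholders) of what "a neuron as a 2-layer ANN" means: each dendritic subunit applies its own point nonlinearity to its synaptic inputs, and the soma applies a second nonlinearity to a weighted sum of the subunit outputs.

  import numpy as np

  def sigmoid(x):
      return 1.0 / (1.0 + np.exp(-x))

  # Toy numbers: 4 dendritic subunits, 10 synapses each (random placeholder weights).
  rng = np.random.default_rng(0)
  W_dendrites = rng.normal(size=(4, 10))    # layer 1: synapse -> subunit weights
  w_soma = rng.uniform(0.5, 1.5, size=4)    # layer 2: subunit -> soma weights

  def neuron_output(synaptic_input):
      # Each dendritic subunit sums its synapses and applies its own point nonlinearity...
      subunit_out = sigmoid(W_dendrites @ synaptic_input)
      # ...then the soma sums the subunit outputs and applies a second nonlinearity.
      return sigmoid(w_soma @ subunit_out)

  print(neuron_output(rng.normal(size=10)))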




Do you have any references you could share on this? I'd like to read more, thanks!


Minsky and Papert showed that computing XOR requires a 3-layer artificial neural net (input, hidden layer, output), i.e. two layers of weighted connections; a single layer of weights cannot do it (a minimal hand-wired example is sketched below).

This[0] research shows that a single biological neuron can compute XOR, so a single artificial neuron has less computational power than a real one.

[0] - https://www.reddit.com/r/MachineLearning/comments/ejbwvb/r_s...
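
To see the Minsky-Papert point concretely, here is a toy hand-wired 2-2-1 network that computes XOR (illustrative weights, not from any paper): the hidden units compute OR and AND, and the output fires for "OR but not AND". No single threshold unit with one layer of weights can produce this truth table.

  import numpy as np

  def step(x):
      # Heaviside threshold: fire if the input is positive
      return (x > 0).astype(float)

  # Hidden unit 0 ~ OR, hidden unit 1 ~ AND; output ~ OR minus 2*AND, i.e. XOR.
  W_hidden = np.array([[1.0, 1.0],
                       [1.0, 1.0]])
  b_hidden = np.array([-0.5, -1.5])
  w_out = np.array([1.0, -2.0])
  b_out = -0.5

  def xor_net(x):
      h = step(W_hidden @ x + b_hidden)
      return step(w_out @ h + b_out)

  for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
      print(x, int(xor_net(np.array(x, dtype=float))))   # -> 0, 1, 1, 0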


That's exactly what TFA is about ;)


Ah interesting, thank you!



