Hacker News

From what little I've read, deep learning is a new way of training nets. Deep neural nets have also been around for 30 years, but backpropagation, their traditional learning algorithm, isn't as effective at training the middle layers.



You should keep reading. Backprop is still the best, and pretty much the only, way to train neural nets (deep or not). Other ways exist (e.g. weight perturbation), but no one uses them. EDIT: I have to clarify: backprop is just one half of the training algorithm; you also need gradient descent, and there are many variants of it.
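To make the "two halves" point concrete, here's a minimal sketch (not from the thread, just an illustration): backprop computes the gradients via the chain rule, and gradient descent then applies them to the weights. A tiny two-layer sigmoid net learning XOR, NumPy only; the architecture and hyperparameters are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two-layer net: 2 inputs -> 4 hidden units -> 1 output
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))

    # Half one -- backprop: chain rule applied layer by layer,
    # yielding the gradient of the loss w.r.t. each weight
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out; db2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * h * (1 - h)
    dW1 = X.T @ d_h; db1 = d_h.sum(0)

    # Half two -- gradient descent: step weights against the gradient
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

The update rule here is plain full-batch gradient descent; the "many variants" mentioned above (SGD, momentum, etc.) change only this last step, not the backprop half.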


Man, I've been out of Neural Nets a long time. I studied them back in the early 1990s. Perceptrons and backpropagation. Didn't really keep up with the state of the art, I'm afraid. Maybe I should catch up.



