Hopfield was the first to adapt Ising-model spin dynamics to build systems with memory, creating the Hopfield network, in which stored memories are attractors (local minima) of the network's energy function, its Hamiltonian. Add hidden units to that and you get the general Boltzmann machine. Restrict the connections to a bipartite graph and you get the restricted Boltzmann machine. Stack RBMs and you get the 2006 Hinton and Salakhutdinov advance in deep learning.
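To make the "memories as attractors" idea concrete, here is a minimal Hopfield network sketch (my own illustrative code, not from any of the papers mentioned): patterns are stored with the Hebbian outer-product rule, and asynchronous sign updates only ever lower the energy E(s) = -1/2 sᵀWs, so a corrupted probe settles into the nearest stored pattern.

```python
import numpy as np

def train(patterns):
    """Hebbian weights: W = (1/N) * sum_p p p^T, with no self-connections."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def energy(w, s):
    """Hopfield energy E(s) = -1/2 s^T W s; updates never increase it."""
    return -0.5 * s @ w @ s

def recall(w, s, sweeps=5):
    """Asynchronous updates: each unit aligns with its local field w[i] @ s."""
    s = s.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return s

# Store two orthogonal +/-1 patterns, then recover one from a corrupted probe.
p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train(np.stack([p1, p2]))

probe = p1.copy()
probe[0] = -1            # flip one bit of the stored memory
out = recall(W, probe)   # descends the energy landscape back to p1
```

The energy of the recovered state is strictly lower than that of the corrupted probe, which is exactly the attractor picture: recall is descent into a basin.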



And if you want to solve SAT with neural networks, you usually go back to a modified version of the Hopfield network.
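One hedged way to see that connection (a sketch of the general idea, not any specific published solver): treat each Boolean variable as a neuron and the number of unsatisfied clauses as the network energy, then run asynchronous flips that accept any non-increasing move, with occasional random uphill moves to escape local minima, which is the usual modification to plain Hopfield descent.

```python
import random

def unsat_count(clauses, assign):
    """Energy: clauses with no true literal. Literal k>0 means x_k, k<0 means NOT x_k."""
    return sum(
        not any((assign[abs(l)] if l > 0 else not assign[abs(l)]) for l in c)
        for c in clauses
    )

def solve(clauses, n_vars, max_steps=10_000, noise=0.1, seed=0):
    """Hopfield-style energy descent on the unsatisfied-clause count."""
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    e = unsat_count(clauses, assign)
    for _ in range(max_steps):
        if e == 0:
            return assign              # zero energy = satisfying assignment
        v = rng.randrange(1, n_vars + 1)
        assign[v] = not assign[v]      # trial flip of one "neuron"
        e_new = unsat_count(clauses, assign)
        if e_new <= e or rng.random() < noise:
            e = e_new                  # accept descent (or a noisy uphill move)
        else:
            assign[v] = not assign[v]  # reject: revert the flip
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
clauses = [[1, 2], [-1, 3], [-2, -3]]
model = solve(clauses, 3)
```

This is of course a toy: real neural SAT work uses continuous relaxations and more careful energy shaping, but the skeleton (energy = constraint violations, dynamics = descent with noise) is the Hopfield inheritance.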

Rumelhart was also inspired by poking at MAX-CSP, at least according to McClelland.



