
HyperNEAT is also really cool: https://en.wikipedia.org/wiki/HyperNEAT

It uses NEAT to evolve a CPPN (compositional pattern-producing network), which is then sampled at some specified resolution to determine the actual network topology. The really cool thing is that the sampling resolution can be varied to scale the size of the network while maintaining the overall "shape" of the topology. I took Ken Stanley's neuroevolution course back in the day and worked with HyperNEAT before deep learning got big. Ever since, deep learning network architectures have reminded me of the dense layered topologies that result from higher sampling resolutions with HyperNEAT. It would be interesting to see whether HyperNEAT, combined with Ken's novelty search technique, could be used to evolve useful deep learning architectures.
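
To make the sampling idea concrete, here's a minimal Python sketch. The cppn function and sample_substrate helper are made up for illustration: in real HyperNEAT the CPPN is itself an evolved network (not a fixed formula), and the substrate geometry is chosen per task. The point is just that the same CPPN queried over denser coordinate grids yields bigger networks with the same connectivity pattern.

    import math

    def cppn(x1, y1, x2, y2):
        # Toy CPPN: maps a pair of substrate coordinates to a connection weight.
        d = math.hypot(x2 - x1, y2 - y1)
        return math.sin(3.0 * d) * math.exp(-d * d)

    def sample_substrate(resolution, threshold=0.2):
        # Sample the CPPN over a resolution x resolution 2-D substrate in [-1, 1]^2.
        # Returns a dict mapping (source, target) node indices to weights.
        # Raising the resolution grows the network but preserves the overall
        # connectivity "shape", since the same CPPN is queried over the same space.
        coords = [(-1.0 + 2.0 * i / (resolution - 1),
                   -1.0 + 2.0 * j / (resolution - 1))
                  for i in range(resolution) for j in range(resolution)]
        weights = {}
        for a, (x1, y1) in enumerate(coords):
            for b, (x2, y2) in enumerate(coords):
                w = cppn(x1, y1, x2, y2)
                if abs(w) > threshold:  # only express sufficiently strong connections
                    weights[(a, b)] = w
        return weights

    for res in (5, 10, 20):
        net = sample_substrate(res)
        print(f"resolution {res:>2}: {res * res} nodes, {len(net)} connections")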
