"Forget Torch, Tensorflow, and Theano. I decided to implement Backprop NEAT in Javascript, because it is considered the best language for Deep Learning according to the Data Science Dojo."
That's a bold statement with no elaboration. I would expect a link to DSD's statement at the least. Based on what metrics? And how does Javascript go about accessing GPUs for training?
Given that neuroevolution evolves efficient structures, this suggests that training NEAT on a dataset produced by a pre-trained deep network could distill highly optimised functions.
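A minimal sketch of what that distillation setup might look like, assuming a hypothetical teacher() forward pass and a genome-supplied predict function; none of these names come from the post:

    // Hypothetical distillation setup: query a pre-trained "teacher"
    // network on random inputs to build a supervised dataset that a
    // NEAT population could then fit via backprop.

    // Stand-in for a pre-trained deep network's forward pass.
    function teacher(x) {
      return Math.tanh(3 * x[0] - 2 * x[1]);
    }

    // Build the distillation dataset from the teacher's outputs.
    var dataset = [];
    for (var i = 0; i < 1000; i++) {
      var x = [Math.random() * 2 - 1, Math.random() * 2 - 1];
      dataset.push({ input: x, target: teacher(x) });
    }

    // Fitness for an evolved genome: negated mean squared error
    // against the teacher, so a closer fit means higher fitness.
    function fitness(predict) { // predict: the genome's forward fn
      var err = 0;
      dataset.forEach(function (ex) {
        var d = predict(ex.input) - ex.target;
        err += d * d;
      });
      return -err / dataset.length;
    }

    // Trivial stand-in student, just to show the call shape.
    console.log(fitness(function (x) { return 0.5 * x[0]; }));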
Evolving specialised components like LSTMs is an intriguing possibility.
Great to see Karpathy's Recurrent.js making prototyping easy and immediately accessible.
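For reference, a forward/backward pass in Recurrent.js looks roughly like this; the API is recalled from the library's README (global R namespace, Mat/Graph objects), so treat the exact signatures as an assumption:

    // Assumes recurrent.js is loaded, exposing the global R.
    var W = new R.RandMat(10, 4, 0, 0.08); // 10x4 weight matrix
    var b = new R.Mat(10, 1);              // bias vector
    var x = new R.RandMat(4, 1, 0, 1.0);   // input vector

    var G = new R.Graph();                 // records ops for backprop
    var out = G.tanh(G.add(G.mul(W, x), b));

    out.dw[0] = 1.0; // pretend a loss gradient flows into output 0
    G.backward();    // backpropagates through the recorded graph
    // W.dw and b.dw now hold gradients for a parameter update.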
Ken Stanley's innovation markers, which allow successful crossover while augmenting network topologies, are a powerful tool.
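The core trick is that each connection gene carries a global innovation number, so two parents' genomes can be aligned gene-by-gene during crossover. A minimal sketch of that alignment, with a genome representation of my own invention rather than anything from the post:

    // Align two parents' connection genes by innovation number.
    // Matching genes are inherited randomly from either parent;
    // disjoint/excess genes come from the fitter parent (assumed
    // here to be parentA).
    function crossover(parentA, parentB) {
      var genesB = {};
      parentB.genes.forEach(function (g) { genesB[g.innovation] = g; });

      return parentA.genes.map(function (gA) {
        var gB = genesB[gA.innovation];
        if (gB && Math.random() < 0.5) return gB; // matching, from B
        return gA; // matching from A, or disjoint/excess from A
      });
    }

    // Example genomes: each gene is {innovation, from, to, weight}.
    var parentA = { genes: [
      { innovation: 1, from: 0, to: 2, weight: 0.5 },
      { innovation: 2, from: 1, to: 2, weight: -0.3 },
      { innovation: 4, from: 2, to: 3, weight: 0.9 } // excess gene
    ]};
    var parentB = { genes: [
      { innovation: 1, from: 0, to: 2, weight: 0.1 },
      { innovation: 2, from: 1, to: 2, weight: 0.7 }
    ]};
    console.log(crossover(parentA, parentB));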
Very nice write-up. Several years ago, I took a graduate class that covered NEAT and ended up doing a little project [0] to see if you could apply similar ideas (recurrent nets evolved via NEAT). My idea was to use it for multi-agent control problems, though, where you want agents to try and "teach" other agents whenever they learn something useful.
Back in 2009, I implemented NEAT in C# and applied it to the TORCS CIG 2009 challenge. It worked well, but took a lot of time to train (reinforcement learning).
IMO NEAT will not be able to practically compete with something like a deep Q-network on visual signals. I recommend taking a look at the CIG 2016 challenges.