
Any thoughts on why genetic programming is not 'in fashion'? Does it have anything to do with the complexity of the calculations?

I can imagine that the advanced models use many, many machines and only deliver results after a long training time. Genetic programming is not feasible then if you cannot get a quick read on a model's potential results.




At least in deep learning, most models take more than a week to train, often on multiple GPUs. Some of the extremely deep, huge-dataset models can take multiple weeks on multiple GPUs. Google trained AlphaGo's nets for months (on god knows how many GPUs/CPUs). Suffice it to say, people don't even bother touching most hyperparameters, let alone trying anything more exhaustive.


If your program is a neural network with N parameters, or a program tree with N nodes, then testing it against data takes O(N) time. With evolutionary computation, what you get for your trouble is a single real number -- the loss: how badly it did. With neural networks, backpropagation gives you N real numbers: the gradient of the loss with respect to each parameter.

Put another way: with evolution you have to stumble around blindly in parameter space and rely on selection to keep you moving in the right direction. With the gradient descent that neural networks use, you get, essentially for free, knowledge of the (locally) best direction to move in parameter space.
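
To make that concrete, here is a toy sketch (my own illustration, with a made-up least-squares problem) giving both methods the same budget of 100 steps: each gradient step extracts N numbers of feedback, each evolutionary step only one.

  import numpy as np

  # Toy least-squares problem: minimize loss(w) = ||Xw - y||^2 over N parameters.
  rng = np.random.default_rng(0)
  N = 50
  X = rng.normal(size=(200, N))
  y = X @ rng.normal(size=N)

  def loss(w):
      r = X @ w - y
      return float(r @ r)

  # Gradient descent: each step yields N numbers (the gradient), i.e. the
  # locally best direction to move in parameter space.
  w = np.zeros(N)
  for _ in range(100):
      grad = 2 * X.T @ (X @ w - y)
      w -= 1e-4 * grad

  # (1+1)-style evolution: each step yields one number (the loss of a random
  # mutant) and keeps the mutant only if it improved.
  v = np.zeros(N)
  best = loss(v)
  for _ in range(100):
      child = v + 0.1 * rng.normal(size=N)
      c = loss(child)
      if c < best:
          v, best = child, c

  print("gradient descent:", loss(w))
  print("evolutionary search:", best)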

The bigger the models, the more this matters. Modern neural networks have millions or even billions of parameters, and that's been crucial to their expressive power. Good luck learning a program tree with a billion nodes using evolution. It might take 4.54 billion years.


> It might take 4.54 billion years.

And then only if you have a system powerful enough to accurately simulate a planet full of molecules.

Although I do think there is a balance between GAs and structured NNs that will lead to faster and better results than deep NNs alone. We already see some of the best deep NNs incorporating specific structures.


I think neural networks and evolutionary computation will merge, as I have been writing in my other replies in this thread. TWEANNs incorporate EC into evolving ANNs, and the soil-mechanics article I cited above used GP and beat out expert systems, ANNs, and statistical methods. MEP, or Multi-Expression Programming, overcomes the inefficiencies of 1990s-era GP by encoding more than one candidate solution in a single chromosome without increasing processing time. Here is a recent article using it that is not behind a paywall and does not require sci-hub.io [1]. It needs better editing, but there are other references if you search for Multi-Expression Genetic Programming.

  [1] http://benthamopen.com/ABSTRACT/TOPEJ-9-21
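
To illustrate the MEP idea (my own sketch, not code from the paper): each gene in a linear chromosome is either a terminal or an operator referencing earlier genes, so a single evaluation pass scores every gene as a candidate solution, and the chromosome's fitness is its best gene's fitness.

  import random

  TERMINALS = ["x"]
  FUNCTIONS = [("+", 2), ("*", 2), ("-", 2)]

  def random_chromosome(length):
      genes = [random.choice(TERMINALS)]  # gene 0 must be a terminal
      for i in range(1, length):
          if random.random() < 0.5:
              genes.append(random.choice(TERMINALS))
          else:
              op, arity = random.choice(FUNCTIONS)
              # operator genes may only reference earlier genes
              genes.append((op, [random.randrange(i) for _ in range(arity)]))
      return genes

  def evaluate(genes, x):
      # One pass computes the value of every gene; each gene is a
      # candidate expression, which is what makes MEP cheap.
      values = []
      for gene in genes:
          if gene == "x":
              values.append(x)
          else:
              op, (a, b) = gene
              u, v = values[a], values[b]
              values.append(u + v if op == "+" else u * v if op == "*" else u - v)
      return values

  def fitness(genes, samples):
      # Stand-in target: f(x) = x^2 + x. Chromosome fitness is the
      # error of its best gene.
      errors = [0.0] * len(genes)
      for x in samples:
          for i, v in enumerate(evaluate(genes, x)):
              errors[i] += abs(v - (x * x + x))
      return min(errors)

  random.seed(0)
  chrom = random_chromosome(12)
  print(fitness(chrom, [0.0, 1.0, 2.0, 3.0]))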


First, the right tool for the job: with sufficient training, ANNs are general function approximators that are cost-effective to implement. Second, ANNs have been around about 35 years longer than GP. The TWEANNs I am studying, which I mentioned in a previous reply in this thread, hybridize ANNs and EC (GAs and GP), so if you count neural networks that use evolutionary computation to modify their weights or topology, then GP is already being used to an extent. Replication, the key force in biology, is a variable in EC, and I only see more use of EC techniques to enhance the general function approximators that ANNs are.

Further, hybridized computing machines have been, and are being, built with FPGAs and GPUs; finance and supercomputing are just two areas looking to use them. In some, the FPGAs are simply there to update special computation programs that feed the GPUs. There is some research with a GP optimizer updating the FPGAs and then using the GPUs for massive parallelization of the computations.
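
As a minimal illustration of the weight-evolution half of that hybrid (my own sketch; real TWEANNs such as NEAT also mutate topology), here is a (mu + lambda) GA evolving the weights of a fixed 2-2-1 network on XOR, with no backpropagation at all:

  import numpy as np

  rng = np.random.default_rng(1)
  X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
  y = np.array([0.0, 1.0, 1.0, 0.0])

  def forward(params, x):
      W1, b1, W2, b2 = params
      h = np.tanh(x @ W1 + b1)
      return 1 / (1 + np.exp(-(h @ W2 + b2)))  # sigmoid output

  def fitness(params):
      return -np.mean((forward(params, X) - y) ** 2)  # higher is better

  def random_params():
      return [rng.normal(size=(2, 2)), rng.normal(size=2),
              rng.normal(size=2), rng.normal()]

  def mutate(params, sigma=0.2):
      return [p + sigma * rng.normal(size=np.shape(p)) for p in params]

  # (5 + 20) evolution: keep the best 5, fill the rest with mutants.
  population = [random_params() for _ in range(25)]
  for _ in range(300):
      population.sort(key=fitness, reverse=True)
      parents = population[:5]
      population = parents + [mutate(parents[rng.integers(5)])
                              for _ in range(20)]

  best = max(population, key=fitness)
  print(np.round(forward(best, X), 2))  # should approach [0, 1, 1, 0]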


Thanks eggy, awesome replies. You should write some of your experiences down if you find the time.


Evolutionary algorithms and genetic programming are global optimization techniques, basically random search with some memory. They're not "out of fashion" any more than simulated annealing or Monte Carlo methods are. They have limited usability, that's all.



