
A GA sounds like exactly the wrong way to go about this. I think you could do much better with a heuristic algorithm than a GA in this case, and I think GAs (like neural networks) are very much overused just because they mimic natural processes.

That said, I haven't played SC2, and I don't know what sort of complexity the building problem has.




I can't quite understand if you're saying GAs are neural networks, or just that both of them are overused. Either way, one of the big problems is that the heuristics for a heuristic algorithm must be designed with human insight. In this case, and in many other cases where GAs have been applied successfully, a fundamentally new insight is discovered by the algorithm.

I do play some SC2. There are a bunch of built-in AIs that use heuristic algorithms, yet they are all easily defeated. The GA in this case is really about searching through an immense space to find good candidates.

To give you a feel for the complexity of the building problem, consider this. There are custom maps where all you do is practice your initial build order, with absolutely no opponent. It is pretty well accepted within the SC2 community that you should use these practice maps in order to get better. Another way to think about it is that this build order will probably cause Blizzard to tweak the game due to how fast the roaches come out. This game has been in development for 10 years, yet there are interactions in the building process that are not understood by the developers themselves.
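To make the "immense space" point concrete, the skeleton of such a search looks roughly like this. The action names and the fitness placeholder are toys for illustration, not the project's actual code, which scores builds by simulating the game's economy:

    import random

    # Toy action set; a real optimizer tracks minerals, gas, larvae, supply, etc.
    ACTIONS = ["drone", "overlord", "spawning_pool", "roach_warren", "roach"]

    def fitness(build):
        # Placeholder score: reward roaches that appear after a roach warren.
        have_warren, score = False, 0
        for step in build:
            have_warren = have_warren or step == "roach_warren"
            if step == "roach" and have_warren:
                score += 1
        return score

    def mutate(build):
        b = list(build)
        b[random.randrange(len(b))] = random.choice(ACTIONS)
        return b

    def crossover(a, b):
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def evolve(pop_size=100, generations=200, length=30):
        pop = [[random.choice(ACTIONS) for _ in range(length)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 5]    # keep the best 20%
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(pop_size - len(parents))]
            pop = parents + children
        return max(pop, key=fitness)

All of the domain knowledge ends up inside fitness(); the loop itself is nearly boilerplate.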


> I can't quite understand if you're saying GAs are neural networks, or just that both of them are overused.

I was saying that both are overused.

> one of the big problems is that the heuristics for a heuristic algorithm must be designed with human insight

By "heuristics" I don't strictly mean "rule-based", but you can do a good job in reducing the search space by a large amount if you program in some common sense that will discard obviously bad moves.

It would be surprising if the devs understood all the interactions, as that would mean the search space is trivial. SC2 is not that much different from chess, and, to my knowledge, the best chess-playing AIs don't use genetic algorithms. They get much further with trees and precalculated opening and endgame moves.
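To show what I mean by "trees plus heuristics", a minimal depth-limited minimax sketch looks something like the following. The game model is passed in as plain functions, and the prune() argument is where the "common sense" filtering of obviously bad moves would go; the subtraction game at the bottom is only a toy to exercise the interface:

    # Depth-limited minimax. evaluate(state, to_move_is_max) scores a leaf from the
    # maximizing player's point of view; prune() discards obviously bad moves early.
    def minimax(state, depth, maximizing, legal_moves, apply_move, evaluate, prune):
        moves = [m for m in legal_moves(state) if not prune(state, m)]
        if depth == 0 or not moves:
            return evaluate(state, maximizing), None
        best_val = float("-inf") if maximizing else float("inf")
        best_move = None
        for m in moves:
            val, _ = minimax(apply_move(state, m), depth - 1, not maximizing,
                             legal_moves, apply_move, evaluate, prune)
            if (maximizing and val > best_val) or (not maximizing and val < best_val):
                best_val, best_move = val, m
        return best_val, best_move

    # Toy game: remove 1-3 from a pile of 10, taking the last item wins.
    value, move = minimax(
        state=10, depth=10, maximizing=True,
        legal_moves=lambda n: [k for k in (1, 2, 3) if k <= n],
        apply_move=lambda n, k: n - k,
        evaluate=lambda n, maxing: 0 if n else (-1 if maxing else 1),
        prune=lambda n, k: False,   # no pruning needed for the toy
    )

In a real engine, evaluate() is the hand-tuned positional heuristic and prune() holds the move-ordering and forward-pruning rules.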


Indeed. Though the crucial difference is that chess is a game of complete information, while StarCraft hides what your enemy is doing. It's much harder to do tree search without complete information.

As a comparison, look at the recent progress made in computer Go that employs Monte Carlo simulations. (And Genetic Algorithms are closer to Monte Carlo than to tree searches.)
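Roughly, the Monte Carlo idea replaces the hand-written evaluation with statistics from random playouts, which is part of why it copes better when you can't see the whole position. A bare-bones sketch, with the game interface again passed in as functions:

    def monte_carlo_move(state, legal_moves, apply_move, random_playout, n_playouts=200):
        # Pick the move whose random playouts win most often. random_playout(state)
        # plays random legal moves to the end and returns 1 for a win, 0 for a loss.
        best_move, best_rate = None, -1.0
        for m in legal_moves(state):
            wins = sum(random_playout(apply_move(state, m)) for _ in range(n_playouts))
            if wins / n_playouts > best_rate:
                best_rate, best_move = wins / n_playouts, m
        return best_move

Stronger Go programs wrap this in a tree (UCT), but even the flat version shows the "sample and keep what scores well" flavour that makes it closer to a GA than to exhaustive minimax.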


I believe it's only harder when you want to optimise for the outcome (winning), since it's hard to know what your opponent will do if you can't see them. However, if you optimise for total offensive/defensive power, I think you can do quite well with trees!


Not only trees but also linear or integer programming approaches. (That's where my theoretical background lies, by the way.)


Chess has a much smaller board, turn-based play, and no unit production or races, so it's a bad comparison.


> By "heuristics" I don't strictly mean "rule-based", but you can do a good job in reducing the search space by a large amount if you program in some common sense that will discard obviously bad moves.

I believe that the fitness function used for evaluating build orders does this. It is very difficult to accurately evaluate how good a specific strategy is at a specific point in time, especially when you have imperfect information about your enemy. The code used to decide which build orders survive almost certainly uses heuristics.
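As a rough illustration of what I mean, the fitness function for a build order can itself be a pile of heuristics rather than a full game simulation. Every cost, income figure and penalty below is invented for illustration; the real project models the economy far more carefully (and I'm ignoring gas entirely):

    # Invented costs and supply values, roughly SC2-flavoured but not accurate.
    COST   = {"drone": 50, "overlord": 100, "roach_warren": 150, "roach": 75}
    SUPPLY = {"drone": 1, "overlord": -8, "roach_warren": 0, "roach": 2}

    def build_order_fitness(build, goal_unit="roach", goal_count=7):
        minerals, income = 50, 6          # 6 starting workers, 1 mineral each per step
        supply_used, supply_cap = 6, 10
        made, finished_at = 0, None
        for t, step in enumerate(build):
            minerals += income
            blocked = supply_used + max(SUPPLY[step], 0) > supply_cap
            if minerals < COST[step] or blocked:
                return -1000 + t          # heuristic: throw away impossible builds early
            minerals -= COST[step]
            if SUPPLY[step] < 0:
                supply_cap -= SUPPLY[step]    # overlords grant supply
            else:
                supply_used += SUPPLY[step]
            if step == "drone":
                income += 1               # workers raise income
            if step == goal_unit:
                made += 1
                if made == goal_count and finished_at is None:
                    finished_at = t
        return 1000 - finished_at if finished_at is not None else made

Every number and cutoff in there is a judgement call, which is the sense in which the survival test is heuristic even if the outer loop is "just" a GA.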


Yes, but that's the fitness function you want to maximise (i.e. the end result you will get with each "individual"). I'm saying that you might want to use heuristics in maximising that, rather than a GA.


This AI isn't intended to play the game, however; by analogy with chess, it is intended to determine the best opening to use when you have no knowledge of your opponent's moves. Hence minimax trees make no sense (since there are no "turns"), and we can't use precalculated moves, since those are what we're trying to figure out.


"one of the big problems is that the heuristics for a heuristic algorithm must be designed by with human insights."

Note that this isn't really true--the point of Lenat's Eurisko system referenced upthread was to have it come up with new heuristics and measure how effective they were.


That's what I was thinking too, but my AI knowledge is a bit rusty. A genetic algorithm is a search algorithm that finds a solution among a set of candidates, for example finding an answer to a sudoku. A GA can also be used for optimization, for example arranging the structure of a bridge to maximize the load it can support.

The StarCraft build order as defined here is a planning problem, that is, a problem whose goal is to find a sequence of actions that gets to a desired state. There are algorithms better suited than a GA because they take advantage of the logical structure of the problem.

Do I get that right?
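To illustrate what I mean by taking advantage of the structure: a planner searches forward over states and only expands actions whose preconditions hold. The facts and preconditions below are made up purely for illustration:

    from collections import deque

    # Toy STRIPS-style actions: (preconditions, effects). Invented for illustration.
    ACTIONS = {
        "build_spawning_pool": ({"have_drone"}, {"have_pool"}),
        "build_roach_warren":  ({"have_pool"}, {"have_warren"}),
        "morph_roach":         ({"have_warren", "have_larva"}, {"have_roach"}),
    }

    def plan(start, goal):
        # Breadth-first search over fact sets: shortest action sequence reaching
        # the goal, never expanding an action whose preconditions don't hold.
        queue, seen = deque([(frozenset(start), [])]), set()
        while queue:
            state, steps = queue.popleft()
            if goal <= state:
                return steps
            if state in seen:
                continue
            seen.add(state)
            for name, (pre, add) in ACTIONS.items():
                if pre <= state:
                    queue.append((state | add, steps + [name]))
        return None

    print(plan({"have_drone", "have_larva"}, {"have_roach"}))
    # ['build_spawning_pool', 'build_roach_warren', 'morph_roach']

A GA has to rediscover those orderings by mutation and luck; a planner gets them for free from the preconditions. Whether that scales to the full resource and timing model is a separate question.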


I can't tell you if it's "right", as I haven't worked on this particular problem, and ML is hardly black-and-white. GAs might give good results; it's just that there is usually another algorithm that will give better results faster.

Again, it might work very well, I just think that some sort of tree-based approach might be faster and more efficient.


> Again, it might work very well, I just think that some sort of tree-based approach might be faster and more efficient.

While this is true in general, sometimes the nice thing about genetic methods is that pretty much all you have to do is write the fitness function, cross your fingers that the problem is a good fit for the method and go do some real work on another computer for a while.

Oftentimes other methods require more in the way of setup or planning, not the least of which is actually picking an appropriate method and mapping its implementation onto the problem space.

Some of the nastiest problems I've ever worked on have gotten that way because I went with a domain specific approach that was "optimized for the problem"; while the end results are usually very good (and, to be fair, run on the computer from zero-to-solved in no time flat), in a couple cases I've gone back and tried the "brute force" genetic programming approach, and though it took quite a bit more computer time, the programming effort was substantially smaller to end up with similarly good results.

The real problem, though, is that many of the problems people apply GA/GP to aren't well suited to the genetic approach, usually because the problem itself doesn't lend itself to incrementally improving solutions (problems like function regression can be tricky, because getting close to the correct functional form symbolically usually leaves you very far away from the correct form numerically, and sometimes the correct result is actually surrounded by a "wall" of completely and utterly unfit solutions that's very hard to break through). Genetic methods work well when the fitness landscape has lots of solutions that work fairly well, and quite a few that work great, not when there's literally one needle in some super-multi-dimensional haystack.

To be fair, the StarCraft optimization problem is probably somewhere in the middle: there are probably many good solutions to the problem, and many of these will be minor variations on other good solutions, so it's pretty likely that a GA will get to some good ones. But the search space is small enough that you're probably right: a heuristic-guided direct search would be more likely to pick out the best solutions faster.


Yep, exactly right. I'm just going by the fact that they probably need to optimize for running time first of all, as this will need to run in real time (or so I assume). GAs would make this very hard, even if the results are good enough.



