So the Genetic Algorithm (GA, since I'm a lazy immediately-post-college student) is an optimization tool. It's capable of solving all kinds of really difficult problems. The GA is best used for problems where the best answer can't be found by traditional means in a reasonable amount of time. Thus, when using the Genetic Algorithm, it's best to be looking for a good-enough solution.

"Solutions," and why that's in bunny ears:

Imagine you're trying to get from LA to NY. There's a lot of ways you could get there, or "solutions" to your problem. Different airlines, different connections, first class vs. coach; the list goes on and on. If you ask someone how to get from LA to NY, you'll get a myriad of answers, so the problem isn't finding a solution, it's whether the solution you find is any good. It's not a good plan to book a first class non-stop flight when you only have $45 to spend, and if you're on a deadline, you'll need to get there without a three-day layover in Cleveland. Especially if you're allergic to Cleveland. I heard about a guy once….

Anyways, there's a lot of different variables that go into choosing the best way for you to get to NY. Now imagine this as a graph theory problem: LA and NY are at different ends of the graph, and all the different cities you could stop in on your way there are points on the graph in between the start and the destination. The edges in the graph represent where you can go (probably completely interconnected; every point connected to every other point). Now imagine that on each of these edges there's a weight or cost (associated with the airline ticket cost, for our example problem). Since the graph is fully connected, there's a lot of ways to get from LA to NY, but which is best (AKA cheapest)?
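If you like seeing things in code, here's one way you might set that graph up. This is just a rough Python sketch with completely made-up city names and ticket prices, not anything resembling real airfare data:

    # A toy version of the LA-to-NY graph: every city connects to every other
    # city, and each edge carries a (made-up) ticket price.
    _PRICES = {
        ("LA", "Denver"): 120, ("LA", "Chicago"): 210, ("LA", "Cleveland"): 180,
        ("LA", "NY"): 450, ("Denver", "Chicago"): 90, ("Denver", "Cleveland"): 150,
        ("Denver", "NY"): 260, ("Chicago", "Cleveland"): 60, ("Chicago", "NY"): 140,
        ("Cleveland", "NY"): 110,
    }
    # Flights go both ways, so mirror every entry.
    COSTS = {**_PRICES, **{(b, a): p for (a, b), p in _PRICES.items()}}

    def path_cost(path):
        """Total ticket price along a path like ["LA", "Denver", "NY"]."""
        return sum(COSTS[(a, b)] for a, b in zip(path, path[1:]))

    print(path_cost(["LA", "Cleveland", "NY"]))  # 180 + 110 = 290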

Graph Theory and Why It Sucks So Bad

Graph problems, like the one described above, are notorious for being a royal pain. If you were to do the math a little, you would notice that the number of possible paths blows up at an alarming rate (factorially, in a fully connected graph) as you add points. These problems typically fall into a category called NP-Complete. The exact definition of an NP-Complete problem is several pages long, but it goes something like this:

A problem is NP-Complete iff it's in NP (a proposed solution can be checked quickly) and every other problem in NP can be reduced to it in polynomial time. In practice, that means nobody has found a polynomial time algorithm to find the optimal solution (the least-cost path from above), and most people suspect nobody ever will.

A lot of optimization problems (problems with more than one feasible solution, and possibly more than one optimal solution) also fall into this class, or into a related class called NP-hard, which roughly means the problem is at least as hard as the NP-Complete ones but hasn't been shown to meet the full definition (it may not even be in NP). The typical way to prove a problem NP-Complete is to take an already-proven NP-Complete problem and reduce it, in polynomial time, to a form of your problem.

On to GA, already…

So, GA is a handy-dandy little tool for solving lots of problems like these. First, you randomly generate a bunch of "solutions" (which, for the sake of my math profs, I'll now call feasible solutions) and call them your population. Then you'll rank each of these members of your population.
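In code, the "generate a population" step might look something like this rough sketch (again Python, again made-up names; a feasible solution here is just LA, a random set of stops in a random order, then NY):

    import random

    STOPS = ["Denver", "Chicago", "Cleveland"]   # the intermediate cities

    def random_path():
        """One feasible solution: LA, a random selection of stops, then NY."""
        stops = random.sample(STOPS, k=random.randint(0, len(STOPS)))
        return ["LA"] + stops + ["NY"]

    def initial_population(size=50):
        return [random_path() for _ in range(size)]

    population = initial_population()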

Ranking, Elitism at its Most Useful
The ranking system can be very simple, or very complicated. For our example above, the simple version would be to add up the costs of the edges in our path from LA to NY. This is known as a single-objective GA. A more complicated version might also score travel time, in which case the GA is attempting to minimize ticket price and travel time at the same time. This is known as a multiple-objective GA (MOGA).
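Here's a quick sketch of the ranking step, assuming you already have a scoring function like the path_cost helper from the earlier sketch. The travel_time function is purely hypothetical; I'm only showing where a second objective would plug in:

    def rank(population, fitness):
        """Sort feasible solutions best-first (lowest score wins)."""
        return sorted(population, key=fitness)

    # Single-objective: just the ticket price.
    # ranked = rank(population, path_cost)

    # A lazy stand-in for the multi-objective case: weight price against a
    # hypothetical travel_time() function. (A real MOGA uses Pareto ranking
    # instead of a fixed weighted sum, but that's a post for another day.)
    # ranked = rank(population, lambda p: path_cost(p) + 2.0 * travel_time(p))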

Now that we've ranked our population, assigned each member a value, and put it in its proper place, we need to use that information to our advantage. It's time to start the breeding.

Mating, as dirty as your nerdy mind wants it to be

There's a lot of algorithms for mating, but they all essentially do the same thing: take two members of your population (parents), and make two more (children). As in biological sexual mating, the children will share the traits of both parents.

Depending on the type of problem, you can make your mating algorithms simpler or more complex, but there are some industry standards: simple crossover, which takes a random combination of genes from each of the parents; and blended crossover, which takes the two sets of genes and blends them with a random weight (which works well because sometimes kids look more like their dad than their mom, which can be really unfortunate for little Sally with the hairy back).
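Here's roughly what those two look like in Python. One assumption worth flagging: these operate on fixed-length lists of numeric genes, not the variable-length city paths from earlier (paths need a smarter crossover with a repair step, which I'm skipping):

    import random

    def simple_crossover(mom, dad):
        """Each child gene comes from one parent or the other, chosen at random."""
        child1, child2 = [], []
        for m, d in zip(mom, dad):
            if random.random() < 0.5:
                child1.append(m)
                child2.append(d)
            else:
                child1.append(d)
                child2.append(m)
        return child1, child2

    def blended_crossover(mom, dad):
        """Each child gene is a random weighted blend of the parents' genes."""
        w = random.random()
        child1 = [w * m + (1 - w) * d for m, d in zip(mom, dad)]
        child2 = [(1 - w) * m + w * d for m, d in zip(mom, dad)]
        return child1, child2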

Matching, dating for bits

The key part of dating for a bit is putting your best foot forward. Unfortunately, since it's been a long time since RAM manufacturers included the "foot" option (though SanDisk is getting back to it with their new MP3 players), that's not always as simple as it sounds. So we fall back on simpler, more unfortunate methods. Each of these algorithms has its benefits and drawbacks (both real and theoretical), but I won't address them here. If you read between the lines, you can probably work out my preferences. I'm not a subtle writer.

Random Pairing
I really don't feel the need to go too in depth on this one. It's random. Deal with it.
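Fine, here's a sketch anyway, since it fits in about four lines:

    import random

    def random_pairs(population):
        """Shuffle everybody and pair them off two at a time."""
        shuffled = population[:]
        random.shuffle(shuffled)
        return list(zip(shuffled[::2], shuffled[1::2]))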

Best-First
Again, kinda simplistic. Take the two best and mate them, then the third and fourth, the fifth and sixth, and so on. Every generation, you end up with a new set of super-jocks at the top, and down at the bottom, your unfortunate group that has epilepsy with a side of polio.
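In code, assuming the population is already sorted best-first (like the rank() sketch above produces):

    def best_first_pairs(ranked):
        """Pair 1st with 2nd, 3rd with 4th, and so on down the list."""
        return list(zip(ranked[::2], ranked[1::2]))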

Tournament Pairing
Create a tournament bracket, NCAA March Madness style. Say each member has a certain percent chance of winning each round of its bracket, based on how its rank compares to its opponent's. Each member draws a random number between 0 and its percentage, and whichever member draws the larger number moves on to the next round. The better-ranked members always have a better chance of winning, but statistically, the lower-ranked members still have a shot at being that Cinderella team.
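Here's one way that bracket might look in Python. Fair warning: the percent-chance formula below is my own made-up reading of the scheme (a better score gets a bigger slice of the win chance), and it assumes scores are positive costs where lower is better:

    import random

    def tournament_winner(bracket, fitness):
        """Single-elimination bracket; the favorite usually wins, but upsets happen."""
        round_ = bracket[:]
        while len(round_) > 1:
            next_round = []
            for a, b in zip(round_[::2], round_[1::2]):
                fa, fb = fitness(a), fitness(b)
                chance_a = fb / (fa + fb)      # lower cost => bigger chance
                chance_b = fa / (fa + fb)
                # Each member draws between 0 and its chance; bigger draw advances.
                if random.uniform(0, chance_a) > random.uniform(0, chance_b):
                    next_round.append(a)
                else:
                    next_round.append(b)
            if len(round_) % 2 == 1:
                next_round.append(round_[-1])  # odd one out gets a bye
            round_ = next_round
        return round_[0]

    # To pick a mating pair, run two brackets over random slices of the population:
    # parents = (tournament_winner(random.sample(population, 8), path_cost),
    #            tournament_winner(random.sample(population, 8), path_cost))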

Plutonium in the Water
Now, I don't actually remember if plutonium is one of those horribly feared elements that'll cause you to grow a third limb if you look at it cross-eyed, but everyone knows what I'm talking about.

Every family has that one red-headed kid whose red hair nobody can explain. Everyone suspects the milkman, because he's always been a little off and never looks mom in the eye, but maybe mom's a standup lady and it's just a random mutation. It can happen. Really.

After you've finished mating, the idea is to go through and randomly mutate a few genes, just to keep things interesting. There are actually a lot of reasons for mutation (the big one being that it keeps your whole population from converging on the same mediocre answer), which I will address at a later date, but this will do for now.
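A sketch of the mutation step, again assuming numeric genes (for the city-path encoding you'd swap, insert, or drop a random stop instead). The rate and the size of the nudge are made-up defaults, not recommendations:

    import random

    def mutate(genes, rate=0.01):
        """Occasionally nudge a gene, just to keep things interesting."""
        return [g + random.gauss(0, 1) if random.random() < rate else g
                for g in genes]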

Lather, rinse, repeat.

Now that you've done all this, it is highly recommended that you do it again. And again. And again. Thousands of times, actually. Each of these iterations is called a generation, and (typically) the more generations, the better your answer gets. That isn't entirely true, but again, that discussion goes beyond the scope of a first-time reader.
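And here's the whole lather-rinse-repeat loop as one self-contained toy: a GA hunting for a list of ten numbers with the smallest possible sum of squares. Every parameter (population size, mutation rate, number of generations) is an arbitrary pick for illustration, not a recommendation:

    import random

    def fitness(genes):
        return sum(g * g for g in genes)               # lower is better

    def crossover(mom, dad):
        w = random.random()                            # blended crossover
        return [w * m + (1 - w) * d for m, d in zip(mom, dad)]

    def mutate(genes, rate=0.05):
        return [g + random.gauss(0, 0.1) if random.random() < rate else g
                for g in genes]

    population = [[random.uniform(-10, 10) for _ in range(10)] for _ in range(60)]

    for generation in range(1000):                     # lather, rinse, repeat
        ranked = sorted(population, key=fitness)       # rank them
        parents = ranked[:30]                          # keep the super-jocks
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(30)]                # mate, then mutate
        population = parents + children

    best = min(population, key=fitness)
    print(fitness(best))                               # prints something small-ish if all went well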

Now that everyone has a good idea of what GA is all about, I can continue my fireside chats about my research. Until next time…
