Björn Kindler

Introduction to Evolutionary Algorithms

"I have called this principle, by which each slight variation, if useful, is preserved, by the term Natural Selection." - Charles Darwin, "The Origin of Species"

The following paragraphs are meant to give a short introduction to the general functionality of evolutionary algorithms. Note that they cannot provide an exhaustive discussion of this very interesting topic. For more information on evolutionary and genetic algorithms please check the related section. Also, you should definitely consider checking the Evolvable Hardware project of our group later on to learn more about another cool application of evolutionary algorithms.

An evolutionary algorithm processes a whole group of individuals, the so-called population. In our case, each individual is a neural network that is to be used for a given application. Within the population, all individuals are represented by their genome which is simply a data structure containing all parameters of the respective individual that have to be tuned by the evolutionary algorithm. For a neural network with a fixed structure, the genome would typically contain the values of all synaptic weights.

The information about an individual can be stored in various ways. Usually, the parameters are arranged in a fixed order to form a linear string. Sometimes, the parameters are distributed among several such strings, which are then referred to as chromosomes. The smallest part of the genome is called a gene; in our case, a gene represents one weight of the network.
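As a minimal sketch (the function name and weight range are illustrative, not from the original), such a genome can be stored as a flat list of weight values, one gene per synaptic weight:

```python
import random

def random_genome(n_weights):
    """A genome: a flat list of genes, one per synaptic weight."""
    return [random.uniform(-1.0, 1.0) for _ in range(n_weights)]

genome = random_genome(6)  # genome for a network with six weights
```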

Each individual can be assigned a fitness value. This value serves as a quantitative measure of the performance of this individual on the task in question. For neural networks, the fitness reflects the correctness of the network's response to the training data. During the execution of the evolutionary algorithm, the individuals in the population are repeatedly subject to a selection procedure which models the principles of natural evolution. During the selection step, individuals with low fitness are discarded from the population while those with high fitness are allowed to produce offspring to fill the free space in the population.
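One simple selection scheme consistent with this description is truncation selection, sketched below (the helper name and the toy fitness function are illustrative assumptions, not from the original):

```python
def select(population, fitness, keep_fraction=0.5):
    """Truncation selection: individuals with low fitness are
    discarded; the fittest survive and may later produce offspring
    to refill the population."""
    ranked = sorted(population, key=fitness, reverse=True)
    n_keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:n_keep]

# Toy fitness: a genome whose genes sum to a larger value is fitter
pop = [[0.1, 0.2], [0.9, 0.8], [0.4, 0.1], [0.7, 0.6]]
survivors = select(pop, fitness=sum)  # keeps the two fittest genomes
```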

Offspring are produced by making copies of the two mating individuals' genomes, splitting them at random points, and exchanging some of their parts. This crossover step yields two genomes that contain the characteristics of the parents in a new combination. An additional source of variation can be introduced in the form of mutations: single genes of a genome are randomly changed with a small probability.
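These two operators can be sketched as follows; this version uses a single split point and Gaussian perturbation of genes, which are simplifying assumptions (the original text allows for multiple split points and does not specify how a mutated gene is changed):

```python
import random

def crossover(parent_a, parent_b):
    """One-point crossover: copy both genomes, split them at a
    random point, and exchange the tails."""
    point = random.randint(1, len(parent_a) - 1)
    return (parent_a[:point] + parent_b[point:],
            parent_b[:point] + parent_a[point:])

def mutate(genome, rate=0.05, scale=0.1):
    """Randomly perturb single genes with a small probability."""
    return [g + random.gauss(0.0, scale) if random.random() < rate else g
            for g in genome]
```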

The evolutionary algorithm starts off with the creation of an initial population, which is usually done by generating individuals with random genes. This first generation is then evaluated to obtain the fitness of each individual. In our case, this is done by implementing the networks on the chip and applying the training data of the task in question. The first generation is expected to show only poor performance, but nevertheless some of the networks are better than others. Using the individual fitness values, the selection scheme together with the crossover and mutation operators produces a new generation. Some of the individuals remain unchanged while the bad ones are replaced by the offspring of the better ones. The size of the population is usually kept constant and is typically on the order of 20.

Starting with the fitness evaluation of the new individuals, the algorithm repeats the above steps to continually create new generations of networks from the previous ones. With every new generation, the average fitness of the individuals in the population - i.e., the performance of the networks - improves. In the best case, one of the individuals eventually evolves to completely fulfill the requirements of the investigated task.
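Putting the steps together, a minimal generational loop might look like the sketch below. The population size, mutation parameters, and the toy fitness function are illustrative assumptions; in the actual setup, the fitness evaluation would run the network on the chip instead:

```python
import random

def evolve(fitness, n_genes, pop_size=20, generations=50):
    """Minimal generational loop: evaluate, keep the fitter half,
    refill the population with mutated crossover offspring."""
    pop = [[random.uniform(-1, 1) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)      # evaluate and rank
        survivors = pop[:pop_size // 2]          # selection
        offspring = []
        while len(survivors) + len(offspring) < pop_size:
            a, b = random.sample(survivors, 2)   # two mating parents
            point = random.randint(1, n_genes - 1)
            child = a[:point] + b[point:]        # one-point crossover
            child = [g + random.gauss(0.0, 0.1)  # occasional mutation
                     if random.random() < 0.1 else g
                     for g in child]
            offspring.append(child)
        pop = survivors + offspring
    return max(pop, key=fitness)

# Toy task: evolve a genome whose genes sum as close to zero as possible
best = evolve(lambda g: -abs(sum(g)), n_genes=5)
```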

Even if a specific problem cannot be solved completely, evolutionary algorithms are valued for their ability to find good solutions where other optimization algorithms get stuck with suboptimal results. Since our network chip allows a neural network to be implemented and tested within milliseconds, the simulated evolution does not need billions of years to yield the desired results but can successfully solve problems within minutes or even seconds.