
9.4 Genetic Algorithms

 

Genetic algorithms (GAs) [Holland 1975, Goldberg 1989] are an emerging class of heuristic procedures for global optimization. Inspired by the principle of natural evolution in the biological world, these procedures simulate an evolutionary process in which the individuals with the highest goodness-of-fit (fitness) values survive.

  
Figure 9.6: A standard genetic algorithm.

Fig. 9.6 illustrates a standard genetic algorithm for unconstrained combinatorial optimization. (A special coding of solutions into chromosomes is needed when the GA is used for (piecewise) continuous optimization problems.) Initially, a number n (say, n=50) of individuals are generated, yielding a population P. An individual is determined by its chromosome f, which is a string of m genes (labels), f = (f_1, ..., f_m).
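As an illustration, the following minimal Python sketch shows one possible encoding of individuals and the construction of the initial population; the problem size m, the label set L, and names such as random_individual are illustrative assumptions, not part of the algorithm in Fig. 9.6.

```python
import random

# Illustrative problem setup (assumed, not from the text): m sites, each
# gene (label) drawn from a discrete label set L.
M_GENES = 20                      # m: length of the chromosome
LABELS = list(range(5))           # L: the set of admissible labels
POP_SIZE = 50                     # n = 50 individuals, as in the example above

def random_individual():
    """A chromosome f = (f_1, ..., f_m): a string of m genes (labels)."""
    return [random.choice(LABELS) for _ in range(M_GENES)]

# The initial population P of n randomly generated individuals.
population = [random_individual() for _ in range(POP_SIZE)]
```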

At each step of the evolution, two individuals are randomly selected for mating. The selection is done according to a scheme which favors the fitter individuals (those with lower E values): the lower the value of E(f), the more likely f is to be selected. For example, f may be selected with a probability proportional to its fitness, or to a monotonically increasing function of the fitness, assuming the fitness is non-negative. There are numerous selection schemes [Goldberg 1989].
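A possible fitness-proportionate scheme is sketched below. It assumes the fitness is taken as exp(-E(f)), one non-negative, monotonically decreasing transform of the energy (any scheme favoring lower E would do); the name energy stands for a problem-specific evaluation of E(f).

```python
import math
import random

def select_pair(population, energy):
    """Fitness-proportionate selection of two parents.

    `energy` computes E(f), the objective to be minimized.  The fitness used
    here, exp(-E(f)), is an illustrative non-negative, decreasing transform
    of E: the lower E(f), the more likely f is to be selected.
    """
    weights = [math.exp(-energy(f)) for f in population]
    return random.choices(population, weights=weights, k=2)
```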

Recombination then takes place between the two selected individuals, typically through the mechanisms of crossover and mutation. Fig. 9.7 illustrates these two basic operations for chromosomes whose genes can take any alphanumeric value. Crossover takes two individuals, cuts their chromosome strings at a randomly chosen position, and recombines the opposite segments to create two offspring. Crossover is not always invoked; it is applied with a probability typically between 0.6 and 1. If it is not applied, the offspring are simply duplicates of the selected individuals. Mutation is applied to each offspring after crossover: it randomly alters a randomly chosen gene with a small probability, typically 0.001. As usual in GA practice, there are many variants of the crossover and mutation operators.
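The operators of Fig. 9.7 might be coded as below: single-point crossover applied with probability 0.9 (within the typical 0.6 to 1 range) and per-gene mutation with probability 0.001. The constants and function names are illustrative choices, not prescribed by the text.

```python
import random

P_CROSSOVER = 0.9      # crossover probability, typically between 0.6 and 1
P_MUTATION = 0.001     # per-gene mutation probability, typically 0.001

def crossover(parent1, parent2):
    """Single-point crossover: cut both chromosome strings at a randomly
    chosen position and recombine the opposite segments; if crossover is
    not invoked, return duplicates of the parents."""
    if random.random() < P_CROSSOVER:
        cut = random.randrange(1, len(parent1))
        return parent1[:cut] + parent2[cut:], parent2[:cut] + parent1[cut:]
    return parent1[:], parent2[:]

def mutate(individual, labels):
    """Alter each gene, with small probability, to a randomly chosen label."""
    return [random.choice(labels) if random.random() < P_MUTATION else gene
            for gene in individual]
```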

  
Figure 9.7: The two basic operations in recombination: crossover and mutation.

The offspring are then added to P, and the two least fit individuals, i.e. those with the highest E values, are removed from the population. As the evolution continues in this way, the fitness of the best individual, as well as the average fitness, increases towards the global optimum. Convergence is judged by uniformity: a gene is said to have converged when most (say 95%) of the population share the same value, and the population is said to have converged when all the genes have converged. Convergence of GAs is observed empirically, but no theoretical proof is available.
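Putting the pieces together, a steady-state loop with the uniformity-based convergence test could look as follows. This sketch reuses the select_pair, crossover, and mutate functions sketched above; energy and max_steps are assumed, problem-specific choices.

```python
from collections import Counter

def gene_converged(population, position, share=0.95):
    """A gene has converged when most (here 95%) of the population share
    the same value at that position."""
    counts = Counter(f[position] for f in population)
    return counts.most_common(1)[0][1] >= share * len(population)

def population_converged(population):
    """The population has converged when all its genes have converged."""
    return all(gene_converged(population, i)
               for i in range(len(population[0])))

def evolve(population, energy, labels, max_steps=100000):
    """Steady-state GA: select, recombine, add the offspring, remove the two
    highest-E (least fit) individuals, until the population is uniform."""
    for _ in range(max_steps):
        p1, p2 = select_pair(population, energy)
        c1, c2 = crossover(p1, p2)
        population = population + [mutate(c1, labels), mutate(c2, labels)]
        population.sort(key=energy)       # lower E = fitter
        population = population[:-2]      # drop the two least fit
        if population_converged(population):
            break
    return min(population, key=energy)    # best individual found
```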

Impressive empirical results have been reported for solving real and combinatorial, unconstrained and constrained optimization problems, such as the traveling salesman problem, neural network optimization, and scheduling (see [Goldberg 1989] for a review). Applications in computer vision have also appeared [Bhanu et al. 1989, Ankenbrandt et al. 1990, Hill and Taylor 1992]. Currently, there is no complete theory which explains why GAs work; however, some hypotheses exist, among them the schema theorem [Holland 1975] and the building block hypothesis [Goldberg 1989].