Self-adaptive Evolutionary Algorithms
Per Kristian Lehre
(This page describes results in the paper: Duc-Cuong Dang, Per Kristian Lehre, Self-adaptation of Mutation Rates in Non-elitist Populations. In Proceedings of Parallel Problem Solving from Nature - PPSN XIV, volume 9921 of Lecture Notes in Computer Science, pp. 803-813. DOI 10.1007/978-3-319-45823-6_75, arXiv:1606.05551.)
Introduction
Evolutionary algorithms are robust, nature-inspired optimisation algorithms. But these algorithms come with many parameters, such as the mutation rate, and finding appropriate settings for these parameters via trial and error is often time-consuming and costly. Is it possible to get rid of these parameters by making the algorithm evolve its own parameter settings? This may appear to be a far-fetched idea. However, it is interesting to note that genes modifying the rate of mutation occur in real biological organisms, such as the bacterium E. coli (see, e.g., LeClerc et al., Science 1996).
In a paper co-authored with Duc-Cuong Dang and presented at the PPSN 2016 conference in Edinburgh, we prove that an evolutionary algorithm which self-adapts its mutation rate can be exponentially faster than static evolutionary algorithms. The time-complexity analysis, which relies on the so-called level-based theorem, is quite involved. This page therefore provides an interactive visualisation of the ideas and intuition behind the analysis. The contribution of the paper is not the idea of self-adaptation itself, but the mathematical proof that it can be beneficial.
Description of Algorithm
The self-adaptive evolutionary algorithm has a population of individuals. In every generation, a new population of individuals is produced, and the old generation dies. Each new individual is produced in the same way. The figure below outlines how the algorithm produces a new individual from an existing population. Individuals reproduce asexually, akin to bacteria, and the chromosome of every individual carries an additional gene that encodes a mutation rate. The better of two uniformly selected individuals is chosen as the parent (i.e., binary tournament selection). With a fixed probability p, the parent mutates its mutation-rate gene. The parent then replicates, with mutations occurring at the new mutation rate. Although the selection mechanism does not take the mutation rate into account, the intuition is that appropriate mutation rates become correlated with high fitness.
We assume that the algorithm has the choice between two mutation rates, a low mutation rate \(\chi_{\text{low}}\) and a higher mutation rate \(\chi_{\text{high}}\). (Please see the paper for the actual values of these parameters.) For comparison, we also consider algorithms that choose mutation rate \(\chi_{\text{low}}\) with probability 1 (always low mutation rate), with probability 0 (always high mutation rate), or with probability 1/2 (uniform mixing of mutation rates).
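The Python sketch below illustrates one reading of this offspring-creation procedure. It is not code from the paper: the function name `make_offspring`, the representation of individuals as `(bitstring, rate)` pairs, and the assumption that mutating the rate gene means switching to the other of the two available rates are all illustrative choices.

```python
import random

def make_offspring(population, fitness, p_adapt, chi_low, chi_high, n):
    """Create one offspring from a population of (bitstring, mutation-rate) pairs.

    Illustrative sketch; names and the exact rate-mutation rule are assumptions.
    """
    # Tournament selection: pick two individuals uniformly at random
    # (with replacement) and keep the fitter one as the parent.
    a = random.choice(population)
    b = random.choice(population)
    parent_bits, parent_rate = max(a, b, key=lambda ind: fitness(ind[0]))

    # With probability p_adapt, mutate the rate gene; with only two rates
    # available, this is assumed to mean switching to the other rate.
    rate = parent_rate
    if random.random() < p_adapt:
        rate = chi_high if parent_rate == chi_low else chi_low

    # Replicate the parent, flipping each bit independently with
    # probability rate / n (standard bitwise mutation).
    child_bits = [1 - bit if random.random() < rate / n else bit
                  for bit in parent_bits]
    return child_bits, rate
```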
Runtime on Smooth Fitness Landscape
We first consider the performance of the self-adaptive evolutionary algorithm on the smooth fitness landscape defined below. In this landscape, the fitness of a bitstring of length \(n\) is the number of leading 1-bits in the bitstring. For example, the bitstring 1100 has fitness 2 and the bitstring 1111 has fitness 4.
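As a concrete illustration, here is a short Python function for this fitness (often called LeadingOnes in the literature); the name `smooth_fitness` is just a label chosen for this page.

```python
def smooth_fitness(bits):
    """Return the number of leading 1-bits in the bitstring."""
    count = 0
    for bit in bits:
        if bit != 1:
            break
        count += 1
    return count

# The examples from the text:
assert smooth_fitness([1, 1, 0, 0]) == 2
assert smooth_fitness([1, 1, 1, 1]) == 4
```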
The table below summarises the results of our analysis on the smooth fitness landscape. Both the fixed low mutation rate and self-adaptation are efficient in this case.
| Control | Runtime | Comment |
| --- | --- | --- |
| Fixed \(\chi_{\text{low}}\) | \(\mathcal{O}(n^2)\) | Appropriate balance between mutation rate and selective pressure, leading to polynomial runtime. |
| Fixed \(\chi_{\text{high}}\) | \(e^{\Omega(n)}\) | Mutation rate is too high relative to the selective pressure. Runtime is exponential. |
| Uniform mixing | \(e^{\Omega(n)}\) | Effective mutation rate is too high relative to the selective pressure. Runtime is exponential. |
| Self-adaptation | \(\mathcal{O}(n^2)\) | Individuals with too high a mutation rate vanish from the population; the remaining individuals optimise the problem. Runtime is quadratic. |
Runtime on Peaked Fitness Landscape
The peaked fitness landscape is identical to the smooth landscape, except that the chromosome consisting of all 0-bits has fitness \(m\), where \(1 \lt m \lt n\). We call this all-0s bitstring the peak of the fitness landscape. All the individuals in the initial population are placed on the peak, and the population must escape the peak to find the optimal search point.
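A matching Python sketch of the peaked landscape, reusing `smooth_fitness` from above; the parameter `m` is the peak height, and the all-zeros check is the only difference from the smooth case.

```python
def peaked_fitness(bits, m):
    """Smooth fitness, except that the all-0s bitstring (the peak) has fitness m."""
    if all(bit == 0 for bit in bits):
        return m  # the peak, with 1 < m < n
    return smooth_fitness(bits)
```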
The theoretical results in the table below show that the self-adaptation mechanism has quadratic runtime, while the other mechanisms are exponential (inefficient). Interestingly, the algorithm is inefficient both for the fixed mutation rate \(\chi_{\text{low}}\) and for the fixed mutation rate \(\chi_{\text{high}}\).
| Control | Runtime | Comment |
| --- | --- | --- |
| Fixed \(\chi_{\text{low}}\) | \(e^{\Omega(n)}\) | Most individuals remain on the peak. The selective pressure among sub-optimal individuals is too low, leading to exponential runtime. |
| Fixed \(\chi_{\text{high}}\) | \(e^{\Omega(n)}\) | Most individuals fall off the peak, but the mutation rate is too high relative to the selective pressure to reach the optimum in polynomial time. |
| Uniform mixing | \(e^{\Omega(n)}\) | Most individuals fall off the peak, but the effective mutation rate is too high relative to the selective pressure, leading to exponential runtime. |
| Self-adaptation | \(\mathcal{O}(n^2)\) | Most individuals fall off the peak, and peak individuals do not take over the population. A sub-population surviving off the peak switches to the low mutation rate, ensuring polynomial runtime. |
Simulation
[Interactive simulation controls: Mutation rate, Fitness landscape, Plot]
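For readers without access to the interactive plot, the sketch below strings the earlier pieces together into a minimal simulation loop. Everything here is an assumption made for illustration: the initialisation on the peak with the low rate (matching the peaked scenario above), the termination test, and all parameter values in the usage example; it does not reproduce the exact setup of the paper.

```python
def simulate(n, pop_size, p_adapt, chi_low, chi_high, fitness,
             max_generations=100_000):
    """Run the self-adaptive EA until the all-1s optimum is found.

    Returns the number of generations used, or None if the budget runs out.
    Assumes make_offspring and peaked_fitness from the sketches above.
    """
    # Start with every individual on the peak, carrying the low mutation rate.
    population = [([0] * n, chi_low) for _ in range(pop_size)]
    for generation in range(max_generations):
        if any(fitness(bits) == n for bits, _ in population):
            return generation  # the all-1s string has fitness n
        # Non-elitist generational replacement: the old population dies.
        population = [make_offspring(population, fitness, p_adapt,
                                     chi_low, chi_high, n)
                      for _ in range(pop_size)]
    return None

# Example usage (hypothetical parameter values):
generations = simulate(n=50, pop_size=200, p_adapt=0.1,
                       chi_low=0.5, chi_high=2.0,
                       fitness=lambda bits: peaked_fitness(bits, m=25))
```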