Selection

Selection is a genetic operator used in evolutionary algorithms (EAs) to select potentially useful solutions from a population for later breeding. EAs are stochastic search methods that use concepts from Mendelian genetics and Darwinian evolution. According to Darwin's theory of evolution, the fittest individuals should survive and create new offspring. There are many methods for selecting the best individuals, for example roulette wheel selection, Boltzmann selection, tournament selection, rank selection, steady-state selection, and others.

Selection Interface

All selection algorithms have the following call interface: selection(fitness, N), where fitness is the vector of population fitness values, of size $M$, and $N$ is the number of individuals to select. The selection function returns a vector of integer indexes of the selected individuals, of size $N$, with indexes in the range $[1, M]$.

Note: Some of the selection algorithms are implemented as function closures in order to provide additional parameters beyond the selection interface specified above.

Genetic Algorithm

Evolutionary.ranklinear — Function
ranklinear(sp::Real)

Returns a rank-based fitness selection function (see Selection Interface) with the selective pressure value sp.

In rank-based fitness selection, the population is sorted according to the objective values. The fitness assigned to each individual depends only on its position in the resulting ranking and not on the actual objective value [1].

Let $M$ be the number of individuals in the population, $P$ the position of an individual in this population (the least fit individual has $P = 1$, the fittest $P = M$), and $SP$ the selective pressure. The fitness value of an individual is calculated as:

$Fitness(P) = 2 - SP + \frac{2(SP - 1)(P - 1)}{(M - 1)}$

Linear ranking allows values of selective pressure in [1.0, 2.0].
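As an illustration, here is a minimal Python sketch of linear rank selection following the formula above. The package itself is written in Julia; this sketch uses 0-based indices and `random.choices` for the final proportional draw, which are assumptions of the sketch, not the library's implementation.

```python
import random

def ranklinear(sp):
    """Return a selection function with selective pressure sp in [1.0, 2.0]."""
    assert 1.0 <= sp <= 2.0
    def select(fitness, n):
        m = len(fitness)
        # Sort indices so that position P = 1 is the least fit, P = m the fittest.
        order = sorted(range(m), key=lambda i: fitness[i])
        rank_fitness = [0.0] * m
        for p, i in enumerate(order, start=1):
            # Fitness(P) = 2 - SP + 2(SP - 1)(P - 1)/(M - 1)
            rank_fitness[i] = 2 - sp + 2 * (sp - 1) * (p - 1) / (m - 1)
        # Draw n individuals with probability proportional to rank-based fitness.
        return random.choices(range(m), weights=rank_fitness, k=n)
    return select
```

With `sp = 2.0` the least fit individual receives rank fitness 0 and is never selected; with `sp = 1.0` all individuals receive equal rank fitness and are equally likely.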

source
Evolutionary.uniformranking — Function
uniformranking(μ)

Returns a (μ, λ)-uniform ranking selection function (see Selection Interface) with the best-individuals parameter μ.

In uniform ranking, the best $\mu$ individuals are assigned a selection probability of $1/\mu$, while the rest of them are discarded [2].
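A minimal Python sketch of this rule (again an illustrative port with 0-based indices, assuming higher fitness is better; not the library's code):

```python
import random

def uniformranking(mu):
    """Return a (mu, lambda)-uniform ranking selection function."""
    def select(fitness, n):
        m = len(fitness)
        # Keep the mu best individuals (highest fitness); discard the rest.
        best = sorted(range(m), key=lambda i: fitness[i], reverse=True)[:mu]
        # Each of the mu survivors is drawn with equal probability 1/mu.
        return [random.choice(best) for _ in range(n)]
    return select
```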

source
Evolutionary.roulette — Function
roulette(fitness)

Roulette wheel (fitness proportionate, FPS) selection from a fitness collection.

In roulette selection, the fitness level is used to associate a probability of selection with each individual. If $f_i$ is the fitness of individual $i$ in the population, its probability of being selected is $p_i = \frac{f_i}{\sum_{j=1}^{M} f_j}$, where $M$ is the number of individuals in the population.

Note: Best used in maximization context.
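A Python sketch of the idea, written against the selection(fitness, N) interface described above (an illustrative port assuming non-negative fitness values, not the library's implementation):

```python
import random

def roulette(fitness, n):
    """Fitness-proportionate selection: individual i is drawn with
    probability f_i / sum_j(f_j); fitness values assumed non-negative."""
    return random.choices(range(len(fitness)), weights=fitness, k=n)
```

An individual with zero fitness is never selected, while one holding all the fitness mass is always selected.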

source
Evolutionary.rouletteinv — Function
rouletteinv(fitness)

Fitness proportionate selection (FPS), or roulette wheel, over inverse fitness values. Best used in a minimization context where fitness values approach 0.

source
Evolutionary.sus — Function
sus(fitness, N)

Stochastic universal sampling (SUS) provides zero bias and minimum spread [3]. SUS is a development of fitness proportionate selection (FPS). The individuals are mapped to contiguous segments of a line, such that each individual's segment is equal in size to its fitness, exactly as in roulette wheel selection. Then equally spaced pointers, as many as there are individuals to be selected, are placed over the line like the teeth of a comb. Starting from a single small random offset, SUS chooses all candidates in one pass, preventing the fittest members from saturating the candidate space.

Let $N$ be the number of individuals to be selected; then the distance between the pointers is $1/N$ and the position of the first pointer is given by a randomly generated number in the range $[0, 1/N]$.

Note: Best used in maximization context.
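The pointer walk can be sketched in Python as follows (an illustrative port working over unnormalized fitness, so the pointer spacing is total/N rather than 1/N; 0-based indices; not the library's implementation):

```python
import random

def sus(fitness, n):
    """Stochastic universal sampling over non-negative fitness values."""
    total = float(sum(fitness))
    step = total / n                    # distance between pointers
    start = random.uniform(0.0, step)   # first pointer in [0, total/N]
    selected, cumulative, i = [], 0.0, 0
    for k in range(n):
        pointer = start + k * step
        # Advance to the individual whose fitness segment contains the pointer.
        while i < len(fitness) - 1 and cumulative + fitness[i] <= pointer:
            cumulative += fitness[i]
            i += 1
        selected.append(i)
    return selected
```

Because the pointers are equally spaced, an individual holding half the total fitness, such as the last one in `[1, 1, 2]`, is selected exactly twice by `sus([1, 1, 2], 4)` regardless of the random offset; this is the "minimum spread" property.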

source

Differential Evolution

Evolutionary.random — Function
random(fitness, N)

Returns a collection of size N of uniformly selected individuals from the population.

source

References

1. Baker J.E., Adaptive selection methods for genetic algorithms. In Proceedings of the International Conference on Genetic Algorithms and Their Applications, pp. 100-111, 1985.
2. Schwefel H.-P., Evolution and Optimum Seeking. Wiley, New York, 1995.
3. Baker J.E., Reducing Bias and Inefficiency in the Selection Algorithm. In Proceedings of the Second International Conference on Genetic Algorithms, pp. 14-21, 1987.