The nonlinear constrained optimization interface in Evolutionary assumes that the user can write the optimization problem as follows:

\[\min_{x\in\mathbb{R}^n} f(x) \quad \text{such that}\\ l_x \leq \phantom{c(}x\phantom{)} \leq u_x \\ l_c \leq c(x) \leq u_c.\]

For equality constraints on $x_j$ or $c(x)_j$, set the corresponding bound entries to be equal, $l_j = u_j$. Likewise, setting $l_j=-\infty$ or $u_j=\infty$ means that the constraint is unbounded from below or above, respectively.
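
For instance, a minimal sketch of bound vectors (the names lx and ux are illustrative) that fix the first variable and leave the second unbounded from above:

lx = [1.0, 0.0]   # l_1 = u_1 = 1.0 fixes x[1] (equality constraint)
ux = [1.0, Inf]   # u_2 = Inf leaves x[2] unbounded from above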

The following examples show how to use constraints when optimizing the Booth function, which is defined as follows:

julia> f(x)=(x[1]+2x[2]-7)^2+(2x[1]+x[2]-5)^2 # Booth
f (generic function with 1 method)

The function is usually evaluated on the square $x_i ∈ [-10, 10]$, for all $i = 1, 2$. The global minimum of this function is located at $(1,3)$.
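
As a quick check, the function indeed vanishes at that point:

julia> f([1.0, 3.0]) # both squared terms vanish at the minimum
0.0

First, we optimize the Booth function without any constraints, using the genetic algorithm below, started from the origin: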

ga = GA(populationSize=100,selection=uniformranking(3),
        mutation=gaussian(),crossover=uniformbin())
x0 = [0., 0.]
results = Evolutionary.optimize(f, x0, ga)
 * Status: success

 * Candidate solution
    Minimizer:  [1.0538422477953597, 2.967214315422337]
    Minimum:    0.0057474041790464255
    Iterations: 15

 * Found with
    Algorithm: GA[P=100,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0287 (vs limit Inf)
    Iterations:    15
    f(x) calls:    1600
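
The full solution is printed above; if the Optim.jl-style result accessors are available in your version of the package (an assumption in this sketch), the minimizer and minimum can also be extracted programmatically:

xmin = Evolutionary.minimizer(results)   # best candidate found (assumed accessor)
fmin = Evolutionary.minimum(results)     # its fitness value (assumed accessor)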

Box Constrained Optimization

We want to optimize the Booth function in the box $0.5 \leq x_i \leq 2.0$, starting from the point $x_0=(1,1)$.

Reusing our Booth example from above, box-constrained minimization is performed by providing vectors of lower and upper bounds as follows:

lower = [0.5, 0.5]
upper = [2.0, 2.0]
x0 = [1., 1.]
results = Evolutionary.optimize(f, BoxConstraints(lower, upper), x0, ga)
 * Status: success

 * Candidate solution
    Minimizer:  [1.8418533959148184, 2.0]
    Minimum:    1.8087585337480128
    Iterations: 15

 * Found with
    Algorithm: GA[P=100,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0004 (vs limit Inf)
    Iterations:    15
    f(x) calls:    1600
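
The reported minimizer lies on the boundary of the box; a quick element-wise check (reusing the accessor assumed above) confirms that the solution respects the bounds:

xmin = Evolutionary.minimizer(results)   # assumed accessor, see above
all(lower .<= xmin .<= upper)            # true when the solution stays inside the box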

The box constraints can also be defined using a BoxConstraints object.

Evolutionary.BoxConstraints (Type)

This type encodes box constraints for the optimization function parameters.

The constructor takes the following arguments:

  • lower is the vector of value lower bounds
  • upper is the vector of value upper bounds

Note: Sizes of the lower and upper bound vectors must be equal.

cnst = BoxConstraints([0.5, 0.5], [2.0, 2.0])
Box Constraints:
    x[1]≥0.5, x[1]≤2.0, x[2]≥0.5, x[2]≤2.0

This object can be passed as a parameter to the optimization call, Evolutionary.optimize:

results = Evolutionary.optimize(f, cnst, x0, ga) # or Evolutionary.optimize(f, cnst, ga)
 * Status: success

 * Candidate solution
    Minimizer:  [1.7891866151797442, 2.0]
    Minimum:    1.8005846464563546
    Iterations: 26

 * Found with
    Algorithm: GA[P=100,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0007 (vs limit Inf)
    Iterations:    26
    f(x) calls:    2700

Penalty Constrained Optimization

For penalty constrained optimization, any variable bounds and linear/nonlinear constraints are transformed into a penalty added to the minimized fitness function. To provide linear/nonlinear constraints to an optimization problem, you can use the following penalty constraint algorithm:

Evolutionary.PenaltyConstraints (Type)

This type encodes constraints as the following additional penalty for an objective function:

$p(x) = \sum^n_{i=1} r_i \max(0, g_i(x))^2$

where $r_i$ is a penalty value for the $i$-th constraint, and $g_i(x)$ is an inequality constraint. The equality constraints $h_i(x) = 0$ are transformed to inequality constraints as follows:

$|h(x)| - \epsilon \leq 0$

The constructor takes the following arguments:

  • penalties: a vector of penalty values per constraint (optional)
  • lx: a vector of value lower bounds
  • ux: a vector of value upper bounds
  • lc: a vector of constraint function lower bounds
  • uc: a vector of constraint function upper bounds
  • c: a constraint function which returns the constraint values

We want to minimize the function $f(x,y) = 3x+9y$ subject to the constraints $\sqrt{xy} \geq 100$ and $x,y \geq 0$. The minimum of this function is near $(173.41, 57.8)$. We begin by defining the constraints as follows:

# x, y ≥ 0
lx = [0.0, 0.0] # lower bound for values
ux = [Inf, Inf] # upper bound for values
# √xy ≥ 100
c(x) = [ prod(map(e->sqrt(e>0 ? e : 0.0), x)) ] # negative values are zeroed
lc   = [100.0] # lower bound for constraint function
uc   = [ Inf ]   # upper bound for constraint function
con = PenaltyConstraints(100.0, lx, ux, lc, uc, c) # first parameter is a penalty value
Penalty Constraints:
  Penalties: [100.0, 100.0, 100.0]
ConstraintBounds:
  Variables:
    x[1]≥0.0, x[2]≥0.0
  Linear/nonlinear constraints:
    c_1≥100.0
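
To make the penalty concrete, here is a hand-worked sketch of its value at the starting point x0 = [1., 1.], assuming the formula from the docstring above (the library's internal bookkeeping may differ in detail):

x = [1.0, 1.0]
viol = max(0.0, lc[1] - c(x)[1])  # √(x₁x₂) ≥ 100 is violated by 99 at this point
100.0 * viol^2                    # 980100.0 added to the fitness; the x ≥ 0 bounds contribute 0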

Now, we define the fitness function, an initial individual, and the algorithm parameters; then we perform the minimization as follows:

f(x) = 3x[1]+9x[2] # fitness function
x0 = [1., 1.] # individual
ga = GA(populationSize=100,selection=tournament(7),
        mutation=gaussian(),crossover=intermediate(2))
results = Evolutionary.optimize(f, con, x0, ga)
 * Status: success

 * Candidate solution
    Minimizer:  [173.47985008333677, 57.587233272031355]
    Minimum:    1038.963662904815
    Iterations: 44

 * Found with
    Algorithm: GA[P=100,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0033 (vs limit Inf)
    Iterations:    44
    f(x) calls:    4500
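
After the run, the constraint value at the solution can be inspected, and feasibility checked with the isfeasible helper documented in the Auxiliary Functions section below (the minimizer accessor is the same assumption as before):

xmin = Evolutionary.minimizer(results)  # assumed accessor, see above
c(xmin)                                 # how close √(x₁x₂) is to the 100.0 bound
Evolutionary.isfeasible(con, xmin)      # note: penalty methods may stop at a slightly infeasible point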

We can use the worst-fitness constraint algorithm, which does not require specifying a constraint penalty value:

Evolutionary.WorstFitnessConstraints (Type)

This type encodes constraints as the following additional penalty for an objective function:

$p(x) = f_{worst} + \sum^n_{i=1} |g_i(x)|$

if $x$ is not feasible, otherwise no penalty is applied.

The constructor takes the following arguments:

  • lx: a vector of value lower bounds
  • ux: a vector of value upper bounds
  • lc: a vector of constraint function lower bounds
  • uc: a vector of constraint function upper bounds
  • c: a constraint function which returns the constraint values
con = WorstFitnessConstraints(lx, ux, lc, uc, c)
Worst Fitness ConstraintBounds:
  Variables:
    x[1]≥0.0, x[2]≥0.0
  Linear/nonlinear constraints:
    c_1≥100.0
results = Evolutionary.optimize(f, con, x0, ga)
 * Status: success

 * Candidate solution
    Minimizer:  [160.51280295035465, 62.301818769745985]
    Minimum:    1042.254777778778
    Iterations: 62

 * Found with
    Algorithm: GA[P=100,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0043 (vs limit Inf)
    Iterations:    62
    f(x) calls:    6300

Auxiliary Functions

NLSolversBase.value (Method)
value(c::AbstractConstraints, x)

Return the values of the constraints for a variable x, given the set of constraints in the object c.

Evolutionary.isfeasible (Function)
isfeasible(bounds::ConstraintBounds, x) -> Bool

Return true if point x is feasible, given the bounds object with bounds lx, ux, lc, and uc. x is feasible if

lx[i] <= x[i] <= ux[i]
lc[i] <= c(x)[i] <= uc[i]

for all possible i.

isfeasible(c::AbstractConstraints, x) -> Bool

Return true if point x is feasible, given the constraints object c.

Evolutionary.penalty (Function)
penalty(constraints, x)

Calculate a penalty for the variable x given the set of constraints.

Evolutionary.penalty! (Function)
penalty!(fitness, constraints, population)

Set a penalty to the fitness given the set of constraints and population.

Evolutionary.apply! (Function)
apply!(c::AbstractConstraints, x)

Apply constraints c to a variable x, and return the variable.

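A brief usage sketch of these helpers, rebuilding the PenaltyConstraints object from the penalty example above (it assumes the functions are reachable under the Evolutionary module as documented; the name pcon is illustrative and outputs are omitted):

pcon = PenaltyConstraints(100.0, lx, ux, lc, uc, c)  # same constraints as in the penalty example
Evolutionary.value(pcon, x0)       # constraint values c(x0) at x0 = [1., 1.]
Evolutionary.isfeasible(pcon, x0)  # false here, since √(x₁x₂) = 1 < 100 at x0
Evolutionary.penalty(pcon, x0)     # penalty added to the fitness at x0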