The nonlinear constrained optimization interface in Evolutionary assumes that the user can write the optimization problem as follows:

\[\min_{x\in\mathbb{R}^n} f(x) \quad \text{such that}\\ l_x \leq \phantom{c(}x\phantom{)} \leq u_x \\ l_c \leq c(x) \leq u_c.\]

For equality constraints on $x_j$ or $c(x)_j$, set those particular entries of the bounds to be equal, $l_j = u_j$. Likewise, setting $l_j = -\infty$ or $u_j = \infty$ means that the constraint is unbounded from below or above, respectively.
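
For example, here is a minimal sketch (the variable names are illustrative) of bound vectors that fix the first variable to an exact value while leaving the second one unbounded from above:

lx = [2.0, 0.0]  # lower bounds
ux = [2.0, Inf]  # upper bounds: lx[1] == ux[1] enforces x[1] == 2.0; x[2] has no upper bound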

The following examples show how to use constraints to optimize the Booth function, which is defined as follows:

julia> f(x)=(x[1]+2x[2]-7)^2+(2x[1]+x[2]-5)^2 # Booth
f (generic function with 1 method)

The function is usually evaluated on the square $x_i ∈ [-10, 10]$ for all $i = 1, 2$. The global minimum of this function is located at $(1,3)$.
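
A quick check confirms the minimum, since both squared terms vanish at that point:

julia> f([1., 3.]) # (1 + 6 - 7)^2 + (2 + 3 - 5)^2
0.0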

ga = GA(populationSize=100,selection=uniformranking(3),
        mutation=gaussian(),crossover=uniformbin())
x0 = [0., 0.]
results = Evolutionary.optimize(f, x0, ga)
 * Status: success

 * Candidate solution
    Minimizer:  [0.7925695998298106, 3.180850457362758]
    Minimum:    0.0785602322816691
    Iterations: 24

 * Found with
    Algorithm: GA[P=100,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0223 (vs limit Inf)
    Iterations:    24
    f(x) calls:    2500
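
If needed, the reported solution can be extracted from the results object programmatically; a minimal sketch, assuming the Optim-style accessors Evolutionary.minimizer and Evolutionary.minimum:

Evolutionary.minimizer(results) # best candidate found, e.g. [0.79..., 3.18...] above
Evolutionary.minimum(results)   # its fitness value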

Box Constrained Optimization

We want to optimize the Booth function in the box $0.5 \leq x_i \leq 2.0$, starting from the point $x_0=(1,1)$.

Reusing the Booth example from above, box-constrained minimization is performed by providing vectors of lower and upper bounds as follows:

lower = [0.5, 0.5]
upper = [2.0, 2.0]
x0 = [1., 1.]
results = Evolutionary.optimize(f, BoxConstraints(lower, upper), x0, ga)
 * Status: success

 * Candidate solution
    Minimizer:  [1.8129031194565053, 2.0]
    Minimum:    1.8008324524585442
    Iterations: 21

 * Found with
    Algorithm: GA[P=100,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0004 (vs limit Inf)
    Iterations:    21
    f(x) calls:    2200

The box constraints can also be defined using a BoxConstraints object.

Evolutionary.BoxConstraints - Type

This type encodes box constraints for the optimization function parameters.

The constructor takes the following arguments:

  • lower is the vector of value lower bounds
  • upper is the vector of value upper bounds

Note: Sizes of the lower and upper bound vectors must be equal.

cnst = BoxConstraints([0.5, 0.5], [2.0, 2.0])
Box Constraints:
    x[1]≥0.5, x[1]≤2.0, x[2]≥0.5, x[2]≤2.0

This object can be passed as a parameter to the optimization call, Evolutionary.optimize:

results = Evolutionary.optimize(f, cnst, x0, ga) # or Evolutionary.optimize(f, cnst, ga)
 * Status: success

 * Candidate solution
    Minimizer:  [1.8233787092919858, 2.0]
    Minimum:    1.802732820240796
    Iterations: 26

 * Found with
    Algorithm: GA[P=100,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0005 (vs limit Inf)
    Iterations:    26
    f(x) calls:    2700

Penalty Constrained Optimization

For penalty constrained optimization, any variable bounds and linear/nonlinear constraints are transformed into a penalty that is added to the minimized fitness function. In order to provide linear/nonlinear constraints to an optimization problem, you can use the following penalty constraint algorithm:

Evolutionary.PenaltyConstraints - Type

This type encodes constraints as the following additional penalty for an objective function:

$p(x) = \sum^n_{i=1} r_i \max(0, g_i(x))^2$

where $r_i$ is a penalty value for the $i$-th constraint, and $g_i(x)$ is an inequality constraint. The equality constraints $h_i(x) = 0$ are transformed to inequality constraints as follows:

$|h_i(x)| - \epsilon \leq 0$

The constructor takes the following arguments:

  • penalties: a vector of penalty values per constraint (optional)
  • lx: a vector of value lower bounds
  • ux: a vector of value upper bounds
  • lc: a vector of constraint function lower bounds
  • uc: a vector of constraint function upper bounds
  • c: a constraint function which returns the constraint values

We want to minimize the function $f(x,y) = 3x + 9y$ subject to the constraints $\sqrt{xy} \geq 100$ and $x, y \geq 0$. The minimum of this function is near $(173.41, 57.8)$. We begin by defining the constraints as follows:

# x, y ≥ 0
lx = [0.0, 0.0] # lower bound for values
ux = [Inf, Inf] # upper bound for values
# √xy ≥ 100
c(x) = [ prod(map(e->sqrt(e>0 ? e : 0.0), x)) ] # negative values are zeroed
lc   = [100.0] # lower bound for constraint function
uc   = [ Inf ]   # upper bound for constraint function
con = PenaltyConstraints(100.0, lx, ux, lc, uc, c) # first parameter is a penalty value
Penalty Constraints:
  Penalties: [100.0, 100.0, 100.0]
ConstraintBounds:
  Variables:
    x[1]≥0.0, x[2]≥0.0
  Linear/nonlinear constraints:
    c_1≥100.0
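
To see how this penalty acts, here is a hand computation of $p(x)$ for an infeasible point, following the PenaltyConstraints formula above (rewriting the bound $c(x) \geq 100$ into the form $g(x) \leq 0$ is spelled out as an assumption):

x_bad = [10.0, 10.0]     # √(10*10) = 10, violates √xy ≥ 100
g = lc[1] - c(x_bad)[1]  # c(x) ≥ 100 rewritten as g(x) = 100 - c(x) ≤ 0
100.0 * max(0.0, g)^2    # r*max(0, g)^2 = 100 * 90^2 = 810000.0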

Now, we define the fitness function, an initial individual structure, and algorithm parameters; then we perform minimization as follows:

f(x) = 3x[1]+9x[2] # fitness function
x0 = [1., 1.] # individual
ga = GA(populationSize=100,selection=tournament(7),
        mutation=gaussian(),crossover=intermediate(2))
results = Evolutionary.optimize(f, con, x0, ga)
 * Status: success

 * Candidate solution
    Minimizer:  [172.9716290671532, 57.752917943828706]
    Minimum:    1038.9608417924103
    Iterations: 97

 * Found with
    Algorithm: GA[P=100,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0058 (vs limit Inf)
    Iterations:    97
    f(x) calls:    9800
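
It is worth checking how well the returned minimizer satisfies the nonlinear constraint, since a penalty method may leave a small violation; a sketch, assuming the Evolutionary.minimizer accessor:

xmin = Evolutionary.minimizer(results)
sqrt(prod(xmin)) # ≈ 100 for the run above; a slight violation of √xy ≥ 100 is possible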

We can use the worst fitness constraint algorithm, which doesn't require specifying a constraint penalty value:

Evolutionary.WorstFitnessConstraints - Type

This type encodes constraints as the following additional penalty for an objective function:

$p(x) = f_{worst} + \sum^n_{i=1} |g_i(x)|$

if $x$ is not feasible, otherwise no penalty is applied.

The constructor takes the following arguments:

  • lx: a vector of value lower bounds
  • ux: a vector of value upper bounds
  • lc: a vector of constraint function lower bounds
  • uc: a vector of constraint function upper bounds
  • c: a constraint function which returns the constraint values
con = WorstFitnessConstraints(lx, ux, lc, uc, c)
Worst Fitness ConstraintBounds:
  Variables:
    x[1]≥0.0, x[2]≥0.0
  Linear/nonlinear constraints:
    c_1≥100.0
results = Evolutionary.optimize(f, con, x0, ga)
 * Status: success

 * Candidate solution
    Minimizer:  [168.10650008313067, 59.48609998675682]
    Minimum:    1039.6944001302036
    Iterations: 57

 * Found with
    Algorithm: GA[P=100,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0131 (vs limit Inf)
    Iterations:    57
    f(x) calls:    5800

Auxiliary Functions

NLSolversBase.value - Method
value(c::AbstractConstraints, x)

Return the values of the constraints for a variable x, given the set of constraints in the object c.

Evolutionary.isfeasible - Function
isfeasible(bounds::ConstraintBounds, x) -> Bool

Return true if point x is feasible, given the bounds object with bounds lx, ux, lc, and uc. x is feasible if

lx[i] <= x[i] <= ux[i]
lc[i] <= c(x)[i] <= uc[i]

for all possible i.

isfeasible(c::AbstractConstraints, x) -> Bool

Return true if point x is feasible, given the constraints object c.

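For example, the constraints object con from the examples above can be queried directly (a sketch; the test points are illustrative):

Evolutionary.isfeasible(con, [173.41, 57.8]) # √(173.41*57.8) ≈ 100.1 ≥ 100, expect true
Evolutionary.isfeasible(con, [1.0, 1.0])     # √1 = 1 < 100, expect false
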
Evolutionary.penalty - Function
penalty(constraints, x)

Calculate a penalty for the variable x given the set of constraints.

Evolutionary.penalty! - Function
penalty!(fitness, constraints, population)

Set a penalty on the fitness values, given the set of constraints and the population.

Evolutionary.apply! - Function
apply!(c::AbstractConstraints, x)

Apply the constraints c to a variable x, and return the variable.

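A usage sketch with the box constraints object cnst from above (only the documented in-place contract is assumed; the exact effect on an out-of-bounds point is left to the library):

x = [3.0, 0.1]               # outside the box 0.5 ≤ x_i ≤ 2.0
Evolutionary.apply!(cnst, x) # constraints are applied to x in place and x is returned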