Introduction to Optimization
Thermal generating units
- [Figure: boiler (B), turbine (T), generator (G); fuel input measured in J/h, electric power output measured in MW]
- Input/Output curve: fuel input vs. electric power output
- Fuel consumption is measured by its energy content
- Upper and lower limits on the output of the generating unit: $P_{min}$ and $P_{max}$

Cost Curve
- [Figure: running cost in $/h vs. output in MW, between $P_{min}$ and $P_{max}$; the cost-axis intercept is the no-load cost]

Economic Dispatch
- Consider the running costs only
- Several generating units serving the load
- What share of the load should each generating unit produce?
- Consider the limits of the generating units
- Ignore the limits of the network
Incremental Cost Curve
- The incremental cost curve is the derivative of the cost curve: $\Delta F / \Delta P$
- Expressed in $/MWh
- It gives the cost of the next MWh
- [Figure: fuel cost vs. power (MW), and the corresponding incremental cost ($/MWh) vs. power (MW)]

Objective
- Most engineering activities have an objective:
  - Achieve the best possible design
  - Achieve the most economical operating conditions
- This objective is usually quantifiable
- Examples:
  - minimize the cost of building a transformer
  - minimize the cost of supplying power
  - minimize the losses in a power system
  - maximize the profit from a bidding strategy
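As a small illustration (the coefficients below are made up, not taken from the slides), a quadratic cost curve and its incremental cost can be evaluated directly:

```python
# Hypothetical quadratic cost curve for one thermal unit (coefficients are made up).
# C(P) = a + b*P + c*P**2 in $/h, valid for Pmin <= P <= Pmax.
a, b, c = 500.0, 10.0, 0.05      # $/h, $/MWh, $/(MW^2 h)
p_min, p_max = 100.0, 500.0      # MW

def cost(p_mw):
    """Hourly running cost in $/h."""
    return a + b * p_mw + c * p_mw ** 2

def incremental_cost(p_mw):
    """Derivative of the cost curve, dC/dP, in $/MWh (cost of the next MWh)."""
    return b + 2 * c * p_mw

for p in (p_min, 300.0, p_max):
    print(f"P = {p:6.1f} MW   cost = {cost(p):8.1f} $/h   "
          f"incremental cost = {incremental_cost(p):5.1f} $/MWh")
```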
Optimization Problem
- What value should the decision variables take so that
  $$F = f(x_1, x_2, x_3, \ldots, x_n)$$
  is minimum or maximum?

Minimization and Maximization
- [Figure: $f(x)$ reaches its maximum $f(x^*)$ at $x^*$, while $-f(x)$ reaches its minimum $-f(x^*)$ at the same $x^*$: maximizing $f(x)$ is equivalent to minimizing $-f(x)$]
Necessary Condition for Optimality
- If $x = x^*$ maximises $f(x)$, then:
  - $f(x) \le f(x^*)$ for $x \le x^*$ $\Rightarrow$ $\dfrac{df}{dx} \ge 0$ for $x \le x^*$
  - $f(x) \le f(x^*)$ for $x \ge x^*$ $\Rightarrow$ $\dfrac{df}{dx} \le 0$ for $x \ge x^*$
- Hence, at the maximum, $\dfrac{df}{dx} = 0$
- [Figure: $f(x)$ with its maximum $f(x^*)$ at $x^*$; the slope changes sign at $x^*$]

Example
- For what values of $x$ is $\dfrac{df}{dx} = 0$?
- In other words, for what values of $x$ is the necessary condition for optimality satisfied?
- [Figure: $f(x)$ with stationary points A, B, C and D along the $x$ axis]
How can we distinguish minima and maxima?
- [Figure: $f(x)$ with stationary points A, B, C and D]
- For $x = A$ and $x = D$, we have $\dfrac{d^2 f}{dx^2} < 0$: the objective function is concave around a maximum
- For $x = B$, we have $\dfrac{d^2 f}{dx^2} > 0$: the objective function is convex around a minimum
- For $x = C$, we have $\dfrac{d^2 f}{dx^2} = 0$: the objective function is flat around an inflexion point

Necessary and Sufficient Conditions of Optimality
- Necessary condition: $\dfrac{df}{dx} = 0$
- Sufficient condition:
  - for a maximum: $\dfrac{d^2 f}{dx^2} < 0$
  - for a minimum: $\dfrac{d^2 f}{dx^2} > 0$
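A minimal sketch of these conditions (the cubic objective below is made up, not from the slides): find the stationary points and classify each one with the second-derivative test.

```python
# A made-up one-dimensional objective: find its stationary points (df/dx = 0)
# and classify each one from the sign of the second derivative.
import sympy as sp

x = sp.symbols("x", real=True)
f = x**3 - 3*x                       # hypothetical objective function

df = sp.diff(f, x)                   # necessary condition: df/dx = 0
d2f = sp.diff(f, x, 2)               # sufficient condition: sign of d2f/dx2

for x_star in sp.solve(sp.Eq(df, 0), x):
    curvature = d2f.subs(x, x_star)
    if curvature > 0:
        kind = "minimum (convex)"
    elif curvature < 0:
        kind = "maximum (concave)"
    else:
        kind = "inflexion point (flat)"
    print(f"x* = {x_star}: d2f/dx2 = {curvature} -> {kind}")
```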
Isn’t all this obvious?
- [Figure]

Example
- [Figure]
Feasible Set
- [Figure: $f(x)$ over the feasible range $x_{MIN} \le x \le x_{MAX}$, with interior stationary points A, B, D and E]
- The values of the objective function outside the feasible set do not matter
- A and D are interior maxima
- B and E are interior minima
- $x_{MIN}$ is a boundary minimum
- $x_{MAX}$ is a boundary maximum
- The boundary extrema do not satisfy the optimality conditions!

Two-Dimensional Case
- [Figure: surface $f(x_1, x_2)$ with its minimum at $(x_1^*, x_2^*)$]
- $f(x_1, x_2)$ is minimum for $x_1^*, x_2^*$
- At that point:
  $$\left.\frac{\partial f(x_1, x_2)}{\partial x_1}\right|_{x_1^*, x_2^*} = 0 \qquad \left.\frac{\partial f(x_1, x_2)}{\partial x_2}\right|_{x_1^*, x_2^*} = 0$$
Multi-Dimensional Case
- At a maximum or minimum value of $f(x_1, x_2, x_3, \ldots, x_n)$ we must have:
  $$\frac{\partial f}{\partial x_1} = 0 \qquad \frac{\partial f}{\partial x_2} = 0 \qquad \cdots \qquad \frac{\partial f}{\partial x_n} = 0$$

Sufficient Conditions for Optimality
- [Figure: surface $f(x_1, x_2)$ with a saddle point, where the first-order conditions hold but the point is neither a minimum nor a maximum]
- Hessian matrix of second derivatives:
  $$\begin{bmatrix}
  \dfrac{\partial^2 f}{\partial x_1^2} & \dfrac{\partial^2 f}{\partial x_1 \partial x_2} & \cdots & \dfrac{\partial^2 f}{\partial x_1 \partial x_n} \\
  \dfrac{\partial^2 f}{\partial x_2 \partial x_1} & \dfrac{\partial^2 f}{\partial x_2^2} & \cdots & \dfrac{\partial^2 f}{\partial x_2 \partial x_n} \\
  \vdots & \vdots & \ddots & \vdots \\
  \dfrac{\partial^2 f}{\partial x_n \partial x_1} & \dfrac{\partial^2 f}{\partial x_n \partial x_2} & \cdots & \dfrac{\partial^2 f}{\partial x_n^2}
  \end{bmatrix}$$
Sufficient Conditions for Optimality
- Calculate the eigenvalues of the Hessian matrix at the stationary point
- If all the eigenvalues are greater than or equal to zero:
  - the matrix is positive semi-definite
  - the stationary point is a minimum
- If all the eigenvalues are less than or equal to zero:
  - the matrix is negative semi-definite
  - the stationary point is a maximum
- If some of the eigenvalues are positive and others are negative:
  - the stationary point is a saddle point

Contours
- A contour is the locus of all the points that give the same value to the objective function
- [Figure: closed contours in the $(x_1, x_2)$ plane around a minimum or maximum]
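A short sketch of this test (the two-variable objective below is made up for illustration): locate the stationary point, evaluate the Hessian there, and inspect the signs of its eigenvalues.

```python
# A made-up two-variable objective: find its stationary point, evaluate the
# Hessian matrix there, and classify the point from the eigenvalue signs.
import sympy as sp

x1, x2 = sp.symbols("x1 x2", real=True)
f = x1**2 + 2*x2**2 + x1*x2 - x1          # hypothetical objective function

gradient = [sp.diff(f, v) for v in (x1, x2)]
stationary = sp.solve(gradient, [x1, x2], dict=True)[0]   # first-order conditions

H = sp.hessian(f, (x1, x2)).subs(stationary)   # Hessian at the stationary point
eigenvalues = [float(e) for e in H.eigenvals()]

if all(e >= 0 for e in eigenvalues):
    kind = "minimum (positive semi-definite Hessian)"
elif all(e <= 0 for e in eigenvalues):
    kind = "maximum (negative semi-definite Hessian)"
else:
    kind = "saddle point (eigenvalues of both signs)"

print("stationary point:", stationary)
print("eigenvalues:", eigenvalues, "->", kind)
```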
Contours
- [Figure: surface $f(x_1, x_2)$ cut at two levels $F_1$ and $F_2$; the corresponding contours are projected onto the $(x_1, x_2)$ plane]

Example 1
- Minimise $C = x_1^2 + 4x_2^2 - 2x_1 x_2$
- Necessary conditions for optimality:
  $$\frac{\partial C}{\partial x_1} = 2x_1 - 2x_2 = 0 \qquad \frac{\partial C}{\partial x_2} = 8x_2 - 2x_1 = 0$$
- $x_1 = 0,\ x_2 = 0$ is a stationary point
- Sufficient conditions for optimality:
  $$\begin{bmatrix} \dfrac{\partial^2 C}{\partial x_1^2} & \dfrac{\partial^2 C}{\partial x_1 \partial x_2} \\ \dfrac{\partial^2 C}{\partial x_2 \partial x_1} & \dfrac{\partial^2 C}{\partial x_2^2} \end{bmatrix} = \begin{bmatrix} 2 & -2 \\ -2 & 8 \end{bmatrix}$$
  $$\begin{vmatrix} 2-\lambda & -2 \\ -2 & 8-\lambda \end{vmatrix} = 0 \;\Rightarrow\; \lambda^2 - 10\lambda + 12 = 0 \;\Rightarrow\; \lambda = \frac{10 \pm \sqrt{52}}{2} > 0$$
- Both eigenvalues are positive: the stationary point is a minimum
- [Figure: contours $C = 1$, $C = 4$, $C = 9$ around the minimum $C = 0$ at the origin]

Example 2
- Minimise $C = x_1^2 - 3x_2^2 - 2x_1 x_2$
- Sufficient conditions for optimality:
  $$\begin{bmatrix} \dfrac{\partial^2 C}{\partial x_1^2} & \dfrac{\partial^2 C}{\partial x_1 \partial x_2} \\ \dfrac{\partial^2 C}{\partial x_2 \partial x_1} & \dfrac{\partial^2 C}{\partial x_2^2} \end{bmatrix} = \begin{bmatrix} 2 & -2 \\ -2 & -6 \end{bmatrix}$$
  $$\begin{vmatrix} 2-\lambda & -2 \\ -2 & -6-\lambda \end{vmatrix} = 0 \;\Rightarrow\; \lambda^2 + 4\lambda - 16 = 0 \;\Rightarrow\; \lambda = \frac{-4 + \sqrt{80}}{2} > 0 \;\text{ or }\; \lambda = \frac{-4 - \sqrt{80}}{2} < 0$$
- One eigenvalue is positive and the other is negative: the stationary point at $x_1 = 0,\ x_2 = 0$ is a saddle point
- [Figure: contours $C = 0$, $C = 1$, $C = 4$, $C = 9$ around the saddle point at the origin]
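The two examples can be checked numerically; the sketch below builds each Hessian as reconstructed above and inspects its eigenvalues.

```python
# Numerical check of Examples 1 and 2: eigenvalues of each Hessian at the origin.
import numpy as np

hessians = {
    "Example 1: C = x1^2 + 4*x2^2 - 2*x1*x2": np.array([[2.0, -2.0],
                                                        [-2.0, 8.0]]),
    "Example 2: C = x1^2 - 3*x2^2 - 2*x1*x2": np.array([[2.0, -2.0],
                                                        [-2.0, -6.0]]),
}

for name, H in hessians.items():
    eigs = np.linalg.eigvalsh(H)          # eigenvalues of a symmetric matrix
    if np.all(eigs >= 0):
        kind = "minimum"
    elif np.all(eigs <= 0):
        kind = "maximum"
    else:
        kind = "saddle point"
    print(f"{name}: eigenvalues = {np.round(eigs, 3)} -> {kind}")
```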
Constrained Optimization
- Minimise $f(x_1, x_2, \ldots, x_n)$ (objective function)
- subject to:
  $$\omega_1(x_1, x_2, \ldots, x_n) = 0 \quad \cdots \quad \omega_m(x_1, x_2, \ldots, x_n) = 0 \quad \text{(equality constraints)}$$

Number of Constraints
- $N$ decision variables
- $M$ equality constraints
- If $M > N$, the problem is over-constrained: there is usually no solution

Example
- Minimise $f(x_1, x_2) = 0.25 x_1^2 + x_2^2$
- subject to $\omega(x_1, x_2) = 5 - x_1 - x_2 = 0$
- [Figure: contours of $f(x_1, x_2)$ in the $(x_1, x_2)$ plane; the constraint is the straight line $x_1 + x_2 = 5$, and the constrained minimum is the point where this line touches the lowest attainable contour]
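As a quick numerical cross-check (scipy is an assumption here, not part of the slides), a general-purpose solver applied to this example should land on the constrained minimum:

```python
# Solve the example with a general-purpose numerical solver.
import numpy as np
from scipy.optimize import minimize

f = lambda x: 0.25 * x[0]**2 + x[1]**2
constraint = {"type": "eq", "fun": lambda x: 5.0 - x[0] - x[1]}

result = minimize(f, x0=np.array([1.0, 1.0]), constraints=[constraint])
print("x* =", np.round(result.x, 3), " f(x*) =", round(float(result.fun), 3))
# Expected: x* close to [4, 1], f(x*) = 5 (the Lagrange-multiplier solution derived later).
```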
Example 2: Economic Dispatch
- [Figure: two generating units G1 and G2, with outputs $x_1$ and $x_2$, supplying a load $L$]
- $C_1 = a_1 + b_1 x_1^2$: cost of running unit 1
- $C_2 = a_2 + b_2 x_2^2$: cost of running unit 2
- Minimise $C = a_1 + a_2 + b_1 x_1^2 + b_2 x_2^2$
- Subject to: $x_1 + x_2 = L$

Solution by substitution
- Difficult
- Usually impossible when constraints are non-linear
- Provides little or no insight into the solution
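For this particular problem the constraint is linear, so the substitution $x_2 = L - x_1$ can be carried through by hand; a small sketch with made-up coefficients:

```python
# Solve the two-unit dispatch by substitution: set x2 = L - x1 in
# C = a1 + a2 + b1*x1^2 + b2*x2^2, then set dC/dx1 = 0.
# Coefficients and load are made up for illustration.
a1, b1 = 100.0, 0.010   # unit 1: $/h, $/(MW^2 h)
a2, b2 = 120.0, 0.015   # unit 2: $/h, $/(MW^2 h)
L = 500.0               # load in MW

# dC/dx1 = 2*b1*x1 - 2*b2*(L - x1) = 0  =>  x1 = b2*L / (b1 + b2)
x1 = b2 * L / (b1 + b2)
x2 = L - x1
cost = a1 + a2 + b1 * x1**2 + b2 * x2**2

print(f"x1 = {x1:.1f} MW, x2 = {x2:.1f} MW, total cost = {cost:.1f} $/h")
```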
Properties of the Gradient
- Each component of the gradient vector indicates the rate of change of the function in that direction
- The gradient indicates the direction in which a function of several variables increases most rapidly
- The magnitude and direction of the gradient usually depend on the point considered
- At each point, the gradient is perpendicular to the contour of the function

Example 4
- $f(x, y) = a x^2 + b y^2$
- $$\nabla f = \begin{pmatrix} \dfrac{\partial f}{\partial x} \\ \dfrac{\partial f}{\partial y} \end{pmatrix} = \begin{pmatrix} 2ax \\ 2by \end{pmatrix}$$
- [Figure: elliptical contours $f = f_1 < f_2 < f_3$; at each point the gradient is perpendicular to the contour and points towards increasing values of $f$]
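A small numeric check of the last property, using the example $f(x, y) = a x^2 + b y^2$ with made-up values of $a$ and $b$: along a contour, the tangent direction is orthogonal to the gradient.

```python
# The gradient of f(x, y) = a*x^2 + b*y^2 is perpendicular to the contour f = c.
import numpy as np

a, b = 1.0, 4.0
c = 8.0                               # contour level f(x, y) = c

def grad_f(x, y):
    return np.array([2 * a * x, 2 * b * y])

# Parametrize the elliptical contour a*x^2 + b*y^2 = c and its tangent vector.
t = 0.7                               # arbitrary point on the contour
x, y = np.sqrt(c / a) * np.cos(t), np.sqrt(c / b) * np.sin(t)
tangent = np.array([-np.sqrt(c / a) * np.sin(t), np.sqrt(c / b) * np.cos(t)])

print("f(x, y) =", a * x**2 + b * y**2)                          # equals c
print("grad . tangent =", float(np.dot(grad_f(x, y), tangent)))  # ~0: perpendicular
```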
Lagrange multipliers
- [Figure: contours $f(x_1, x_2) = 0.25 x_1^2 + x_2^2 = 5$ and $f(x_1, x_2) = 6$, the constraint $\omega(x_1, x_2) = 5 - x_1 - x_2 = 0$, and trial points A, B, C, D on the constraint line, with the gradient $\nabla f$ drawn at each point]
- The solution must be on the constraint
- To reduce the value of $f$, we must move in a direction opposite to the gradient $\nabla f$
- We can keep moving along the constraint as long as $\nabla f$ has a component along it
- At the optimum, the gradient of the function is parallel to the gradient of the constraint
Lagrange multipliers
- At the optimum, we must have: $\nabla f \parallel \nabla \omega$
- Which can be expressed as: $\nabla f + \lambda \nabla \omega = 0$
- In terms of the co-ordinates:
  $$\frac{\partial f}{\partial x_1} + \lambda \frac{\partial \omega}{\partial x_1} = 0 \qquad \frac{\partial f}{\partial x_2} + \lambda \frac{\partial \omega}{\partial x_2} = 0$$
- The constraint must also be satisfied: $\omega(x_1, x_2) = 0$
- $\lambda$ is called the Lagrange multiplier
- These conditions are the derivatives of the Lagrangian function $\ell(x_1, x_2, \lambda) = f(x_1, x_2) + \lambda\,\omega(x_1, x_2)$:
  $$\frac{\partial \ell(x_1, x_2, \lambda)}{\partial x_1} = \frac{\partial f}{\partial x_1} + \lambda \frac{\partial \omega}{\partial x_1} = 0$$
  $$\frac{\partial \ell(x_1, x_2, \lambda)}{\partial x_2} = \frac{\partial f}{\partial x_2} + \lambda \frac{\partial \omega}{\partial x_2} = 0$$
  $$\frac{\partial \ell(x_1, x_2, \lambda)}{\partial \lambda} = \omega(x_1, x_2) = 0$$

Example
- Minimise $f(x_1, x_2) = 0.25 x_1^2 + x_2^2$ subject to $\omega(x_1, x_2) = 5 - x_1 - x_2 = 0$
- Lagrangian: $\ell(x_1, x_2, \lambda) = 0.25 x_1^2 + x_2^2 + \lambda (5 - x_1 - x_2)$
- Optimality conditions:
  $$\frac{\partial \ell}{\partial x_1} = 0.5 x_1 - \lambda = 0 \;\Rightarrow\; x_1 = 2\lambda$$
  $$\frac{\partial \ell}{\partial x_2} = 2 x_2 - \lambda = 0 \;\Rightarrow\; x_2 = \frac{\lambda}{2}$$
  $$\frac{\partial \ell}{\partial \lambda} = 5 - x_1 - x_2 = 0 \;\Rightarrow\; 5 - \frac{5}{2}\lambda = 0$$
- Hence $\lambda = 2$, $x_1 = 4$, $x_2 = 1$
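The same three optimality conditions can be handed to sympy as a cross-check of the worked example:

```python
# Cross-check of the worked example: solve the three optimality conditions of the
# Lagrangian for x1, x2 and lambda.
import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lambda", real=True)
f = sp.Rational(1, 4) * x1**2 + x2**2      # objective function
w = 5 - x1 - x2                            # equality constraint
lagrangian = f + lam * w

conditions = [sp.diff(lagrangian, v) for v in (x1, x2, lam)]
solution = sp.solve(conditions, [x1, x2, lam], dict=True)[0]
print(solution)        # {x1: 4, x2: 1, lambda: 2}
```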
Example
- Minimise $f(x_1, x_2) = 0.25 x_1^2 + x_2^2$ subject to $\omega(x_1, x_2) = 5 - x_1 - x_2 = 0$
- [Figure: contours of $f$ in the $(x_1, x_2)$ plane and the constraint line $x_1 + x_2 = 5$; the minimum is at $x_1 = 4$, $x_2 = 1$, where $f(x_1, x_2) = 5$]

Application to Economic Dispatch
- [Figure: units G1 and G2 with outputs $x_1$, $x_2$ supplying a load $L$]
- minimise $f(x_1, x_2) = C_1(x_1) + C_2(x_2)$
- s.t. $\omega(x_1, x_2) = L - x_1 - x_2 = 0$
- Lagrangian: $\ell(x_1, x_2, \lambda) = C_1(x_1) + C_2(x_2) + \lambda (L - x_1 - x_2)$
- Optimality conditions:
  $$\frac{\partial \ell}{\partial x_1} = \frac{dC_1}{dx_1} - \lambda = 0 \qquad \frac{\partial \ell}{\partial x_2} = \frac{dC_2}{dx_2} - \lambda = 0$$
- Hence:
  $$\frac{dC_1}{dx_1} = \frac{dC_2}{dx_2} = \lambda$$
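A sketch of these conditions for the quadratic costs used earlier ($C_i = a_i + b_i x_i^2$), solved symbolically; the resulting dispatch matches the substitution result:

```python
# Economic dispatch from the equal-incremental-cost conditions:
# dC1/dx1 = dC2/dx2 = lambda together with the power balance x1 + x2 = L.
import sympy as sp

x1, x2, lam, L, a1, a2, b1, b2 = sp.symbols("x1 x2 lambda L a1 a2 b1 b2", positive=True)
C1 = a1 + b1 * x1**2
C2 = a2 + b2 * x2**2

conditions = [sp.diff(C1, x1) - lam,      # dC1/dx1 = lambda
              sp.diff(C2, x2) - lam,      # dC2/dx2 = lambda
              L - x1 - x2]                # power balance
solution = sp.solve(conditions, [x1, x2, lam], dict=True)[0]

print(sp.simplify(solution[x1]))    # b2*L/(b1 + b2)
print(sp.simplify(solution[x2]))    # b1*L/(b1 + b2)
print(sp.simplify(solution[lam]))   # 2*b1*b2*L/(b1 + b2)
```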
Interpretation of this solution
- $$\frac{dC_1}{dx_1} = \frac{dC_2}{dx_2} = \lambda$$
- [Figure: incremental cost curves of the two units plotted against $x_1$ and $x_2$; a horizontal line at the value $\lambda$ intersects them at $x_1^*$ and $x_2^*$]
- For a trial value of $\lambda$, read the outputs $x_1^*$ and $x_2^*$ from the incremental cost curves and compare $x_1^* + x_2^*$ with the load $L$:
  - If $L - (x_1^* + x_2^*) < 0$, reduce $\lambda$
  - If $L - (x_1^* + x_2^*) > 0$, increase $\lambda$

Physical interpretation
- $\dfrac{dC_1}{dx_1}$: cost of one more MW from unit 1
- $\dfrac{dC_2}{dx_2}$: cost of one more MW from unit 2
- Suppose that $\dfrac{dC_1}{dx_1} > \dfrac{dC_2}{dx_2}$
- Decrease the output of unit 1 by 1 MW: decrease in cost $= -\dfrac{dC_1}{dx_1}$
- Increase the output of unit 2 by 1 MW: increase in cost $= +\dfrac{dC_2}{dx_2}$
- Net change in cost $= \dfrac{dC_2}{dx_2} - \dfrac{dC_1}{dx_1} < 0$
- The total cost decreases, so it pays to shift generation until the incremental costs are equal
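The adjustment rule in the figure can be turned into a simple lambda iteration; the sketch below uses made-up quadratic cost coefficients and bisects on $\lambda$ until the outputs match the load:

```python
# Lambda iteration: pick a value of lambda, read each unit's output from
# dCi/dxi = lambda, and adjust lambda until the total output matches the load.
# Cost coefficients and load are made up.
b1, b2 = 0.010, 0.015        # Ci = ai + bi*xi^2, so dCi/dxi = 2*bi*xi
L = 500.0                    # load in MW

def output(lam, b):
    """Output at which the unit's incremental cost equals lambda."""
    return lam / (2.0 * b)

lam_low, lam_high = 0.0, 100.0     # bracketing values of lambda in $/MWh
for _ in range(60):                # bisection on lambda
    lam = 0.5 * (lam_low + lam_high)
    mismatch = L - (output(lam, b1) + output(lam, b2))
    if mismatch > 0:               # not enough generation: increase lambda
        lam_low = lam
    else:                          # too much generation: reduce lambda
        lam_high = lam

print(f"lambda = {lam:.3f} $/MWh, x1 = {output(lam, b1):.1f} MW, "
      f"x2 = {output(lam, b2):.1f} MW")
```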
Generalization
- Minimise $f(x_1, x_2, \ldots, x_n)$
- subject to:
  $$\omega_1(x_1, x_2, \ldots, x_n) = 0 \quad \cdots \quad \omega_m(x_1, x_2, \ldots, x_n) = 0$$
- Lagrangian:
  $$\ell = f(x_1, \ldots, x_n) + \lambda_1\, \omega_1(x_1, \ldots, x_n) + \cdots + \lambda_m\, \omega_m(x_1, \ldots, x_n)$$

Optimality conditions
- $$\frac{\partial \ell}{\partial x_1} = \frac{\partial f}{\partial x_1} + \lambda_1 \frac{\partial \omega_1}{\partial x_1} + \cdots + \lambda_m \frac{\partial \omega_m}{\partial x_1} = 0$$
  $$\vdots$$
  $$\frac{\partial \ell}{\partial x_n} = \frac{\partial f}{\partial x_n} + \lambda_1 \frac{\partial \omega_1}{\partial x_n} + \cdots + \lambda_m \frac{\partial \omega_m}{\partial x_n} = 0$$
  ($n$ equations)
- $$\frac{\partial \ell}{\partial \lambda_1} = \omega_1(x_1, \ldots, x_n) = 0$$
  $$\vdots$$
  $$\frac{\partial \ell}{\partial \lambda_m} = \omega_m(x_1, \ldots, x_n) = 0$$
  ($m$ equations)
- $n + m$ equations in $n + m$ variables
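A minimal sketch of the general case (the objective and constraints below are made up): build the Lagrangian for $n = 3$ variables and $m = 2$ constraints and solve the $n + m$ optimality conditions.

```python
# General equality-constrained problem: n = 3 decision variables, m = 2 constraints.
# The n + m optimality conditions are solved for the n + m unknowns.
import sympy as sp

x1, x2, x3, l1, l2 = sp.symbols("x1 x2 x3 lambda1 lambda2", real=True)

f = x1**2 + 2 * x2**2 + 3 * x3**2          # objective function (made up)
w1 = x1 + x2 + x3 - 10                     # equality constraint 1: w1 = 0
w2 = x1 - x2 - 2                           # equality constraint 2: w2 = 0

lagrangian = f + l1 * w1 + l2 * w2
unknowns = [x1, x2, x3, l1, l2]                            # n + m = 5 unknowns
conditions = [sp.diff(lagrangian, u) for u in unknowns]    # n + m = 5 equations

solution = sp.solve(conditions, unknowns, dict=True)[0]
print(solution)
```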