Lecture NLP 1 (Part 2)
1.7 CONSTRAINED PROBLEMS
1.7.1 Multivariable Optimization With
Equality Constraints
We consider the optimization of continuous functions subject to equality constraints:
Minimize f(X)
subject to: gj(X) = 0,  j = 1, 2, . . ., m
where X = (x1, x2, . . ., xn)T and m ≤ n.
1.7.1.1 Solution by Direct Substitution
The constraint in the original problem has now been eliminated, and f(x2) is an unconstrained function of one independent variable. We can now minimize the new objective function by setting the first derivative of f with respect to x2 equal to zero and solving for the optimal value of x2:
[Excel worksheet formulas from the example: B3: =D3^2, C3: =E3^2, F4: =SUMPRODUCT(B4:C4;B3:C3), F6: =SUMPRODUCT(D6:E6;D3:E3)]
Example 2
The profit analysis model:
Maximize the profit: z = v·p – cf – v·cv ………… (1)
The demand is represented by: v = 1,500 – 24.6p ………… (2)
where: v = volume (quantity), p = price,
cf = fixed cost = $10,000, cv = variable cost = $8 per unit.
Substituting the values of cf and cv into (1), we obtain:
z = v·p – 10,000 – 8v ………… (3)
Substituting (2) in (3):
z = 1,500p – 24.6p² – 10,000 – 8(1,500 – 24.6p)
z = 1696.8p – 24.6p² – 22,000 ………… (4)
dz/dp = 1696.8 – 49.2p = 0 for the critical points, then: p* = 34.49
d²z/dp² = –49.2 (negative), then p* is a local maximum.
Substituting in (2): v* = 1500 – 24.6(34.49) = 651.55
Substituting in (3): zmax = (651.55)(34.49) – 10,000 – 8(651.55) = 7259.56
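The hand calculation above can be checked numerically. The short Python script below (added here, not part of the original slides) recomputes p*, v*, and zmax from equations (2)-(4); carrying exact values gives zmax ≈ 7259.45, while the slide's 7259.56 reflects rounding p* to 34.49 first.

```python
# Check of Example 2: z = v*p - cf - cv*v with v = 1500 - 24.6p,
# cf = 10,000 and cv = 8 (equations (1)-(3) in the text).

def profit(p):
    v = 1500 - 24.6 * p           # demand, equation (2)
    return v * p - 10000 - 8 * v  # profit, equation (3)

# dz/dp = 1696.8 - 49.2p = 0 at the critical point
p_star = 1696.8 / 49.2
v_star = 1500 - 24.6 * p_star
z_max = profit(p_star)

print(round(p_star, 2), round(v_star, 2), round(z_max, 2))
# 34.49 651.6 7259.45
```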
1.7.1.2 Lagrange Method
Example
Find the solution of the following problem using the
Lagrange multiplier method:
f(x,y) = x⁻¹y⁻²
Subject to: g(x,y) = x² + y² – 4 = 0
The Lagrange function is:
L(x,y,λ) = f(x,y) + λg(x,y) = x⁻¹y⁻² + λ(x² + y² – 4)
The necessary conditions for the extreme of f(x, y) give:
∂L/∂x = –x⁻²y⁻² + 2λx = 0 ……. (1)  →  λ = (1/2)x⁻³y⁻² ……… (4)
∂L/∂y = –2x⁻¹y⁻³ + 2λy = 0 ……. (2)  →  λ = x⁻¹y⁻⁴ ……… (5)
∂L/∂λ = x² + y² – 4 = 0 ……. (3)
From (4), (5): (1/2)x⁻³y⁻² = x⁻¹y⁻⁴  →  (1/2)y⁻² = x²y⁻⁴  →  (1/2)y² = x²
x* = y*/√2, or: x² = (1/2)y² ………… (6)
Substituting (6) in (3): (1/2)y² + y² = 4  →  y² = 8/3  →  y* = 2√(2/3)
Substituting in (6): x* = 2/√3
From (5): λ* = x⁻¹y⁻⁴ = (√3/2)(9/64) = 9√3/128
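As a numerical cross-check (added here, not in the original slides), the candidate point and multiplier can be substituted back into the necessary conditions (1)-(3):

```python
import math

# f(x, y) = x^-1 y^-2, constraint g(x, y) = x^2 + y^2 - 4 = 0
x = 2 / math.sqrt(3)          # x* = 2/sqrt(3)
y = 2 * math.sqrt(2 / 3)      # y* = 2*sqrt(2/3)
lam = 9 * math.sqrt(3) / 128  # lambda* = 9*sqrt(3)/128

eq1 = -x**-2 * y**-2 + 2 * lam * x      # condition (1), dL/dx
eq2 = -2 * x**-1 * y**-3 + 2 * lam * y  # condition (2), dL/dy
eq3 = x**2 + y**2 - 4                   # condition (3), the constraint

print(abs(eq1) < 1e-12, abs(eq2) < 1e-12, abs(eq3) < 1e-12)
# True True True
```

All three conditions vanish to within floating-point error, confirming the hand derivation.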
[Sufficiency check: the determinant equation in z, evaluated by cofactor expansion after the row operation R2 → R2 + 8R1, reduces to –3[1(1) – 2(0)] = –3(1) = –3. Computed with http://matrix.reshish.com/determinant.php]
Necessary Conditions for a General Problem
This equation, on expansion, leads to an (n – m)th-order polynomial in z. If some of the roots of this polynomial are positive while the others are negative, the point X* is not an extreme point; if all the roots are negative, X* is a relative maximum, and if all are positive, X* is a relative minimum.
1.7.2 Multivariable Optimization With
Inequality Constraints
Consider the following problem: Minimize f(X)
subject to: gj(X) ≤ 0, j = 1,2,. . .,m
Kuhn-Tucker Conditions
These are the conditions to be satisfied at a constrained minimum point X*. They are, in general, not sufficient to ensure a relative minimum. However, there is a class of problems, called convex programming problems, for which the Kuhn-Tucker conditions are necessary and sufficient for a global minimum.
Kuhn-Tucker Conditions
The Kuhn-Tucker conditions can be stated as follows:
∂f/∂xi + Σj λj ∂gj/∂xi = 0,  i = 1, 2, . . ., n
λj gj(X) = 0,  j = 1, 2, . . ., m
gj(X) ≤ 0,  j = 1, 2, . . ., m
λj ≥ 0,  j = 1, 2, . . ., m
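As a concrete illustration (a toy example added here, not from the slides): for minimize f(x) = x² subject to g(x) = 1 – x ≤ 0, the solution x* = 1 with multiplier λ* = 2 satisfies all four conditions.

```python
# Toy Kuhn-Tucker check: minimize f(x) = x^2 subject to g(x) = 1 - x <= 0.
# Candidate solution (hand-derived): x* = 1, lambda* = 2.
x_star, lam = 1.0, 2.0

stationarity = 2 * x_star + lam * (-1.0)  # df/dx + lambda * dg/dx
g = 1 - x_star                            # constraint value

print(stationarity == 0.0,  # gradient condition
      lam * g == 0.0,       # complementary slackness
      g <= 0.0,             # feasibility
      lam >= 0.0)           # nonnegative multiplier
# True True True True
```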
Max: 28X1 + 21X2 + 0.25X2²
subject to: X1 + X2 ≤ 1000
0.5X1 + 0.4X2 ≤ 500
Solution to Great Western Appliance's NLP Problem using Excel Solver:
Both Nonlinear Objective Function and Nonlinear
Constraints.
An Excel Formulation of Hospicare Corp.'s NLP Problem:
8X1 – 2X2 ≤ 61
Linear Objective Function with Nonlinear
Constraints
Thermlock Corp. produces massive rubber washers and gaskets like the type used to seal joints on the NASA Space Shuttles. To do so, it combines two ingredients: rubber (X1) and oil (X2). The cost of the industrial-quality rubber used is $5 per pound, and the cost of the high-viscosity oil is $7 per pound. Two of the three constraints Thermlock faces are nonlinear. The firm's objective function and constraints are:
Solution to Thermlock's NLP Problem Using Excel Solver:
Computational Procedures - Nonlinear Programming
Unlike LP methods:
• One disadvantage of NLP is that the solution procedures
to solve nonlinear problems do not always yield an
optimal solution in a finite number of steps. The solution
yielded may only be a local optimum, rather than a
global optimum. In other words, it may be an optimum
over a particular range, but not overall.
• There is no general method for solving all nonlinear
problems.
• Classical optimization techniques based on calculus can handle some special cases, usually simpler types of problems.
Gradient method (steepest ascent method)
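A minimal sketch of the idea, applied to the profit function z(p) from equation (4) of Example 2 (the starting point, step size, and stopping tolerance are arbitrary choices for illustration, not from the slides):

```python
# Steepest ascent on z(p) = 1696.8p - 24.6p^2 - 22000 (equation (4)).

def dz(p):
    return 1696.8 - 49.2 * p  # gradient of z

p = 0.0       # starting guess (assumed)
step = 0.01   # fixed step size (assumed)
for _ in range(10000):
    grad = dz(p)
    if abs(grad) < 1e-6:      # stop when the gradient vanishes
        break
    p += step * grad          # move uphill along the gradient

print(round(p, 2))  # converges to p* = 34.49
```

Because z is concave in p, the single stationary point found here is the global maximum; on a non-concave function the same iteration can stall at a local optimum, which is the limitation noted above.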
Separable programming
▪ Linear representation of a nonlinear problem.
▪ Separable programming deals with a class of problems in which the objective and constraints are approximated by linear functions. In this way, the powerful simplex algorithm may again be applied.
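To illustrate the approximation step (an illustrative sketch, not an example from the slides): a nonlinear term such as x² can be replaced by a piecewise-linear interpolation between chosen breakpoints, after which every function in the model is linear.

```python
# Piecewise-linear approximation of x**2 over [0, 4] with integer breakpoints.
breaks = [0.0, 1.0, 2.0, 3.0, 4.0]
values = [b * b for b in breaks]  # exact values at the breakpoints

def pw_linear(x):
    """Interpolate linearly between the breakpoints surrounding x."""
    for b0, v0, b1, v1 in zip(breaks, values, breaks[1:], values[1:]):
        if b0 <= x <= b1:
            t = (x - b0) / (b1 - b0)
            return v0 + t * (v1 - v0)
    raise ValueError("x outside breakpoint range")

print(pw_linear(2.5))  # 6.5, versus the exact 2.5**2 = 6.25
```

More breakpoints tighten the approximation; in a separable program, the simplex method then optimizes over weights attached to these breakpoints.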
In general, work in the area of
NLP is the least charted and
the most difficult of all the
quantitative analysis models.