Lecture 3: Hessian Matrix and Conditions for Max and Min

The document discusses second-order derivatives of multivariable functions and conditions for local maxima and minima. It provides examples of calculating Hessian matrices and classifying critical points as local minima, maxima, or saddle points based on the definiteness of the Hessian matrix.

AIS323

Spring 2023

Lecture 3
• Second order derivative of multivariable functions
• Conditions for local max and min

•Michel Bierlaire, Optimization: Principles and Algorithms, 2nd edition, EPFL Press, 2018.
•Mykel J. Kochenderfer and Tim A. Wheeler, Algorithms for Optimization, MIT Press, 2019.
•S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004.
Differentiability: Second order

• The Hessian matrix is the derivative of the gradient (the matrix of second partial derivatives).

• It gives information about the concavity (convexity) of the function.
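
For reference, for a twice-differentiable f: ℝⁿ → ℝ the Hessian collects all second partial derivatives and can be written as:

\[
\nabla^2 f(x) =
\begin{pmatrix}
\dfrac{\partial^2 f}{\partial x_1^2} & \cdots & \dfrac{\partial^2 f}{\partial x_1 \partial x_n} \\
\vdots & \ddots & \vdots \\
\dfrac{\partial^2 f}{\partial x_n \partial x_1} & \cdots & \dfrac{\partial^2 f}{\partial x_n^2}
\end{pmatrix}
\]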

Second Order Differentiability: Examples

Consider the function f(x, y) = 2x² + 9y².

∇f(x, y) = (4x, 18y)ᵀ

Hessian = ∇²f(x, y) =
[ 4    0 ]
[ 0   18 ]
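
A minimal sketch of this computation in Python, assuming SymPy is installed and using the function f(x, y) = 2x² + 9y² from above:

import sympy as sp

x, y = sp.symbols('x y')
f = 2*x**2 + 9*y**2                      # function consistent with the gradient above

grad = [sp.diff(f, v) for v in (x, y)]   # gradient: [4*x, 18*y]
hess = sp.hessian(f, (x, y))             # Hessian: Matrix([[4, 0], [0, 18]])

print(grad)
print(hess)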
Second Order Differentiability: Examples

1. ∇²f(x, y) is positive definite.

Multidimensional Optimization Problem: Conditions for local minima

• ∇f(x*) = 0 tells us that the function is not changing at x*.
• ∇²f(x*) being positive definite tells us that x* is in a bowl (so it is a local minimum).

Positive definite matrix: Definitions

• All leading principal minors must be positive: break the matrix A into sub-matrices by progressively taking upper-left elements; if the determinants of all these upper-left sub-matrices are greater than zero, then A is positive definite.

det(A₁) > 0, det(A₂) > 0, ..., det(Aₙ) > 0

• Positive definite symmetric matrices have the property that all their eigenvalues are positive (> 0).

• xᵀAx > 0, for all x other than the zero vector.
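
A small sketch of these checks, assuming NumPy; the helper names here (classify_definiteness, leading_minors) are illustrative choices, not standard library functions:

import numpy as np

def classify_definiteness(A, tol=1e-12):
    # Classify a symmetric matrix by the signs of its eigenvalues.
    eigvals = np.linalg.eigvalsh(A)
    if np.all(eigvals > tol):
        return "positive definite"
    if np.all(eigvals < -tol):
        return "negative definite"
    if np.any(eigvals > tol) and np.any(eigvals < -tol):
        return "indefinite"
    return "semidefinite / inconclusive"

def leading_minors(A):
    # Determinants of the upper-left k-by-k sub-matrices, k = 1..n.
    return [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]

H = np.array([[4.0, 0.0], [0.0, 18.0]])
print(classify_definiteness(H))   # positive definite
print(leading_minors(H))          # [4.0, 72.0] -- all positive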

Negative definite matrix: Definitions

• The leading principal minors alternate in sign, starting negative:

det(A₁) < 0, det(A₂) > 0, det(A₃) < 0, ...

• xᵀAx < 0, for all x other than the zero vector.

• Negative definite symmetric matrices have the property that all their eigenvalues are negative (< 0).

Examples

Classify the following matrices as positive definite, negative definite, indefinite or unknown:
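
For exercises of this kind, one quick numerical check is to look at the eigenvalues; a sketch assuming NumPy, with illustrative matrices chosen here (they are assumptions, not the matrices from the slide):

import numpy as np

A = np.array([[2.0, -1.0], [-1.0, 2.0]])   # eigenvalues 1 and 3    -> positive definite
B = np.array([[-3.0, 0.0], [0.0, -5.0]])   # eigenvalues -5 and -3  -> negative definite
C = np.array([[1.0, 2.0], [2.0, 1.0]])     # eigenvalues -1 and 3   -> indefinite

for name, M in (("A", A), ("B", B), ("C", C)):
    print(name, np.linalg.eigvalsh(M))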
Second Order Differentiability: Examples (continued)

For f(x, y) = 2x² + 9y², the Hessian

∇²f(x, y) =
[ 4    0 ]
[ 0   18 ]

is a positive definite Hessian matrix (its leading minors 4 and 72 are positive, and its eigenvalues 4 and 18 are positive).


Multivariable optimization problem: Analytical Solution

• Find all the stationary points, i.e. the points x* satisfying ∇f(x*) = 0.

• Find the Hessian matrix ∇²f(x).

• Check its value at the stationary points, and classify them as min, max, or saddle according to the following theorem (a code sketch of this procedure follows the examples below):
Two-dimensional function: Simplified conditions for min, max and saddle points
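
For reference, the standard simplified second-derivative test at a stationary point (a, b) of a twice continuously differentiable two-variable function can be stated as:

\[
D = f_{xx}(a,b)\, f_{yy}(a,b) - f_{xy}(a,b)^2
\]
\[
\begin{cases}
D > 0,\ f_{xx}(a,b) > 0 & \Rightarrow \text{local minimum} \\
D > 0,\ f_{xx}(a,b) < 0 & \Rightarrow \text{local maximum} \\
D < 0 & \Rightarrow \text{saddle point} \\
D = 0 & \Rightarrow \text{test inconclusive}
\end{cases}
\]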
Second Order Differentiability: Examples

Find the stationary points of the function and state whether they are min, max, or saddle points.

∇f = 0 at (1, 1).
The Hessian at (1, 1) is positive definite.

So (1, 1) is a local min.


Second Order Differentiability: Examples

Find the stationary points of the function, and state whether they are min, max, or saddle points.

H =

> 0, < 0  ⇒  H is indefinite, so the stationary point is a saddle point.
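
A minimal sketch of the full analytical procedure with SymPy, using a hypothetical function f(x, y) = x³ + y³ − 3xy chosen here for illustration (an assumption, not necessarily the function on the slides); it has a saddle at (0, 0) and a local minimum at (1, 1):

import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**3 + y**3 - 3*x*y                 # hypothetical function chosen for illustration

grad = [sp.diff(f, x), sp.diff(f, y)]   # gradient components
H = sp.hessian(f, (x, y))               # Hessian matrix

# 1. Stationary points: solve grad f = 0.
stationary = sp.solve(grad, (x, y), dict=True)

# 2./3. Evaluate the Hessian at each stationary point and classify it.
for pt in stationary:
    eigs = list(H.subs(pt).eigenvals().keys())
    if all(e > 0 for e in eigs):
        kind = "local min"
    elif all(e < 0 for e in eigs):
        kind = "local max"
    elif any(e > 0 for e in eigs) and any(e < 0 for e in eigs):
        kind = "saddle"
    else:
        kind = "inconclusive"
    print(pt, eigs, kind)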
