Multiple-Choice Test

Chapter 09.04 Multidimensional Gradient Method Optimization

COMPLETE SOLUTION SET

1. Which of the following is incorrect?


(A) Direct search methods are useful when the optimization function is not differentiable.
(B) The gradient of $f(x, y)$ is a vector pointing in the direction of the steepest slope at that point.
(C) The Hessian is the Jacobian matrix of second-order partial derivatives of a function.
(D) The second derivative of the optimization function is used to determine if we have reached an optimal point.

Solution
The correct answer is (D).

The statement “The second derivative of the optimization function is used to determine if we have reached an optimal point” is incorrect. The second derivative tells us whether the optimum we have reached is a maximum or a minimum.
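
For instance (a one-variable illustration, not part of the original test), the first derivative locates a critical point and the second derivative classifies it:
$$f(x) = x^2: \qquad f'(0) = 0 \text{ (critical point)}, \qquad f''(0) = 2 > 0 \text{ (minimum)}.$$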
2. An initial estimate of an optimal solution is given to be used in conjunction with the
steepest ascent method to determine the maximum of the function. Which of the
following statements is correct?
(A) The function to be optimized must be differentiable.
(B) If the initial estimate is different than the optimal solution, then the magnitude of the
gradient is nonzero.
(C) As more iterations are performed, the function values of the solutions at the end of
each subsequent iteration must be increasing.
(D) All 3 statements are correct.

Solution
The correct answer is (D). The steepest ascent method relies on the gradient, so the function must be differentiable; if the current estimate differs from the optimal solution, the gradient there is nonzero; and because each iteration moves along the gradient to a higher function value, the function values at the end of subsequent iterations must be increasing.
3. What are the gradient and the determinant of the Hessian of the function $f(x, y) = x^2 y^2$ at its global optimum?
(A) $\nabla f = 0\mathbf{i} + 0\mathbf{j}$ and $|H| > 0$
(B) $\nabla f = 0\mathbf{i} + 0\mathbf{j}$ and $|H| = 0$
(C) $\nabla f = 1\mathbf{i} + 1\mathbf{j}$ and $|H| < 0$
(D) $\nabla f = 1\mathbf{i} + 1\mathbf{j}$ and $|H| = 0$

Solution
The correct answer is (A).

When the global optimum is reached, travel in any direction would increase (for a minimum) or decrease (for a maximum) the function value; therefore the magnitude of the gradient must be zero. At the optimum, the determinant of the Hessian must be positive, so the correct answer is (A).
4. Determine the gradient of the function $x^2 - 2y^2 - 4y + 6$ at the point $(0, 0)$.
(A) $\nabla f = 2\mathbf{i} - 4\mathbf{j}$
(B) $\nabla f = 0\mathbf{i} - 4\mathbf{j}$
(C) $\nabla f = 0\mathbf{i} + 0\mathbf{j}$
(D) $\nabla f = -4\mathbf{i} - 4\mathbf{j}$

Solution
The correct answer is (B).
At the point $(0, 0)$, the partial derivatives are evaluated as
$$\frac{\partial f}{\partial x} = 2x = 2(0) = 0$$
$$\frac{\partial f}{\partial y} = -4y - 4 = -4(0) - 4 = -4$$
which are used to determine the gradient as
$$\nabla f = 0\mathbf{i} - 4\mathbf{j}$$
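
As a quick check, here is a minimal SymPy sketch of this calculation (the script and its variable names are illustrative, not part of the original solution set):

import sympy as sp

x, y = sp.symbols('x y')
f = x**2 - 2*y**2 - 4*y + 6

# Gradient: the vector of first-order partial derivatives
grad = [sp.diff(f, x), sp.diff(f, y)]

# Evaluate each component at the point (0, 0)
print([g.subs({x: 0, y: 0}) for g in grad])  # [0, -4]  ->  grad f = 0i - 4j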
5. Determine the determinant of the Hessian of the function $x^2 - 2y^2 - 4y + 6$ at the point $(0, 0)$.
(A) $2$
(B) $-4$
(C) $0$
(D) $-8$

Solution
The correct answer is (D).

To determine the Hessian, the second-order partial derivatives are evaluated as
$$\frac{\partial^2 f}{\partial x^2} = 2, \qquad \frac{\partial^2 f}{\partial y^2} = -4, \qquad \frac{\partial^2 f}{\partial y \partial x} = 0$$
The resulting Hessian matrix and its determinant are
$$H = \begin{bmatrix} 2 & 0 \\ 0 & -4 \end{bmatrix}, \qquad |H| = (2)(-4) - 0^2 = -8$$
6. Determine the minimum of the function $f(x, y) = x^2 + y^2$. Use the point $(2, 1)$ as the initial estimate of the optimal solution. Conduct one iteration.
(A) $(2, 1)$
(B) $(-6, -3)$
(C) $(0, 0)$
(D) $(1, -1)$

Solution
The correct answer is (C).
To calculate the gradient, the partial derivatives are evaluated as
$$\frac{\partial f}{\partial x} = 2x = 2(2) = 4$$
$$\frac{\partial f}{\partial y} = 2y = 2(1) = 2$$
which are used to determine the gradient at the point $(2, 1)$ as
$$\nabla f = 4\mathbf{i} + 2\mathbf{j}$$
Now the function $f(x, y)$ can be expressed along the direction of the gradient as
$$f\!\left(x_0 + \frac{\partial f}{\partial x} h,\; y_0 + \frac{\partial f}{\partial y} h\right) = f(2 + 4h,\, 1 + 2h) = (2 + 4h)^2 + (1 + 2h)^2$$
Multiplying out the terms, we obtain the one-dimensional function along the gradient as
$$g(h) = 20h^2 + 20h + 5$$
This is a simple function, and it is easy to determine $h^* = -\frac{1}{2}$ by taking the first derivative, $g'(h) = 40h + 20$, and solving for its root. This means that traveling a step size of $h = -\frac{1}{2}$ along the gradient reaches the minimum value of the function in this direction. This value is substituted back to calculate new values for $x$ and $y$ as follows:
$$x = 2 + 4(-0.5) = 0$$
$$y = 1 + 2(-0.5) = 0$$
Calculating the new values of $x$ and $y$ concludes the first iteration. Note that $f(0, 0) = 0$ is less than $f(2, 1) = 5$, which indicates a move in the right direction. This point is also the optimal solution.
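
For readers who want to reproduce this iteration numerically, here is a minimal NumPy sketch (the function and variable names are illustrative assumptions, not from the original solution set):

import numpy as np

def f(p):
    # Objective f(x, y) = x^2 + y^2
    return p @ p

def grad_f(p):
    # Analytical gradient (2x, 2y)
    return 2.0 * p

p0 = np.array([2.0, 1.0])   # initial estimate (2, 1)
g = grad_f(p0)              # gradient at (2, 1): [4., 2.]

# Exact line search along the gradient: g(h) = f(p0 + h*g) is quadratic in h
# and is minimized at h* = -(p0 . g) / (g . g) = -10/20 = -0.5,
# matching the hand calculation above.
h_star = -(p0 @ g) / (g @ g)

p1 = p0 + h_star * g        # one iteration: [0., 0.]
print(h_star, p1, f(p1))    # -0.5 [0. 0.] 0.0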
