
TEXAS A&M UNIVERSITY

MEEN 683 Multidisciplinary System Design Optimization (MSADO)


Spring 2021

Assignment 4, Part A
Please submit individually and include any code created for the assignment.
Topics: Gradient-based Optimization

Part (a)
Comparison of Optimization Algorithms
Consider the following three optimization problems:

The Rosenbrock Function


This function is known as the “banana function” because of its shape; it is described mathematically in Equation 1. In this problem, there are two design variables with lower and upper limits of [−5, 5]. The Rosenbrock function has a known global minimum at [1, 1] with an optimal function value of zero.

Minimize f(x) = 100(x₂ − x₁²)² + (1 − x₁)²   (1)
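As an illustrative sketch (not a required approach), Equation 1 can be minimized with a gradient-based routine such as SciPy's L-BFGS-B; any comparable gradient-based method would do:

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Equation (1): f(x) = 100*(x2 - x1^2)^2 + (1 - x1)^2
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

# L-BFGS-B is a gradient-based method that honors the [-5, 5] bounds
result = minimize(rosenbrock, x0=[-1.2, 1.0], method="L-BFGS-B",
                  bounds=[(-5, 5), (-5, 5)])
print(result.x, result.fun)  # approaches [1, 1] with objective near zero
```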
The Eggcrate Function
This function is described mathematically in Equation 2. In this problem, there are two design
variables with lower and upper bounds of [−2π, 2π]. The Eggcrate function has a known global
minimum at [0, 0] with an optimal function value of zero.

Minimize f(x) = x₁² + x₂² + 25(sin²x₁ + sin²x₂)   (2)
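A hedged illustration of why the starting point matters here: the Eggcrate function is multimodal, so a gradient-based search (again sketched with SciPy's L-BFGS-B) reaches [0, 0] only from some starts:

```python
import numpy as np
from scipy.optimize import minimize

def eggcrate(x):
    # Equation (2): f(x) = x1^2 + x2^2 + 25*(sin^2 x1 + sin^2 x2)
    return x[0] ** 2 + x[1] ** 2 + 25.0 * (np.sin(x[0]) ** 2 + np.sin(x[1]) ** 2)

bounds = [(-2 * np.pi, 2 * np.pi)] * 2

# a start near the origin descends to the global minimum at [0, 0] ...
good = minimize(eggcrate, x0=[0.5, -0.5], method="L-BFGS-B", bounds=bounds)

# ... while a distant start gets trapped in one of the many local minima
stuck = minimize(eggcrate, x0=[5.0, 5.0], method="L-BFGS-B", bounds=bounds)
```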

Golinski’s Speed Reducer


This hypothetical problem represents the design of a simple gearbox such as might be used in a light airplane between the engine and propeller to allow each to rotate at its most efficient speed. The gearbox is depicted in Figure 1 with its seven design variables labeled. The objective is to minimize the speed reducer’s weight while satisfying the 11 constraints imposed by gear and shaft design practices. A full problem description can be found in: Ray, T., “Golinski’s Speed Reducer Problem Revisited,” AIAA Journal, Vol. 41, No. 3, 2003, pp. 556–558. A known feasible solution obtained by a sequential quadratic programming (SQP) approach (Matlab’s fmincon) is a 2994.34 kg gearbox with the following values for the seven design variables: [3.5000, 0.7000, 17, 7.3000, 7.7153, 3.3502, 5.2867]. This is a feasible solution with four active constraints, but is it an optimal solution?
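For orientation, the quoted weight can be checked against the commonly cited algebraic form of the objective (the form used in the Ray reference; the 11 constraints are omitted here for brevity, so this is a sanity check rather than a full problem statement):

```python
def golinski_weight(x):
    """Objective (weight) of Golinski's speed reducer in its commonly
    cited form; the 11 design constraints are not encoded here."""
    x1, x2, x3, x4, x5, x6, x7 = x
    return (0.7854 * x1 * x2 ** 2 * (3.3333 * x3 ** 2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6 ** 2 + x7 ** 2)
            + 7.477 * (x6 ** 3 + x7 ** 3)
            + 0.7854 * (x4 * x6 ** 2 + x5 * x7 ** 2))

# the SQP solution quoted above
x_sqp = [3.5000, 0.7000, 17, 7.3000, 7.7153, 3.3502, 5.2867]
print(round(golinski_weight(x_sqp), 2))  # close to the quoted 2994.34
```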

Your tasks: Numerically find the minimum (= optimal) feasible design vector x for each of the above three problems using a gradient-based search technique of your choice (e.g., steepest descent, SQP, ...). For each run, record the starting point you used, the iteration history (objective value on the y-axis and iteration number on the x-axis), the final point at which the algorithm terminated, and whether or not the final solution is feasible. Do 10 runs for each problem.

Figure 1: Golinski’s speed reducer with 7 design variables

Discuss the results and insights you get from numerically solving these three nonlinear optimization problems.
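One possible way to organize the ten runs is sketched below with SciPy, recording a random starting point and the objective history per run (Rosenbrock shown; the same pattern would apply to the other two problems):

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

rng = np.random.default_rng(0)   # seeded so the runs are repeatable
finals = []
for run in range(10):
    x0 = rng.uniform(-5, 5, size=2)          # random starting point
    history = [rosenbrock(x0)]               # objective value per iteration
    res = minimize(rosenbrock, x0, method="L-BFGS-B",
                   bounds=[(-5, 5)] * 2,
                   callback=lambda xk: history.append(rosenbrock(xk)))
    finals.append(res.fun)
    # "history" can now be plotted: iteration number (x) vs. objective (y)
```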
Part (b)
(b-1) Scaling

Use a gradient-based algorithm as in Assignment 3 for the question below (pick one algorithm).
The current optimal solution x∗ refers to the “optimal” solution in your project found using this
algorithm in Assignment 3.

(b-1.1) Compute the n diagonal entries of the Hessian at your current optimal solution, H(x∗ ). Use
finite differencing to evaluate these entries. (Note: if you have a large number of design variables,
you may do these steps for any 10 design variables.)
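A minimal sketch of the finite differencing, using a central second difference for each diagonal entry (the step size h is an assumption you would tune for your own model):

```python
import numpy as np

def hessian_diagonal(f, x, h=1e-4):
    """Central-difference estimate of the Hessian diagonal H(i, i) at x."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    diag = np.empty(x.size)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        # second-order central difference: (f(x+h e_i) - 2 f(x) + f(x-h e_i)) / h^2
        diag[i] = (f(x + e) - 2.0 * fx + f(x - e)) / h ** 2
    return diag

# quick check on a quadratic whose true Hessian diagonal is [2, 200]
f = lambda x: x[0] ** 2 + 100.0 * x[1] ** 2
print(hessian_diagonal(f, [1.0, 1.0]))  # close to [2, 200]
```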

(b-1.2) If any of the entries computed in (b-1.1) are greater than 10² or less than 10⁻², then the corresponding design variable should be scaled. Note that H(i, i) ∼ 1/xᵢ². Thus, if H(i, i) ∼ 10⁻⁴, then the appropriate scaling of the design variable xᵢ is 10⁻² xᵢ. Compute the scaling required for each design variable to make H(i, i) ∼ O(1). Consider only scalings of the form 10⁻², 10⁻¹, 10¹, 10², etc. (i.e., worry about magnitude only).
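The magnitude-only rule can be sketched as follows (power_of_ten_scaling is a hypothetical helper name, not part of the assignment):

```python
import math

def power_of_ten_scaling(h_diag):
    """Return, for each Hessian diagonal entry, the power-of-ten factor c_i
    such that the scaled variable c_i * x_i yields H(i, i) ~ O(1).
    Since H(i, i) ~ 1/x_i^2, scaling x_i by c_i divides H(i, i) by c_i^2,
    so c_i ~ sqrt(H(i, i)), rounded to the nearest power of ten."""
    return [10.0 ** round(0.5 * math.log10(h)) for h in h_diag]

# e.g. H(i, i) ~ 1e-4 gives the factor 10^-2 mentioned in (b-1.2)
print(power_of_ten_scaling([1e-4, 1.0, 1e4]))  # [0.01, 1.0, 100.0]
```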

(b-1.3) Redefine your design variables using the scalings computed in (b-1.2) and re-run the optimizer, starting from the previous “optimal” solution x∗. Do you see any change in the optimal solution?

Please note: If you do not have any continuous design variables in your problem, please instead
analyze the scaling of your constraints. Study whether scaling certain constraints in your problem
affects convergence or convergence rates.

(b-2) Heuristic Optimization

In Assignment 3 you used a gradient search technique to optimize the system for your design
project. In this assignment you are to apply a heuristic technique (SA, GA, PSO, or Tabu search)
to your design project.
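As one hedged illustration of an SA-style method, SciPy's dual_annealing (a generalized simulated annealing routine) can be exercised on a test function such as the Eggcrate before being pointed at the project model:

```python
import numpy as np
from scipy.optimize import dual_annealing

def eggcrate(x):
    # Eggcrate test function from Part (a); replace with your project model
    return x[0] ** 2 + x[1] ** 2 + 25.0 * (np.sin(x[0]) ** 2 + np.sin(x[1]) ** 2)

bounds = [(-2 * np.pi, 2 * np.pi)] * 2
# seed fixed for repeatability; maxiter is one of the tunable parameters in (b-2.3)
result = dual_annealing(eggcrate, bounds, seed=1, maxiter=500)
print(result.x, result.fun)  # lands near the global minimum [0, 0]
```

Unlike the gradient runs in Part (a), the annealing search escapes the local minima that trap L-BFGS-B from distant starting points.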

(b-2.1) Describe which technique you chose and why.

(b-2.2) Attempt to optimize the system and compare the answers with the answers you received
with the gradient search technique.

(b-2.3) Tune the parameters of the algorithm and observe the differences in behavior. Which algorithm tuning parameter settings appear to be best for your project?

(b-2.4) How confident are you that you have found the true global optimum?
