PhD Questions 2


Q. Consider a nonlinear dynamical system governed by the following equations:

dx1/dt = −u^3,   x1(0) = 0
dx2/dt = x1 + u,   x2(0) = 0
u(t) ≥ 0 for all t ∈ [0, 1]

Let the admissible control u(t) ∈ U, with:

 Case (a): u(t) ≥ 0, unrestricted above,
 Case (b): u(t) ∈ [0, 0.1].

The time horizon is fixed over t ∈ [0, 1]. The objective is to maximize the final value of x2 at t = 1, i.e. to minimize J(u) = −x2(tf = 1).

(a) Formulate the performance index clearly in integral form.
(b) Define the Hamiltonian H using Pontryagin's Minimum Principle.
(c) Derive the necessary conditions for optimality, including the costate equations.
(d) Determine the optimal control law u*(t) in terms of the costate variables.
(e) Repeat the problem if the control is constrained by u ∈ [0, 0.1].

Solution:
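For part (a), one way to write the performance index in integral form, using x2(0) = 0 and the second state equation:

\[ J(u) = -x_2(1) = -\int_0^1 \dot{x}_2\,dt = -\int_0^1 \bigl(x_1(t) + u(t)\bigr)\,dt , \]

to be minimized over the admissible controls.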
We define the Hamiltonian
H = λ1(−u^3) + λ2(x1 + u)
Costate Equations:
dλ1/dt = −∂H/∂x1 = −λ2
dλ2/dt = −∂H/∂x2 = 0 → λ2(t) = constant = −1 (from the terminal cost φ = −x2(1))
Integrating dλ1/dt = −λ2 = 1: λ1(t) = ∫ 1 dt = t + C
Using the terminal condition λ1(1) = ∂φ/∂x1 = 0: 1 + C = 0 → C = −1
Therefore: λ1(t) = t − 1, so −λ1(t) = 1 − t
Minimize H with respect to u. Substituting λ2 = −1 and λ1 = t − 1: H = −(x1 + u) − (t − 1)u^3
Taking the derivative w.r.t. u and setting it to zero: dH/du = −1 − 3(t − 1)u^2 = 0
Solving gives:
u*(t) = 1/√(−3λ1(t)) = 1/√(3(1 − t))
(the second derivative d²H/du² = 6(1 − t)u > 0 confirms this stationary point minimizes H over u ≥ 0). This unconstrained optimum exceeds 0.1 for every t ∈ [0, 1).

Since u*(t) ≥ 1/√3 ≈ 0.58 for all t ∈ [0, 1), the unconstrained optimum never lies in [0, 0.1]; on that interval dH/du = −1 + 3(1 − t)u^2 < 0, so H is minimized at the upper bound and the constraint is active:
ub*(t) = 0.1 for all t ∈ [0, 1]
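A minimal numerical sanity check (our addition, not part of the original solution): using x1(0) = x2(0) = 0 and swapping the order of integration, the terminal value can be written as x2(1) = ∫0^1 [u(s) − (1 − s) u(s)^3] ds, so different admissible controls can be compared directly. The Python sketch below assumes SciPy is available; the helper name terminal_x2 is ours.

```python
# Sanity check of cases (a) and (b): evaluate x2(1) = ∫_0^1 [u(s) - (1 - s) u(s)^3] ds,
# which follows from x2(1) = ∫_0^1 (x1 + u) dt with x1(t) = -∫_0^t u(s)^3 ds and Fubini.
import numpy as np
from scipy.integrate import quad

def terminal_x2(u):
    """x2(1) for a given control law u(s) on [0, 1]."""
    value, _ = quad(lambda s: u(s) - (1.0 - s) * u(s) ** 3, 0.0, 1.0)
    return value

u_a = lambda s: 1.0 / np.sqrt(3.0 * (1.0 - s))   # case (a): unconstrained optimum
u_b = lambda s: 0.1                              # case (b): upper bound of [0, 0.1]

print("case (a):", terminal_x2(u_a))   # ≈ 0.770 (= 4/(3*sqrt(3)); integrable singularity at s = 1)
print("case (b):", terminal_x2(u_b))   # ≈ 0.0995
```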
Q. Consider the control system
ẍ − 2(…)      (1)
a. Write the system in first-order state-space form.
b. Suppose u(t) = 0. Find and classify (using linearization) all equilibria and determine whether they are stable or asymptotically stable, if possible. Discuss whether the stability results are global or local.
c. Show that Eq. (1) admits the periodic solution x(t) = cos(t), u(t) = cos(2t).
d. Design a state-feedback controller u = u(x, ẋ) for (1) such that the origin of the closed-loop system is globally asymptotically stable.
Q. The Euler equations for a rotating rigid spacecraft are given by
J1 ω̇1 = (J2 − J3) ω2 ω3 + u1
J2 ω̇2 = (J3 − J1) ω3 ω1 + u2
J3 ω̇3 = (J1 − J2) ω1 ω2 + u3
where ωi are the components of the angular velocity vector ω along the principal axes, Ji are the corresponding moments of inertia, and ui are the control torques applied along the principal axes.
a. Assume u1 = u2 = u3 = 0 and show that the origin ω1 = ω2 = ω3 = 0 is stable. Is it asymptotically stable? Hint: use a quadratic Lyapunov function candidate of the form V(ω) = J1 ω1^2 + J2 ω2^2 + J3 ω3^2.
b. Suppose that torque feedback is applied according to ui = −ki ωi, where all ki > 0. Show that the origin is globally asymptotically stable.
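As an illustration of the Lyapunov argument in parts (a) and (b) (our addition, not required by the question), the sketch below simulates the Euler equations under the feedback ui = −ki ωi and evaluates V(ω) = J1 ω1^2 + J2 ω2^2 + J3 ω3^2 along the trajectory. The inertia values, gains, and initial condition are illustrative assumptions only.

```python
# Illustrative closed-loop simulation (assumed parameter values, not from the problem).
# Along trajectories, dV/dt = 2 (ω1 u1 + ω2 u2 + ω3 u3) because the gyroscopic terms
# cancel in the sum; with u_i = -k_i ω_i this gives dV/dt = -2 Σ k_i ω_i^2 ≤ 0.
import numpy as np
from scipy.integrate import solve_ivp

J = np.array([1.0, 2.0, 3.0])   # assumed principal moments of inertia
k = np.array([1.0, 1.0, 1.0])   # assumed feedback gains k_i > 0

def euler_closed_loop(t, w):
    u = -k * w                                   # torque feedback u_i = -k_i ω_i
    w1, w2, w3 = w
    return [((J[1] - J[2]) * w2 * w3 + u[0]) / J[0],
            ((J[2] - J[0]) * w3 * w1 + u[1]) / J[1],
            ((J[0] - J[1]) * w1 * w2 + u[2]) / J[2]]

sol = solve_ivp(euler_closed_loop, (0.0, 20.0), [0.5, -0.3, 0.8],
                rtol=1e-9, atol=1e-12, max_step=0.05)
V = (J[:, None] * sol.y ** 2).sum(axis=0)        # Lyapunov candidate along the trajectory
print("V(0) =", V[0], " V(T) =", V[-1],
      " largest increase between samples:", np.diff(V).max())
print("|ω(T)| =", np.linalg.norm(sol.y[:, -1]))
```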
Q. Neglecting air resistance and the curvature of the earth, the launching of a satellite is described by the following equations:
ẋ1 = x3
ẋ2 = x4
ẋ3 = (F/m) cos u
ẋ4 = (F/m) sin u − g
Here x1 is the horizontal and x2 the vertical coordinate, and x3 and x4 are the corresponding velocities. The signal u is the controlled angle. The criterion is to maximize 0.1x1 + x2 + 5x3 + 3x4 at the end point. Show that the optimal control signal has the form
tan u*(t) = (At + B)/(Ct + D)
and determine A, B, C, D.
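A hedged sketch of where the claimed form comes from (our outline, assuming a fixed final time tf and maximization of the stated terminal criterion; not part of the original document): with H = λ1 x3 + λ2 x4 + λ3 (F/m) cos u + λ4 ((F/m) sin u − g), the costate equations give λ̇1 = λ̇2 = 0 and λ̇3 = −λ1, λ̇4 = −λ2, so λ3 and λ4 are affine in t; maximizing H over u aligns (cos u, sin u) with (λ3, λ4), i.e. tan u* = λ4/λ3. The SymPy sketch below only mechanizes this bookkeeping, with terminal values λ(tf) = (0.1, 1, 5, 3) read off from the criterion.

```python
# Sketch of the costate bookkeeping (our outline, not from the document).
import sympy as sp

t, tf = sp.symbols('t t_f', positive=True)

# λ1, λ2 are constant; λ3' = -λ1 and λ4' = -λ2 integrate backwards from
# λ3(tf) = 5, λ4(tf) = 3 (gradient of 0.1*x1 + x2 + 5*x3 + 3*x4).
lam1, lam2 = sp.Rational(1, 10), sp.Integer(1)
lam3 = 5 + lam1 * (tf - t)
lam4 = 3 + lam2 * (tf - t)

tan_u_star = sp.simplify(lam4 / lam3)   # ratio of two expressions affine in t
print(tan_u_star)                       # (A t + B)/(C t + D), coefficients depending on tf
```

With this sign convention one reads off, up to a common scaling of numerator and denominator, A = −1, B = 3 + tf, C = −0.1, D = 5 + 0.1 tf.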
