
An Introduction to Linear Matrix Inequalities

Raktim Bhattacharya
Aerospace Engineering, Texas A&M University

Linear Matrix Inequalities


What are they?

Inequalities involving matrix variables


Matrix variables appear linearly
Represent convex sets defined by polynomial inequalities

Critical tool in post-modern control theory


Standard Form

F(x) := F0 + x1 F1 + · · · + xn Fn > 0

where x := [x1 x2 · · · xn]^T, and each Fi ∈ S^m is an m × m symmetric matrix.

Think of F(x) : R^n → S^m.

Example:

[1  x;  x  1] > 0 ⇔ [1  0;  0  1] + x [0  1;  1  0] > 0.
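The following is a minimal sketch (not part of the slides) of checking feasibility of this example LMI numerically; it assumes Python with cvxpy and an SDP-capable solver such as SCS, and writes F(x) = F0 + x F1 in standard form.

import cvxpy as cp
import numpy as np

F0 = np.eye(2)                                   # constant term of F(x)
F1 = np.array([[0.0, 1.0], [1.0, 0.0]])          # coefficient of x

x = cp.Variable()
eps = 1e-6                                       # small margin: F(x) >= eps*I approximates F(x) > 0
prob = cp.Problem(cp.Minimize(0), [F0 + x * F1 >> eps * np.eye(2)])
prob.solve()
print(prob.status, x.value)                      # feasible; any x with |x| < 1 satisfies the LMI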


Positive Definiteness
The notation F > 0 means that F is a positive definite matrix:
F > 0 ⇐⇒ x^T F x > 0, ∀x ≠ 0
F > 0 ⇐⇒ all leading principal minors of F are positive
Let

F = [F11  F12  F13  · · · ;  F21  F22  F23  · · · ;  F31  F32  F33  · · · ;  · · ·  · · ·  · · ·  · · ·]

F > 0 ⇐⇒ F11 > 0,  det[F11  F12;  F21  F22] > 0,  det[F11  F12  F13;  F21  F22  F23;  F31  F32  F33] > 0,  · · ·
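As a sanity check (an illustration added here, assuming numpy is available), the leading-principal-minor test can be compared against an eigenvalue test on a small symmetric matrix:

import numpy as np

def leading_minors_positive(F):
    # F > 0 for symmetric F iff every leading principal minor det(F[:k,:k]) is positive
    return all(np.linalg.det(F[:k, :k]) > 0 for k in range(1, F.shape[0] + 1))

F = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])
print(leading_minors_positive(F))                # True
print(np.all(np.linalg.eigvalsh(F) > 0))         # True, consistent with the minor test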


Definiteness
Positive Semi-Definite
F ≥ 0 ⇐⇒ all principal minors (not just the leading ones) are ≥ 0

Negative Definite
F < 0 ⇐⇒ every odd leading principal minor is < 0 and every even leading principal minor is > 0 (the signs alternate, starting with < 0)

Negative Semi-Definite
F ≤ 0 ⇐⇒ every odd principal minor is ≤ 0 and every even principal minor is ≥ 0

F > 0 ⇐⇒ −F < 0
F ≥ 0 ⇐⇒ −F ≤ 0

Matrix Analysis, Roger Horn.


Example 1

y > 0, y − x² > 0 ⇐⇒ [y  x;  x  1] > 0

The LMI written as [y  x;  x  1] > 0 is in general form.
We can write it in standard form as

[0  0;  0  1] + y [1  0;  0  0] + x [0  1;  1  0] > 0

The general form saves notation and may lead to more efficient computation.


Example 2
 
x1² + x2² < 1 ⇐⇒ [1  0  x1;  0  1  x2;  x1  x2  1] > 0

The leading principal minors are

1 > 0

det[1  0;  0  1] = 1 > 0

det[1  0  x1;  0  1  x2;  x1  x2  1] = 1·det[1  x2;  x2  1] − 0 + x1·det[0  1;  x1  x2] > 0

The last inequality simplifies to

1 − (x1² + x2²) > 0
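A quick numerical spot check of this equivalence (an added illustration, assuming numpy) at one point inside and one point outside the unit disk:

import numpy as np

def lmi_matrix(x1, x2):
    return np.array([[1.0, 0.0, x1],
                     [0.0, 1.0, x2],
                     [x1, x2, 1.0]])

for x1, x2 in [(0.3, 0.4), (0.8, 0.7)]:          # inside / outside the unit disk
    inside = x1**2 + x2**2 < 1
    pos_def = np.all(np.linalg.eigvalsh(lmi_matrix(x1, x2)) > 0)
    print((x1, x2), inside, pos_def)             # the two booleans agree at both points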


Eigenvalue Minimization
Let Ai ∈ S^n, i = 0, 1, · · · , n.

Let A(x) := A0 + A1 x1 + · · · + An xn.

Find x := [x1 x2 · · · xn]^T that minimizes

J(x) := λmax(A(x)).

How to solve this problem?


Eigenvalue Minimization (contd.)


Recall that for M ∈ S^n,

λmax(M) ≤ t ⇐⇒ M − tI ≤ 0.

Linear algebra result: Matrix Analysis – R.Horn, C.R. Johnson

The optimization problem is therefore

min_{x,t} t   such that   A(x) − tI ≤ 0.
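A minimal sketch of this formulation in cvxpy (an added illustration; the data matrices A0, A1, A2 are arbitrary examples, and an SDP-capable solver is assumed):

import cvxpy as cp
import numpy as np

A0 = np.array([[2.0, 0.5], [0.5, 1.0]])
A1 = np.array([[1.0, 0.0], [0.0, -1.0]])
A2 = np.array([[0.0, 1.0], [1.0, 0.0]])

x = cp.Variable(2)
t = cp.Variable()
A = A0 + x[0] * A1 + x[1] * A2                   # A(x), affine in x
prob = cp.Problem(cp.Minimize(t), [A - t * np.eye(2) << 0])
prob.solve()
print(prob.value, np.max(np.linalg.eigvalsh(A.value)))   # minimized largest eigenvalue; the two values agree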


Matrix Norm Minimization


Let Ai ∈ Rn , i = 0, 1, · · · , n.

Let A(x) := A0 + A1 x1 + · · · + An xn .

Find x := [x1 x2 · · · xn]^T that minimizes

J(x) := ∥A(x)∥₂.

How to solve this problem?


Matrix Norm Minimization


contd.

Recall

∥A∥₂ := √(λmax(A^T A)).

This implies

min_{t,x} t²   such that   A(x)^T A(x) − t² I ≤ 0,

or, using the Schur complement, the optimization problem is

min_{t,x} t²   subject to   [tI  A(x);  A(x)^T  tI] ≥ 0.
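A minimal sketch of the LMI form in cvxpy (an added illustration; A0, A1 are arbitrary example matrices, and an SDP-capable solver is assumed). Since the LMI forces t ≥ 0, minimizing t is equivalent to minimizing t² here.

import cvxpy as cp
import numpy as np

A0 = np.array([[1.0, 2.0], [0.0, 1.0]])
A1 = np.array([[0.0, 1.0], [1.0, 0.0]])

x = cp.Variable()
t = cp.Variable()
A = A0 + x * A1                                  # A(x), affine in x
M = cp.bmat([[t * np.eye(2), A], [A.T, t * np.eye(2)]])
prob = cp.Problem(cp.Minimize(t), [M >> 0])
prob.solve()
print(t.value, np.linalg.norm(A.value, 2))       # t equals the spectral norm of A(x) at the optimum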



Important Inequalities

Generalized Square Inequalities


Lemma: For arbitrary scalars x, y and δ > 0, we have

(√δ x − y/√δ)² = δx² + (1/δ)y² − 2xy ≥ 0.

This implies

2xy ≤ δx² + (1/δ)y².


Generalized Square Inequalities


Restriction-Free Inequalities

Lemma: Let X, Y ∈ R^{m×n}, F ∈ S^m, F > 0, and δ > 0 be a scalar. Then

X^T F Y + Y^T F X ≤ δ X^T F X + δ⁻¹ Y^T F Y.

When X = x and Y = y,

2 x^T F y ≤ δ x^T F x + δ⁻¹ y^T F y.

Proof: Using completion of squares.


(√δ X − √(δ⁻¹) Y)^T F (√δ X − √(δ⁻¹) Y) ≥ 0.
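A quick numerical check of this inequality on random data (an added illustration, assuming numpy): the gap matrix δX^T F X + δ⁻¹Y^T F Y − X^T F Y − Y^T F X should be positive semidefinite.

import numpy as np

rng = np.random.default_rng(0)
m, n, delta = 4, 3, 0.7
X = rng.standard_normal((m, n))
Y = rng.standard_normal((m, n))
G = rng.standard_normal((m, m))
F = G @ G.T + np.eye(m)                          # a random positive definite F

gap = delta * X.T @ F @ X + (1 / delta) * Y.T @ F @ Y - X.T @ F @ Y - Y.T @ F @ X
print(np.min(np.linalg.eigvalsh(gap)) >= -1e-9)  # True: the gap matrix is PSD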


Generalized Square Inequalities


Inequalities with Restrictions

Let

F = {F | F ∈ R^{n×n}, F^T F ≤ I}.

Lemma: Let X ∈ R^{m×n}, Y ∈ R^{n×m}. Then for arbitrary δ > 0,

X F Y + Y^T F^T X^T ≤ δ X X^T + δ⁻¹ Y^T Y, ∀F ∈ F.

Proof: Approach 1: Using completion of squares.


Start with
(√δ X^T − √(δ⁻¹) F Y)^T (√δ X^T − √(δ⁻¹) F Y) ≥ 0.


Schur Complements
Very useful for identifying convex sets

Let

Q(x) ∈ S^{m1}, R(x) ∈ S^{m2},

where Q(x), R(x), S(x) are affine functions of x. Then

[Q(x)  S(x);  S^T(x)  R(x)] > 0 ⇐⇒ Q(x) > 0 and R(x) − S^T(x) Q⁻¹(x) S(x) > 0.

Generalizing,

[Q(x)  S(x);  S^T(x)  R(x)] ≥ 0 ⇐⇒ Q(x) ≥ 0,  S^T(x)(I − Q(x)Q†(x)) = 0,  and R(x) − S^T(x) Q†(x) S(x) ≥ 0

Q(x)† is the pseudo-inverse


This generalization is used when Q(x) is positive semidefinite
but singular

Schur Complement Lemma


Let

A := [A11  A12;  A21  A22].

Define

Sch(A11) := A22 − A21 A11⁻¹ A12,
Sch(A22) := A11 − A12 A22⁻¹ A21.

For symmetric A,

A > 0 ⇐⇒ A11 > 0, Sch (A11 ) > 0 ⇐⇒ A22 > 0, Sch (A22 ) > 0
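A quick numerical check of the lemma on a random symmetric positive definite matrix (an added illustration, assuming numpy):

import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = M @ M.T + 0.1 * np.eye(4)                    # random symmetric positive definite A
A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]

sch_A11 = A22 - A21 @ np.linalg.inv(A11) @ A12   # Sch(A11)
pd = lambda X: np.all(np.linalg.eigvalsh(X) > 0)
print(pd(A), pd(A11) and pd(sch_A11))            # both True, as the lemma predicts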


Example 1
x1² + x2² < 1 ⇐⇒ 1 − x^T x > 0 ⇐⇒ [I  x;  x^T  1] > 0

Here
R(x) = 1,
Q(x) = I > 0.


Example 2
∥x∥P < 1 ⇐⇒ 1 − x^T P x > 0 ⇐⇒ [P⁻¹  x;  x^T  1] > 0

or

1 − x^T P x = 1 − (√P x)^T (√P x) > 0 ⇐⇒ [I  √P x;  (√P x)^T  1] > 0

where √P is the matrix square root of P.


LMIs are not unique


If F is positive definite, then any congruence transformation of F is also positive definite:

F > 0 ⇐⇒ x^T F x > 0, ∀x ≠ 0
      ⇐⇒ y^T M^T F M y > 0, ∀y ≠ 0 and nonsingular M
      ⇐⇒ M^T F M > 0

This implies that rearranging the matrix blocks does not change the feasible set:

[Q  S;  S^T  R] > 0 ⇐⇒ [0  I;  I  0][Q  S;  S^T  R][0  I;  I  0] > 0 ⇐⇒ [R  S^T;  S  Q] > 0
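A numerical illustration of both facts (added here, assuming numpy): a congruence transform M^T F M of a positive definite F stays positive definite, and swapping the blocks of [Q S; S^T R] with a permutation matrix preserves definiteness.

import numpy as np

rng = np.random.default_rng(2)
n = 2
G = rng.standard_normal((2 * n, 2 * n))
F = G @ G.T + np.eye(2 * n)                      # random positive definite, viewed as [Q S; S^T R]
M = rng.standard_normal((2 * n, 2 * n))          # nonsingular with probability 1

P = np.block([[np.zeros((n, n)), np.eye(n)],     # permutation that swaps the two block rows/columns
              [np.eye(n), np.zeros((n, n))]])
pd = lambda X: np.all(np.linalg.eigvalsh((X + X.T) / 2) > 0)
print(pd(F), pd(M.T @ F @ M), pd(P.T @ F @ P))   # all True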


Variable Elimination Lemma


Lemma: For arbitrary nonzero vectors x, y ∈ R^n, there holds

max_{F ∈ F: F^T F ≤ I} (x^T F y)² = (x^T x)(y^T y).

Proof: From the Schwarz inequality,

|x^T F y| ≤ √(x^T x) √(y^T F^T F y) ≤ √(x^T x) √(y^T y).

Therefore, for arbitrary x, y we have

(x^T F y)² ≤ (x^T x)(y^T y).

Next show equality.


Variable Elimination Lemma


contd.

Let

F0 = x y^T / (√(x^T x) √(y^T y)).

Therefore,

F0^T F0 = y x^T x y^T / ((x^T x)(y^T y)) = y y^T / (y^T y).

We can show that

σmax(F0^T F0) = σmax(F0 F0^T) = 1,

=⇒ F0^T F0 ≤ I, thus F0 ∈ F.


Variable Elimination Lemma


contd.

Therefore,

(x^T F0 y)² = (x^T (x y^T / (√(x^T x) √(y^T y))) y)² = (x^T x)(y^T y).
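A quick numerical check that this F0 is admissible and attains the bound (an added illustration, assuming numpy):

import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(5)
y = rng.standard_normal(5)

F0 = np.outer(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
print(np.max(np.linalg.eigvalsh(F0.T @ F0)) <= 1 + 1e-12)       # F0^T F0 <= I, so F0 is in F
print(np.isclose((x @ F0 @ y) ** 2, (x @ x) * (y @ y)))         # equality (x^T F0 y)^2 = (x^T x)(y^T y)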


Variable Elimination Lemma


contd.

Lemma: Let X ∈ R^{m×n}, Y ∈ R^{n×m}, and Q ∈ R^{m×m}. Then

Q + X F Y + Y^T F^T X^T < 0, ∀F ∈ F,

iff ∃ δ > 0 such that

Q + δ X X^T + (1/δ) Y^T Y < 0.

Proof: Sufficiency.

Q + X F Y + Y^T F^T X^T ≤ Q + δ X X^T + (1/δ) Y^T Y   (from the previous lemma)
                        < 0.


Variable Elimination Lemma


contd.

Proof: Necessity.
Suppose

Q + X F Y + Y^T F^T X^T < 0, ∀F ∈ F

is true. Then for arbitrary nonzero x,

x^T (Q + X F Y + Y^T F^T X^T) x < 0,

or

x^T Q x + 2 x^T X F Y x < 0.

Using the previous lemma result,

max_{F ∈ F} (x^T X F Y x)² = (x^T X X^T x)(x^T Y^T Y x),

=⇒ x^T Q x + 2 √((x^T X X^T x)(x^T Y^T Y x)) < 0.


Variable Elimination Lemma


contd.


x^T Q x + 2 √((x^T X X^T x)(x^T Y^T Y x)) < 0

=⇒ x^T Q x − 2 √((x^T X X^T x)(x^T Y^T Y x)) < 0, and x^T Q x < 0.

Therefore, writing b := x^T Q x, a := x^T X X^T x, c := x^T Y^T Y x,

(x^T Q x)² − 4 (x^T X X^T x)(x^T Y^T Y x) > 0,

or

b² − 4ac > 0.


Variable Elimination Lemma


contd.

So the quadratic equation

a δ² + b δ + c = 0

has real roots

(−b ± √(b² − 4ac)) / (2a).

Recall

a := x^T X X^T x > 0,   b := x^T Q x < 0,   c := x^T Y^T Y x > 0.

This implies

−b/(2a) > 0,

so there is at least one positive root.

Variable Elimination Lemma


contd.

Therefore, ∃ δ > 0 such that

a δ² + b δ + c < 0.

Dividing by δ we get

a δ + b + c/δ < 0,

or

x^T Q x + δ x^T X X^T x + (1/δ) x^T Y^T Y x < 0,

or

x^T (Q + δ X X^T + (1/δ) Y^T Y) x < 0,

or, since x is an arbitrary nonzero vector,

Q + δ X X^T + (1/δ) Y^T Y < 0.

Elimination of Variables
In a Partitioned Matrix

Lemma: Let

Z = [Z11  Z12;  Z12^T  Z22],  Z11 ∈ R^{n×n},

be symmetric. Then ∃ X = X^T such that

[Z11 − X   Z12   X;   Z12^T   Z22   0;   X   0   −X] < 0 ⇐⇒ Z < 0.

Proof: Apply the Schur complement lemma.

[Z11 − X   Z12   X;   Z12^T   Z22   0;   X   0   −X] < 0 ⇐⇒ −X < 0, Sch(−X) < 0.


Elimination of Variables
In a Partitioned Matrix (contd.)

0 > Sch(−X)
  = [Z11 − X   Z12;   Z12^T   Z22] − [X; 0] (−X)⁻¹ [X   0]
  = [Z11 − X   Z12;   Z12^T   Z22] + [X   0;   0   0]
  = [Z11   Z12;   Z12^T   Z22].


Elimination of Variables
In a Partitioned Matrix (contd.)

Lemma:

[Z11   Z12   Z13;   Z12^T   Z22   Z23 + X^T;   Z13^T   Z23^T + X   Z33] < 0
⇐⇒ [Z11   Z12;   Z12^T   Z22] < 0 and [Z11   Z13;   Z13^T   Z33] < 0,

with

X = Z13^T Z11⁻¹ Z12 − Z23^T.
Proof: Necessity ⇒ Apply rules for negative definiteness.

Sufficiency ⇐: The following are true from the Schur complement lemma:

Z11 < 0,   Z22 − Z12^T Z11⁻¹ Z12 < 0,   Z33 − Z13^T Z11⁻¹ Z13 < 0.


Elimination of Variables
In a Partitioned Matrix (contd.)

Look at the Schur complement of

[Z11   Z12   Z13;   Z12^T   Z22   Z23 + X^T;   Z13^T   Z23^T + X   Z33]:

[Z22   Z23 + X^T;   Z23^T + X   Z33] − [Z12^T; Z13^T] Z11⁻¹ [Z12   Z13]
= [Z22 − Z12^T Z11⁻¹ Z12,   Z23 + X^T − Z12^T Z11⁻¹ Z13;
   Z23^T + X − Z13^T Z11⁻¹ Z12,   Z33 − Z13^T Z11⁻¹ Z13]
< 0.

Also Z11 < 0.



Elimination of Variables
Projection Lemma

Definition: Let A ∈ R^{m×n}. Then Ma is a left orthogonal complement of A if it satisfies

Ma A = 0,   rank(Ma) = m − rank(A).

Definition: Let A ∈ R^{m×n}. Then Na is a right orthogonal complement of A if it satisfies

A Na = 0,   rank(Na) = n − rank(A).
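A right orthogonal complement can be computed numerically from the null space of A; the sketch below (an added illustration) assumes scipy is available.

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])                  # 2 x 3, rank 2
Na = null_space(A)                               # columns span the right null space of A
print(Na.shape[1] == A.shape[1] - np.linalg.matrix_rank(A))     # rank(Na) = n - rank(A)
print(np.allclose(A @ Na, 0))                                   # A Na = 0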


Elimination of Variables
Projection Lemma (contd.)

Lemma: Let P, Q, and H = H^T be matrices of appropriate dimensions. Let Np, Nq be right orthogonal complements of P, Q respectively.

Then ∃ X such that

H + P^T X^T Q + Q^T X P < 0 ⇐⇒ Np^T H Np < 0 and Nq^T H Nq < 0.

Proof:
Necessity ⇒: Multiply on the left by Np^T and on the right by Np (similarly with Nq).
Sufficiency ⇐: A little more involved; use bases of the kernels of P and Q, followed by the Schur complement lemma.


Elimination of Variables
Reciprocal Projection Lemma

Lemma: Let P be any given positive definite matrix. The following statements are equivalent:
1. Ψ + S + S^T < 0.
2. The LMI problem

[Ψ + P − (W + W^T)   S^T + W^T;   S + W   −P] < 0

is feasible with respect to W.


Proof: Apply the projection lemma with respect to the general variable W. Let

X = [Ψ + P   S^T;   S   −P],   Y = [−In   0],   Z = [In   −In].


Elimination of Variables
Reciprocal Projection Lemma (contd.)

Let

X = [Ψ + P   S^T;   S   −P],   Y = [−In   0],   Z = [In   −In].

The right orthogonal complements of Y, Z are

Ny = [0; −P⁻¹],   Nz = [In; In].

Verify that Y Ny = 0 and Z Nz = 0.
We can show that

Ny^T X Ny = −P⁻¹,   Nz^T X Nz = Ψ + S^T + S.

Apply projection lemma.



Elimination of Variables
Reciprocal Projection Lemma (contd.)

Ny^T X Ny = −P⁻¹,   Nz^T X Nz = Ψ + S^T + S.

The expression

X + Y^T W^T Z + Z^T W Y = [Ψ + P − (W + W^T)   S^T + W^T;   S + W   −P].

Therefore, if

Ny^T X Ny < 0 and Nz^T X Nz < 0,

then, by the projection lemma, there exists W such that

[Ψ + P − (W + W^T)   S^T + W^T;   S + W   −P] < 0.


Trace of Matrices in LMIs


Lemma: Let A(x) ∈ S^m be a matrix function of x ∈ R^n, and let γ ∈ R, γ > 0.
The following statements are equivalent:
1. ∃ x ∈ R^n such that tr A(x) < γ.
2. ∃ x ∈ R^n, Z ∈ S^m such that A(x) < Z, tr Z < γ.

Proof: Homework problem.
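A minimal sketch of how the equivalence is used in practice (an added illustration; the data and the bound |x| ≤ 1 are arbitrary, and cvxpy with an SDP-capable solver is assumed): minimize tr A(x) by introducing Z with A(x) ≤ Z and minimizing tr Z.

import cvxpy as cp
import numpy as np

A0 = np.array([[3.0, 1.0], [1.0, 2.0]])
A1 = np.array([[1.0, 0.5], [0.5, 1.0]])

x = cp.Variable()
Z = cp.Variable((2, 2), symmetric=True)
A = A0 + x * A1                                  # A(x), affine in x
prob = cp.Problem(cp.Minimize(cp.trace(Z)), [A << Z, cp.abs(x) <= 1])
prob.solve()
print(prob.value, np.trace(A.value))             # the two traces agree at the optimum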
