
Autumn 2019 Math 104 - HW 7 Solutions

(If you find any errors, please let your instructor know ASAP)

Problem 1 (Chapter 9, Exercise 2). Suppose A is skew-Hermitian, i.e., A^H = −A. Prove that all its eigenvalues are pure imaginary.

Solution. Let B = iA; then B^H = −iA^H = −i(−A) = iA = B, and therefore B is Hermitian. Since B has all real eigenvalues {λ_1, ..., λ_n}, the eigenvalues of A = −iB are {−iλ_1, ..., −iλ_n}, and thus all pure imaginary.
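As a quick numerical sanity check (a sketch assuming NumPy is available; the matrix here is a random example, not from the problem set), we can build a skew-Hermitian matrix, verify that B = iA is Hermitian, and confirm that the eigenvalues of A have zero real part:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random skew-Hermitian matrix: (M - M^H)^H = -(M - M^H).
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M - M.conj().T

# B = iA is Hermitian ...
B = 1j * A
assert np.allclose(B, B.conj().T)

# ... so the eigenvalues of A = -iB are purely imaginary.
eigvals = np.linalg.eigvals(A)
assert np.allclose(eigvals.real, 0.0, atol=1e-10)
```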

Problem 2 (Chapter 9, Exercise 3). If A is an n × n Hermitian matrix with eigenvalue λ and corresponding right eigenvector x, show that x is also a left eigenvector for λ. Prove the same result if A is skew-Hermitian.

Solution. Let x be a right eigenvector of A with eigenvalue λ; then by definition

Ax = λx.

Taking the Hermitian transpose of both sides,

x^H A^H = λ̄ x^H.

• If A is Hermitian, then A^H = A and all eigenvalues are real, so λ̄ = λ and

x^H A = λ x^H,

i.e. x is also a left eigenvector of A with the same eigenvalue λ.

• If A is skew-Hermitian, then A^H = −A and, by Problem 1, all eigenvalues are purely imaginary, so λ̄ = −λ and

−x^H A = −λ x^H, i.e. x^H A = λ x^H,

so again x is a left eigenvector of A with the same eigenvalue λ.
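The Hermitian case can be checked numerically (a sketch assuming NumPy; the matrix is a random illustrative example). A right eigenvector x of a Hermitian A should satisfy x^H A = λ x^H as well:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random Hermitian matrix: A^H = A.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M + M.conj().T

# Pick one eigenpair (lam0, x) with A x = lam0 x.
lam, V = np.linalg.eigh(A)
lam0, x = lam[0], V[:, 0]

# Right eigenvector ...
assert np.allclose(A @ x, lam0 * x)
# ... is also a left eigenvector for the same eigenvalue: x^H A = lam0 x^H.
assert np.allclose(x.conj() @ A, lam0 * x.conj())
```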

Problem 3. Suppose A is positive semidefinite. Can you find a square root of this matrix? In other words, can you find a matrix B such that B^2 = A? If yes, explain how you would construct it. If no, explain why no such matrix exists.

Solution. Yes, we can. Notice that A can be written as A = V Σ V^⊤ where V is unitary and Σ is diagonal. The diagonal elements of Σ are the eigenvalues of A, and therefore are all non-negative. Consider B = V D V^⊤ where D is the diagonal matrix with diagonal elements D_ii = √(Σ_ii). Then, since V^⊤ V = I and D^2 = Σ, we have B^2 = V D V^⊤ V D V^⊤ = V D^2 V^⊤ = V Σ V^⊤ = A.
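This construction translates directly into a few lines of NumPy (a sketch; the positive semidefinite A below is generated as G G^⊤ purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Build a random positive semidefinite matrix A = G G^T.
G = rng.standard_normal((4, 4))
A = G @ G.T

# Spectral decomposition A = V Sigma V^T (eigh: symmetric input).
sigma, V = np.linalg.eigh(A)
sigma = np.clip(sigma, 0.0, None)   # guard against tiny negative round-off

# B = V D V^T with D_ii = sqrt(Sigma_ii).
B = V @ np.diag(np.sqrt(sigma)) @ V.T

# Check the construction: B^2 = A.
assert np.allclose(B @ B, A)
```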
Problem 4. Suppose A is positive semidefinite. Show that the maximum eigenvalue of A, denoted by λ_max, is given by the so-called Rayleigh quotient

sup_{w ≠ 0} (w^⊤ A w) / (w^⊤ w).

Solution. Let w_0 be an eigenvector for λ_max; then we have

(w_0^⊤ A w_0) / (w_0^⊤ w_0) = λ_max.

On the other hand, denote the spectral representation of A by

A = ∑_{i=1}^n λ_i v_i v_i^⊤,

where {v_1, ..., v_n} is an orthonormal basis of eigenvectors. Then for any w ≠ 0 we have

(w^⊤ A w) / (w^⊤ w) = (∑_{i=1}^n λ_i ⟨w, v_i⟩^2) / ‖w‖^2 ≤ (λ_max ∑_{i=1}^n ⟨w, v_i⟩^2) / ‖w‖^2 = λ_max,

where the last equality uses ∑_{i=1}^n ⟨w, v_i⟩^2 = ‖w‖^2. Therefore

sup_{w ≠ 0} (w^⊤ A w) / (w^⊤ w) = λ_max,

as desired.
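Both sides of the argument can be checked numerically (a sketch assuming NumPy; A is a random positive semidefinite example): no sampled vector exceeds λ_max, and an eigenvector of λ_max attains it.

```python
import numpy as np

rng = np.random.default_rng(3)

G = rng.standard_normal((5, 5))
A = G @ G.T                          # positive semidefinite
lam_max = np.linalg.eigvalsh(A)[-1]  # eigvalsh returns ascending order

def rayleigh(w):
    # Rayleigh quotient (w^T A w) / (w^T w).
    return (w @ A @ w) / (w @ w)

# Upper bound: every nonzero w has Rayleigh quotient at most lam_max.
samples = [rng.standard_normal(5) for _ in range(1000)]
assert all(rayleigh(w) <= lam_max + 1e-9 for w in samples)

# Attained: an eigenvector w0 for lam_max achieves the supremum.
w0 = np.linalg.eigh(A)[1][:, -1]
assert np.isclose(rayleigh(w0), lam_max)
```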
Problem 5. (a) Consider an n × n positive definite matrix A whose largest eigenvalue is strictly greater than the second largest. Consider the following algorithm: start with a random vector x^(0) ∈ R^n such that ‖x^(0)‖_2 = 1, and for t = 1, 2, 3, ... recursively define

x^(t) = A x^(t−1) / ‖A x^(t−1)‖_2.

What does this algorithm converge to?


Solution. We claim that the algorithm 'almost' converges to the eigenvector corresponding to the largest eigenvalue.

Consider the spectral representation A = ∑_{i=1}^n λ_i v_i v_i^⊤, where {v_1, ..., v_n} is an orthonormal basis of eigenvectors of A and λ_1 > λ_2 ≥ ... ≥ λ_n are the eigenvalues. Then we have

Ax = ∑_{i=1}^n λ_i ⟨x, v_i⟩ v_i.

Therefore, after applying A and normalizing k times, we have

x^(k) = (1/C_k) ∑_{i=1}^n λ_i^k ⟨x^(0), v_i⟩ v_i,

where C_k is the normalizing constant.

Suppose ⟨x^(0), v_1⟩ ≠ 0 (which means the initial vector is not orthogonal to v_1). We claim that

|λ_1^k ⟨x^(0), v_1⟩ / C_k| → 1.

Let c_ik = λ_i^k ⟨x^(0), v_i⟩; then C_k = √(∑_{i=1}^n c_ik^2). Notice that

|c_1k / c_ik| = |λ_1 / λ_i|^k · |⟨x^(0), v_1⟩ / ⟨x^(0), v_i⟩| → ∞

for all i > 1 with ⟨x^(0), v_i⟩ ≠ 0, since λ_1 is the unique largest eigenvalue.

Therefore, as k → ∞, if ⟨x^(0), v_1⟩ > 0 then x^(k) → v_1, and if ⟨x^(0), v_1⟩ < 0 then x^(k) → −v_1. In either case, the algorithm converges to the eigenvector corresponding to the largest eigenvalue.

(The edge case is ⟨x^(0), v_1⟩ = 0, which has probability 0 if we choose a direction uniformly at random. In that case, the algorithm converges to v_j or −v_j, where j is the smallest index such that x^(0) is not orthogonal to v_j.)
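The algorithm above is power iteration, and a short NumPy sketch confirms the claimed limit (the matrix below is a random positive definite example; with a generic random start, the edge case ⟨x^(0), v_1⟩ = 0 does not occur):

```python
import numpy as np

rng = np.random.default_rng(4)

# Random positive definite matrix; its largest eigenvalue is
# strictly larger than the second largest with probability 1.
G = rng.standard_normal((5, 5))
A = G @ G.T + np.eye(5)

# Power iteration: x^(t) = A x^(t-1) / ||A x^(t-1)||_2.
x = rng.standard_normal(5)
x /= np.linalg.norm(x)
for _ in range(1000):
    x = A @ x
    x /= np.linalg.norm(x)

# x converges (up to sign) to the top eigenvector v1, and its
# Rayleigh quotient converges to the largest eigenvalue.
lam, V = np.linalg.eigh(A)
v1 = V[:, -1]
assert abs(x @ v1) > 1 - 1e-6
assert np.isclose(x @ A @ x, lam[-1])
```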
