
Lecture #8

PROJECTION,
GRAM-SCHMIDT
ORTHOGONALIZATION
Askarbekkyzy Aknur
Ph.D. Candidate, Senior Lecturer
Dot product spaces
Definition 1. Let V be a real vector space. Suppose to each pair
of vectors u, v ∈ V there is assigned a real number, denoted by
⟨u, v⟩. This function is called a (real) dot product on V if it
satisfies the following axioms:
1. [Linear:]
⟨a·u₁ + b·u₂, v⟩ = a·⟨u₁, v⟩ + b·⟨u₂, v⟩;
2. [Symmetric:]
⟨u, v⟩ = ⟨v, u⟩;
3. [Positive definite:]
⟨u, u⟩ ≥ 0, and ⟨u, u⟩ = 0 if and only if u = 0.
Examples of dot product spaces
Example 1. Euclidean n-Space Rⁿ:
The dot product in Rⁿ is defined by
⟨u, v⟩ = u₁v₁ + u₂v₂ + ⋯ + uₙvₙ
Let u = (1, −3, 5), v = (5, −6, 1). Then
⟨u, v⟩ = 1·5 + (−3)·(−6) + 5·1 = 5 + 18 + 5 = 28
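The component formula above translates directly into code. A minimal sketch in plain Python (the helper name `dot` is ours, not from the lecture):

```python
def dot(u, v):
    """Euclidean dot product <u, v> = u1*v1 + u2*v2 + ... + un*vn."""
    return sum(ui * vi for ui, vi in zip(u, v))

u = [1, -3, 5]
v = [5, -6, 1]
print(dot(u, v))  # 28, as in Example 1
```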
Example 2. Function Space C[a, b] and Polynomial Space P(t):
C[a, b] is the vector space of all continuous functions on
the closed interval [a, b]. The dot product is defined by
⟨f, g⟩ = ∫ₐᵇ f(t)g(t) dt
The vector space P(t) of all polynomials is a subspace of
C[a, b] for any interval [a, b], and hence, the above is also
a dot product on P(t).
Example 2 (cont.). Suppose C[0, 1] is the vector space
of all continuous functions on the closed interval [0, 1].
The dot product is defined by
⟨f, g⟩ = ∫₀¹ f(t)g(t) dt
Let f(t) = t and g(t) = t − 1 in the space C[0, 1]. Then
⟨f, g⟩ = ∫₀¹ t(t − 1) dt = [t³/3 − t²/2]₀¹ = 1/3 − 1/2 = −1/6
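The integral dot product can also be checked numerically. A sketch using a midpoint-rule quadrature (the function name `inner` and the step count are our choices, not part of the lecture):

```python
def inner(f, g, a, b, n=10_000):
    """Approximate <f, g> = integral of f(t)*g(t) over [a, b] (midpoint rule)."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h) for i in range(n)) * h

# <t, t - 1> on C[0, 1] should be close to -1/6
val = inner(lambda t: t, lambda t: t - 1.0, 0.0, 1.0)
print(abs(val - (-1 / 6)) < 1e-8)  # True
```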
Example 3. Matrix Space M = Mₘₓₙ:
A dot product is defined on M by
⟨A, B⟩ = tr(Bᵀ·A)
where tr(A), the trace of A, is the sum of its diagonal
elements. Equivalently,
⟨A, B⟩ = tr(Bᵀ·A) = Σᵢⱼ aᵢⱼbᵢⱼ (sum over i = 1..m, j = 1..n)
and
∥A∥² = ⟨A, A⟩ = Σᵢⱼ aᵢⱼ²
Example 3 (cont.).
Let A = [[1, 2], [3, 4]] and B = [[5, 6], [7, 8]]. Then
⟨A, B⟩ = tr(Bᵀ·A) = tr([[5, 7], [6, 8]]·[[1, 2], [3, 4]]) =
= tr([[26, 38], [30, 44]]) = 26 + 44 = 70
(equivalently, Σᵢⱼ aᵢⱼbᵢⱼ = 1·5 + 2·6 + 3·7 + 4·8 = 70).
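Since tr(Bᵀ·A) equals the sum of entrywise products, this dot product needs no explicit matrix multiplication. A sketch (helper name `mat_inner` is ours):

```python
def mat_inner(A, B):
    """<A, B> = tr(B^T A) = sum of a_ij * b_ij over all entries."""
    return sum(a * b
               for row_a, row_b in zip(A, B)
               for a, b in zip(row_a, row_b))

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_inner(A, B))  # 70
print(mat_inner(A, A))  # 30 = ||A||^2 = 1 + 4 + 9 + 16
```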
Norm of a vector
Definition 2. Let V be a dot product space and u ∈ V. The
non-negative number √⟨u, u⟩ is called the norm or length of
u. We use the notation
∥u∥ = √⟨u, u⟩
Norm of a vector
Definition 3. If ∥u∥ = 1 or, equivalently, if ⟨u, u⟩ = 1,
then u is called a unit vector and it is said to be
normalized.
Every nonzero vector v in V can be multiplied by the
reciprocal of its length to obtain the unit vector
v̂ = (1/∥v∥)·v
Example 4. Suppose u = (1/6, 5/6, −1/6, 1/2). Then
⟨u, u⟩ = (1/6)·(1/6) + (5/6)·(5/6) + (−1/6)·(−1/6) + (1/2)·(1/2) =
= 1/36 + 25/36 + 1/36 + 1/4 = 1.
Hence, u is a unit vector.
Example 5. Suppose v = (1, −3, 5). Then
⟨v, v⟩ = 1·1 + (−3)·(−3) + 5·5 = 35, so ∥v∥ = √35 ≠ 1.
Hence, v is not a unit vector. By normalizing v one can
obtain a unit vector from v:
v̂ = v/∥v∥ = (1/√35)·(1, −3, 5) = (1/√35, −3/√35, 5/√35)
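Norms and normalization from Definitions 2 and 3, sketched in plain Python (helper names ours):

```python
import math

def norm(v):
    """||v|| = sqrt(<v, v>)."""
    return math.sqrt(sum(x * x for x in v))

def normalize(v):
    """Scale a nonzero vector by the reciprocal of its length (Definition 3)."""
    n = norm(v)
    return [x / n for x in v]

v = [1, -3, 5]
print(norm(v) == math.sqrt(35))               # True: v is not a unit vector
print(abs(norm(normalize(v)) - 1.0) < 1e-12)  # True: v-hat is
```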
Cauchy-Schwarz Inequality
Theorem 1. For any vectors u and v in a dot product
space V,
⟨u, v⟩² ≤ ⟨u, u⟩·⟨v, v⟩
Theorem 2. Let V be a dot product space. Then the norm
in V satisfies the following properties:
1. ∥v∥ ≥ 0; and ∥v∥ = 0 if and only if v = 0;
2. ∥k·v∥ = |k|·∥v∥;
3. ∥u + v∥ ≤ ∥u∥ + ∥v∥.
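A quick numeric sanity check of the Cauchy-Schwarz inequality for the vectors of Example 1 (plain Python, helper name ours):

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

u = [1, -3, 5]
v = [5, -6, 1]
lhs = dot(u, v) ** 2         # <u, v>^2 = 28^2 = 784
rhs = dot(u, u) * dot(v, v)  # <u, u> * <v, v> = 35 * 62 = 2170
print(lhs <= rhs)  # True
```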
Angle between vectors
Definition 4. For any nonzero vectors u and v in a dot
product space V, the angle between u and v is defined to
be the angle θ such that 0 ≤ θ ≤ π and
cos θ = ⟨u, v⟩ / (∥u∥·∥v∥)
Angle between vectors
Example 6. Find the angle between the vectors f(t) = t and g(t) = t − 1 in
the space C[0, 1].
Solution: From Example 2, ⟨f(t), g(t)⟩ = −1/6.
∥f(t)∥ = √⟨f(t), f(t)⟩ = √(∫₀¹ t·t dt) = √([t³/3]₀¹) = √(1/3) = 1/√3
∥g(t)∥ = √⟨g(t), g(t)⟩ = √(∫₀¹ (t − 1)² dt) = √([t³/3 − t² + t]₀¹) = √(1/3) = 1/√3
cos θ = ⟨f, g⟩ / (∥f∥·∥g∥) = (−1/6) / (1/√3 · 1/√3) = (−1/6) / (1/3) = −1/2,
so θ = 2π/3.
(A similar exercise: find the angle between f(t) = sin t and g(t) = cos t in
the space C[−π, π]; here ⟨sin t, cos t⟩ = ∫₋π^π sin t cos t dt = 0, so the
vectors are orthogonal and θ = π/2.)
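The same angle can be recovered numerically from the integral dot product. A sketch (the midpoint-rule `inner` and step count are our choices):

```python
import math

def inner(f, g, a, b, n=10_000):
    """Approximate <f, g> on C[a, b] by the midpoint rule."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda t: t
g = lambda t: t - 1.0
cos_theta = inner(f, g, 0.0, 1.0) / math.sqrt(inner(f, f, 0.0, 1.0) * inner(g, g, 0.0, 1.0))
print(abs(cos_theta - (-0.5)) < 1e-6)                       # True
print(abs(math.acos(cos_theta) - 2 * math.pi / 3) < 1e-6)   # True: theta = 2*pi/3
```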
Orthogonality
Definition 5. Let V be a dot product space. The vectors
u, v ∈ V are said to be orthogonal, and u is said to be
orthogonal to v, if
⟨u, v⟩ = 0
Observe that 𝑢 and 𝑣 are orthogonal if and only if
cos 𝜃 = 0, where 𝜃 is the angle between 𝑢 and 𝑣. Also,
this is true if and only if 𝑢 and 𝑣 are “perpendicular”.
Orthogonal Complements
Definition 6. Let S be a subset of a dot product space V.
The orthogonal complement of S, denoted by S⊥, consists
of those vectors in V that are orthogonal to every vector
u ∈ S; that is,
S⊥ = {v ∈ V : ⟨u, v⟩ = 0 for every u ∈ S}
In particular, for a given vector u in V, we have
u⊥ = {v ∈ V : ⟨v, u⟩ = 0}
Orthogonal Complements
Example 7. Find a nonzero vector w that is orthogonal to
u = (1, −3, 5) and v = (5, 6, 1).
Solution: Let w = (x, y, z). To be orthogonal we need ⟨w, u⟩ = 0
and ⟨w, v⟩ = 0. These give the system
x − 3y + 5z = 0          x − 3y + 5z = 0
5x + 6y + z = 0    ⟺    21y − 24z = 0
(the second system is obtained by subtracting 5 times the first
equation from the second).
Example 7 (cont.). z is free, and
x = −(11/7)·z, y = (8/7)·z
Let z = 7. Then y = 8 and x = −11. Thus, w = (−11, 8, 7) is
orthogonal to u = (1, −3, 5) and v = (5, 6, 1).
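In R³ specifically, a vector orthogonal to two given vectors can also be read off from the cross product; this is an alternative to the elimination used in Example 7, not the lecture's method. A sketch:

```python
def cross(u, v):
    """Cross product u x v in R^3; the result is orthogonal to both u and v."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

u = [1, -3, 5]
v = [5, 6, 1]
w = cross(u, v)
print(w)                     # [-33, 24, 21] = 3 * (-11, 8, 7), Example 7 up to scale
print(dot(w, u), dot(w, v))  # 0 0
```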
Properties
Theorem 3. Let S be a subset of a dot product space V. Then
S⊥ is a subspace of V.
In particular, if W is a subspace of V, then both W and W⊥ are
subspaces of V.
Theorem 4. Let W be a subspace of a finite-dimensional dot
product space V. Then V is the direct sum of W and W⊥; that is,
V = W ⊕ W⊥.
Orthogonal Sets and Bases
Definition 7. Consider a set S = {u₁, u₂, …, u_r} of
nonzero vectors in a dot product space V. S is called
orthogonal if each pair of vectors in S is orthogonal, and
S is called orthonormal if S is orthogonal and each vector
in S has unit length. That is,
■ Orthogonal: ⟨uᵢ, uⱼ⟩ = 0 for i ≠ j;
■ Orthonormal: ⟨uᵢ, uⱼ⟩ = 0 for i ≠ j, and ⟨uᵢ, uⱼ⟩ = 1 for i = j.
Orthogonal Sets and Bases
Normalizing an orthogonal set S refers to the process of
multiplying each vector in S by the reciprocal of its length
in order to transform S into an orthonormal set of vectors.
Properties
Theorem 5. Suppose S is an orthogonal set of nonzero
vectors. Then S is linearly independent.
Theorem 6. Suppose {u₁, u₂, …, u_r} is an orthogonal set
of vectors. Then
∥u₁ + u₂ + ⋯ + u_r∥² = ∥u₁∥² + ∥u₂∥² + ⋯ + ∥u_r∥²
Example
Let V = C[−π, π] be the vector space of continuous functions
on the interval [−π, π] with dot product defined by
⟨f, g⟩ = ∫₋π^π f(t)g(t) dt
Then the following is a classical example of an orthogonal set in
V:
{1, cos t, sin t, cos 2t, sin 2t, cos 3t, sin 3t, …}
Orthogonal Projection into a line
Definition 8. The orthogonal projection of v into the line
spanned by a nonzero vector s is the vector
proj_[s] v = (⟨v, s⟩ / ⟨s, s⟩)·s
Here by [s] we denote the line spanned by the vector s.
Example 8. Find the orthogonal projection of the vector (2, 3)
into the line y = 2x.
Solution: {(x, 2x) : x ∈ ℝ}, which means the line y = 2x is
spanned by the vector s = (1, 2). Hence
proj_[s] (2, 3) = (⟨(2, 3), (1, 2)⟩ / ⟨(1, 2), (1, 2)⟩)·(1, 2) =
= (8/5)·(1, 2) = (8/5, 16/5)
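Definition 8 in code, applied to Example 8 (helper names ours):

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def proj_line(v, s):
    """Orthogonal projection of v into the line spanned by nonzero s:
    proj_[s] v = (<v, s> / <s, s>) * s."""
    c = dot(v, s) / dot(s, s)
    return [c * x for x in s]

print(proj_line([2, 3], [1, 2]))  # [1.6, 3.2], i.e. (8/5, 16/5)
```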
Theorem 7. Suppose w₁, w₂, …, w_r form an orthogonal
set of nonzero vectors in V. Let v be any vector in V.
Define
v′ = v − (c₁·w₁ + c₂·w₂ + ⋯ + c_r·w_r)
where
c₁ = ⟨v, w₁⟩/⟨w₁, w₁⟩, c₂ = ⟨v, w₂⟩/⟨w₂, w₂⟩, …, c_r = ⟨v, w_r⟩/⟨w_r, w_r⟩
Then v′ is orthogonal to w₁, w₂, …, w_r.
If W = span(w₁, w₂, …, w_r), where the wᵢ form an orthogonal
set, then
proj_W v = c₁·w₁ + c₂·w₂ + ⋯ + c_r·w_r
Here cᵢ is the component of v along wᵢ, as above.
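Theorem 7 says the same coefficients give the projection into W and leave an orthogonal remainder; both claims can be checked directly. A sketch on a small example of our own choosing:

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def proj_subspace(v, ws):
    """proj_W v = c1*w1 + ... + cr*wr for an orthogonal set ws spanning W,
    with c_i = <v, w_i> / <w_i, w_i>."""
    p = [0.0] * len(v)
    for w in ws:
        c = dot(v, w) / dot(w, w)  # component of v along w
        p = [pi + c * wi for pi, wi in zip(p, w)]
    return p

v = [1.0, 2.0, 3.0]
ws = [[1.0, 0.0, 0.0], [0.0, 1.0, 1.0]]  # an orthogonal set
p = proj_subspace(v, ws)
v_prime = [a - b for a, b in zip(v, p)]  # v' = v - proj_W v
print([dot(v_prime, w) for w in ws])     # [0.0, 0.0]: v' is orthogonal to each w_i
```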
Gram-Schmidt Orthogonalization Process
Suppose {v₁, v₂, …, vₙ} is a basis of a dot product space V.
One can use this basis to construct an orthogonal basis
{w₁, w₂, …, wₙ} of V as follows. Set
■ w₁ = v₁;
■ w₂ = v₂ − (⟨v₂, w₁⟩/⟨w₁, w₁⟩)·w₁;
■ w₃ = v₃ − (⟨v₃, w₁⟩/⟨w₁, w₁⟩)·w₁ − (⟨v₃, w₂⟩/⟨w₂, w₂⟩)·w₂;
■ .................................
■ w_k = v_k − (⟨v_k, w₁⟩/⟨w₁, w₁⟩)·w₁ − (⟨v_k, w₂⟩/⟨w₂, w₂⟩)·w₂ − ⋯ − (⟨v_k, w_{k−1}⟩/⟨w_{k−1}, w_{k−1}⟩)·w_{k−1}.
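The steps above can be sketched as a short routine (classical Gram-Schmidt; the function name and the float conversion are our choices):

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def gram_schmidt(vs):
    """Turn a basis vs into an orthogonal basis, one vector at a time:
    w_k = v_k minus its components along the already-built w_1..w_{k-1}."""
    ws = []
    for v in vs:
        w = [float(x) for x in v]
        for u in ws:
            c = dot(v, u) / dot(u, u)  # <v_k, w_i> / <w_i, w_i>
            w = [wi - c * ui for wi, ui in zip(w, u)]
        ws.append(w)
    return ws

ws = gram_schmidt([[1, 2, 2], [-1, 0, 2], [0, 0, 1]])  # basis of Example 9
print(all(abs(dot(ws[i], ws[j])) < 1e-12
          for i in range(3) for j in range(i + 1, 3)))  # True: pairwise orthogonal
```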
Example 9. Using the Gram-Schmidt orthogonalization process,
we orthogonalize the basis
v₁ = (1, 2, 2), v₂ = (−1, 0, 2), v₃ = (0, 0, 1)
Solution: w₁ = v₁ = (1, 2, 2),
w₂ = v₂ − (⟨v₂, w₁⟩/⟨w₁, w₁⟩)·w₁ =
= (−1, 0, 2) − [(1·(−1) + 2·0 + 2·2)/(1² + 2² + 2²)]·(1, 2, 2) =
= (−1, 0, 2) − (3/9)·(1, 2, 2) = (−4/3, −2/3, 4/3)
Example 9 (cont.). Using the Gram-Schmidt
orthogonalization process, we orthogonalize the basis
v₁ = (1, 2, 2), v₂ = (−1, 0, 2), v₃ = (0, 0, 1)
Solution (cont.):
w₃ = v₃ − (⟨v₃, w₁⟩/⟨w₁, w₁⟩)·w₁ − (⟨v₃, w₂⟩/⟨w₂, w₂⟩)·w₂ =
= (0, 0, 1) − (2/9)·(1, 2, 2) − (1/3)·(−4/3, −2/3, 4/3) = (2/9, −2/9, 1/9).
Example 9 (cont.). So we have obtained three orthogonal
basis vectors
w₁ = (1, 2, 2), w₂ = (−4/3, −2/3, 4/3), w₃ = (2/9, −2/9, 1/9).
Example 9 (cont.). Now we normalize these vectors and obtain
orthonormal basis vectors
ŵ₁ = w₁/∥w₁∥ = (1/3)·(1, 2, 2) = (1/3, 2/3, 2/3),
ŵ₂ = w₂/∥w₂∥ = (1/2)·(−4/3, −2/3, 4/3) = (−2/3, −1/3, 2/3),
ŵ₃ = w₃/∥w₃∥ = 3·(2/9, −2/9, 1/9) = (2/3, −2/3, 1/3),
since ∥w₁∥ = 3, ∥w₂∥ = 2, and ∥w₃∥ = 1/3.
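Normalizing the Gram-Schmidt output of Example 9 in code (helper names ours):

```python
import math

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

# orthogonal basis from Example 9
ws = [[1, 2, 2], [-4/3, -2/3, 4/3], [2/9, -2/9, 1/9]]
qs = [normalize(w) for w in ws]
print(all(abs(dot(q, q) - 1.0) < 1e-12 for q in qs))  # True: unit length
print(all(abs(dot(qs[i], qs[j])) < 1e-12
          for i in range(3) for j in range(i + 1, 3)))  # True: still orthogonal
```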
Example 10. Let V be the space of polynomials f(t) with
dot product
⟨f, g⟩ = ∫₋₁¹ f(t)g(t) dt
Apply the Gram–Schmidt orthogonalization process to
{1, t, t², t³} to find an orthogonal basis for P₃(t).
Example 10 (cont.).
Solution: w₁ = 1;
w₂ = t − (⟨t, 1⟩/⟨1, 1⟩)·1 = t − (0/2)·1 = t − 0 = t
since
⟨t, 1⟩ = ∫₋₁¹ t·1 dt = [t²/2]₋₁¹ = 0;
⟨1, 1⟩ = ∫₋₁¹ 1·1 dt = [t]₋₁¹ = 2.
Example 10 (cont.).
w₃ = t² − (⟨t², 1⟩/⟨1, 1⟩)·1 − (⟨t², t⟩/⟨t, t⟩)·t =
= t² − ((2/3)/2)·1 − (0/(2/3))·t = t² − 1/3 − 0 = t² − 1/3
since
⟨t², 1⟩ = ∫₋₁¹ t²·1 dt = [t³/3]₋₁¹ = 2/3;
⟨t², t⟩ = ∫₋₁¹ t²·t dt = [t⁴/4]₋₁¹ = 0;
⟨t, t⟩ = ∫₋₁¹ t·t dt = [t³/3]₋₁¹ = 2/3.
Example 10 (cont.).
w₄ = t³ − (⟨t³, 1⟩/⟨1, 1⟩)·1 − (⟨t³, t⟩/⟨t, t⟩)·t −
− (⟨t³, t² − 1/3⟩/⟨t² − 1/3, t² − 1/3⟩)·(t² − 1/3) =
= t³ − (0/2)·1 − ((2/5)/(2/3))·t − (0/(8/45))·(t² − 1/3) = t³ − (3/5)·t
since
⟨t³, 1⟩ = ∫₋₁¹ t³·1 dt = [t⁴/4]₋₁¹ = 0;
⟨t³, t⟩ = ∫₋₁¹ t³·t dt = [t⁵/5]₋₁¹ = 2/5;
⟨t² − 1/3, t² − 1/3⟩ = ∫₋₁¹ (t² − 1/3)² dt = [t⁵/5 − (2/9)·t³ + t/9]₋₁¹ = 8/45;
⟨t³, t² − 1/3⟩ = ∫₋₁¹ (t⁵ − t³/3) dt = [t⁶/6 − t⁴/12]₋₁¹ = 0.
Example 10 (cont.).
w₁ = 1, w₂ = t, w₃ = t² − 1/3 and w₄ = t³ − (3/5)·t
Answer: {1; t; t² − 1/3; t³ − (3/5)·t} is an orthogonal basis for
P₃(t).
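Because ∫₋₁¹ tᵏ dt is 0 for odd k and 2/(k + 1) for even k, the orthogonality of the basis in Example 10 can be verified with exact rational arithmetic. A sketch using the `fractions` module (helper names ours):

```python
from fractions import Fraction as F

def poly_mul(p, q):
    """Multiply polynomials given as coefficient lists (index = power of t)."""
    r = [F(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def inner(p, q):
    """<p, q> = integral of p(t)q(t) dt over [-1, 1], computed exactly:
    the integral of t^k is 0 for odd k and 2/(k+1) for even k."""
    return sum(c * F(2, k + 1) for k, c in enumerate(poly_mul(p, q)) if k % 2 == 0)

# basis from Example 10: 1, t, t^2 - 1/3, t^3 - (3/5)t
basis = [[F(1)], [F(0), F(1)], [F(-1, 3), F(0), F(1)], [F(0), F(-3, 5), F(0), F(1)]]
checks = [inner(basis[i], basis[j]) for i in range(4) for j in range(i + 1, 4)]
print(all(c == 0 for c in checks))  # True: every pair is orthogonal
```

Up to scalar multiples, these polynomials are the classical Legendre polynomials.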
Orthogonal Matrices
Definition 9. A real matrix P is orthogonal if P is
non-singular and P⁻¹ = Pᵀ, or, in other words, if
P·Pᵀ = Pᵀ·P = I
Example 11. Let
P = (1/7)·[[3, 2, 6], [−6, 3, 2], [2, 6, −3]],
that is, the rows of P are (3/7, 2/7, 6/7), (−6/7, 3/7, 2/7),
(2/7, 6/7, −3/7). Then its transpose is
Pᵀ = (1/7)·[[3, −6, 2], [2, 3, 6], [6, 2, −3]]
and
P·Pᵀ = (1/49)·[[3, 2, 6], [−6, 3, 2], [2, 6, −3]]·[[3, −6, 2], [2, 3, 6], [6, 2, −3]] =
= (1/49)·[[49, 0, 0], [0, 49, 0], [0, 0, 49]] = [[1, 0, 0], [0, 1, 0], [0, 0, 1]] = I
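Definition 9 for the matrix of Example 11, checked in code (helper names ours):

```python
def matmul(A, B):
    """Matrix product of two lists-of-rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

P = [[3/7, 2/7, 6/7], [-6/7, 3/7, 2/7], [2/7, 6/7, -3/7]]
PPt = matmul(P, transpose(P))
print(all(abs(PPt[i][j] - (1.0 if i == j else 0.0)) < 1e-12
          for i in range(3) for j in range(3)))  # True: P * P^T = I
```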
Orthogonal Matrices
Theorem 8. Let P be a real square matrix. Then the following
are equivalent:
1. P is orthogonal;
2. The rows of P form an orthonormal set;
3. The columns of P form an orthonormal set.
Properties
Theorem 9. Suppose E and E′ are orthonormal bases of
V. Let P be the change of basis matrix from the basis E to
the basis E′. Then P is orthogonal.
Properties
Theorem 10. Let {e₁, …, eₙ} be an orthonormal basis of
a dot product space V. Let P be an orthogonal matrix with
entries aᵢⱼ. Then the following n vectors form an
orthonormal basis for V:
eᵢ′ = a₁ᵢ·e₁ + a₂ᵢ·e₂ + ⋯ + aₙᵢ·eₙ, i = 1, 2, …, n.
Exercises for Lecture 8
1. Find the orthogonal projection of the vector (2, 2, 1, 3) into
the line spanned by the vector (1, −1, 1, −1).
2. Apply the Gram–Schmidt orthogonalization process to
find an orthogonal basis and then an orthonormal basis
for the subspace U of R⁴ spanned by
u₁ = (1, 1, 1, 1), u₂ = (1, 2, 4, 5), u₃ = (1, −3, −4, −2)
