
MA 106: Linear Algebra

Lecture 21

Prof. B.V. Limaye


IIT Dharwad

Thursday, 22 February 2018

Inner Product Spaces
In Lecture 12, we introduced the inner product

⟨x, y⟩ := x∗y = x̄1 y1 + · · · + x̄n yn

of a column vector x := [x1 · · · xn]ᵀ ∈ Kn×1 with a column
vector y := [y1 · · · yn]ᵀ ∈ Kn×1 . We also pointed out (and
used) some crucial properties of the inner product function
⟨· , ·⟩ : Kn×1 × Kn×1 → K.
The above definition of an inner product does not make sense
for elements of an abstract vector space. In fact, not all vector
spaces have usable inner product functions.
We, therefore, focus on those vector spaces for which there is
a function having the crucial properties of the inner product
on Kn×1 mentioned above.
Let V be a vector space over K. An inner product on V is a
function ⟨· , ·⟩ : V × V → K satisfying the following properties.
For u, v , w ∈ V and α, β ∈ K,
1. ⟨v , v ⟩ ≥ 0 and ⟨v , v ⟩ = 0 ⇐⇒ v = 0, (positive definite)
2. ⟨u, αv + βw ⟩ = α⟨u, v ⟩ + β⟨u, w ⟩, (linear in 2nd variable)
3. ⟨v , u⟩ is the complex conjugate of ⟨u, v ⟩. (conjugate symmetric)
From the above properties, conjugate linearity in the 1st
variable follows: ⟨αu + βv , w ⟩ = ᾱ⟨u, w ⟩ + β̄⟨v , w ⟩.
If u, v ∈ V and ⟨u, v ⟩ = 0, then we say that u and v are
orthogonal, and we write u ⊥ v .
For v ∈ V , we define the norm of v by ∥v ∥ := ⟨v , v ⟩^{1/2} .
Clearly, v ⊥ v ⇐⇒ ∥v ∥ = 0 ⇐⇒ v = 0.
If v ∈ V and ∥v ∥ = 1, then we say that v is a unit vector or
a unit function. The set {v ∈ V : ∥v ∥ ≤ 1} is called the
unit ball of V .
A vector space V over K with a prescribed inner product on it
is called an inner product space.
Examples
1. We have already studied the primary example, namely
V := Kn×1 with the usual inner product ⟨x, y⟩ := x∗y for
x, y ∈ Kn×1 . There are other inner products on Kn×1 . For
example, let w1 , . . . , wn be positive real numbers, and define

⟨x, y⟩ := w1 x̄1 y1 + · · · + wn x̄n yn for x, y ∈ Kn×1 .

On the other hand, the function on R4×1 × R4×1 defined by

⟨x, y⟩M := x1 y1 + x2 y2 + x3 y3 − x4 y4 for x, y ∈ R4×1

is not an inner product on R4×1 . Note that for x ∈ R4×1 ,

⟨x, x⟩M = x1² + x2² + x3² − x4². (This is used in defining the
Minkowski space, and space-like, time-like as well as
light-like vectors in it.)
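
The contrast can be illustrated numerically. The following is a minimal Python sketch, added to these notes, with arbitrarily chosen weights and vectors: the weighted form is positive on a nonzero vector, while the Minkowski form takes a negative value on one.

import numpy as np

def weighted_inner(x, y, w):
    # <x, y> := w1 conj(x1) y1 + ... + wn conj(xn) yn, with all wi > 0
    return np.sum(w * np.conj(x) * y)

def minkowski_form(x, y):
    # <x, y>_M := x1 y1 + x2 y2 + x3 y3 - x4 y4 on R^{4x1}
    return x[0]*y[0] + x[1]*y[1] + x[2]*y[2] - x[3]*y[3]

w = np.array([1.0, 2.0, 3.0])
x = np.array([1 + 1j, 2.0, -1j])
print(weighted_inner(x, x, w).real)          # 13.0, positive since x != 0
v = np.array([0.0, 0.0, 0.0, 1.0])
print(minkowski_form(v, v))                  # -1.0, so positive definiteness fails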


2. Let V := C ([a, b]), the vector space of all continuous
K-valued functions on [a, b]. Define

⟨f , g ⟩ := ∫_a^b f̄(t) g (t) dt for f , g ∈ V .

It is easy to check that this is an inner product on V . We shall
call this inner product the usual inner product on C ([a, b]).
In this case, the norm of f ∈ V is ∥f ∥ := ( ∫_a^b |f (t)|² dt )^{1/2} .
This example gives a continuous analogue of the usual inner
product on Kn×1 .
There are other inner products on V . For example, let
w : [a, b] → R be a positive function, and define

⟨f , g ⟩ := ∫_a^b w (t) f̄(t) g (t) dt for f , g ∈ V .
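
As a numerical illustration, added to these notes, the usual inner product on C ([a, b]) can be approximated by quadrature; the sketch below uses the trapezoidal rule and checks that cos t and sin t are orthogonal on [−π, π].

import numpy as np

def l2_inner(f, g, a, b, n=10001):
    # approximate <f, g> = integral over [a, b] of conj(f(t)) g(t) dt
    t = np.linspace(a, b, n)
    return np.trapz(np.conj(f(t)) * g(t), t)

print(l2_inner(np.cos, np.sin, -np.pi, np.pi))                  # ~ 0: cos and sin are orthogonal
print(np.sqrt(l2_inner(np.cos, np.cos, -np.pi, np.pi).real))    # ~ sqrt(pi), the norm of cos
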
Our proof of the Schwarz inequality ‘|⟨x, y⟩| ≤ ∥x∥∥y∥ for
x, y ∈ Kn×1 ’ used inequalities involving the components of the
column vectors x and y. Clearly, such a proof would not work
in the general case. We now give a proof which uses the
orthogonal projection of an element of V in the direction of a
nonzero element of V .
Let w be a nonzero element of V . As in Lecture 13, define

Pw (v ) := (⟨w , v ⟩/⟨w , w ⟩) w for v ∈ V .

It is called the (perpendicular) projection of v in the
direction of w . It is easy to see that Pw : V → V is a linear
map and its image space is one dimensional. Also,
Pw (w ) = w , so that (Pw )² := Pw ◦ Pw = Pw .
Note that Pw (v ) is a scalar multiple of w for every v ∈ V .
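
For concreteness, here is a minimal sketch, added to these notes, of Pw for column vectors in Kn×1 with the usual inner product; the vectors below are arbitrary.

import numpy as np

def inner(x, y):
    return np.vdot(x, y)      # np.vdot conjugates its first argument, matching <x, y> = x* y

def proj(w, v):
    # P_w(v) := (<w, v> / <w, w>) w
    return (inner(w, v) / inner(w, w)) * w

v = np.array([3.0, 1.0, 2.0])
w = np.array([1.0, 1.0, 0.0])
p = proj(w, v)
print(p)                                          # [2. 2. 0.], a scalar multiple of w
print(inner(w, v - p))                            # ~ 0, so (v - P_w(v)) is orthogonal to w
print(np.linalg.norm(p) <= np.linalg.norm(v))     # True, as in the proposition below
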
Two important properties of the projection of a vector in the
direction of another (nonzero) vector are as follows.
Proposition
Let w ∈ V be nonzero. Then for every v ∈ V ,
(i) (v − Pw (v )) ⊥ w and (ii) ∥Pw (v )∥ ≤ ∥v ∥.

Proof. Let v ∈ V . For (i), we note that

⟨w , v − Pw (v )⟩ = ⟨w , v ⟩ − ⟨w , Pw (v )⟩ = ⟨w , v ⟩ − (⟨w , v ⟩/⟨w , w ⟩)⟨w , w ⟩ = 0.

For (ii), we write v = Pw (v ) + u, where u := v − Pw (v ). Then

∥v ∥² = ⟨v , v ⟩ = ⟨Pw (v ), Pw (v ) + u⟩ + ⟨u, Pw (v ) + u⟩
      = ∥Pw (v )∥² + ∥u∥² (by (i), since Pw (v ) is a scalar multiple of w and u ⊥ w )
      ≥ ∥Pw (v )∥² .
Theorem
Let ⟨· , ·⟩ be an inner product on a vector space V , and let
v , w ∈ V . Then
(i) (Schwarz Inequality) |⟨v , w ⟩| ≤ ∥v ∥∥w ∥.
(ii) (Triangle Inequality) ∥v + w ∥ ≤ ∥v ∥ + ∥w ∥.

Proof.
(i) Let w = 0. Then ⟨v , w ⟩ = ⟨v , 0⟩ = ⟨v , 0 + 0⟩ = 2⟨v , 0⟩,
and so ⟨v , w ⟩ = 0. Also, ∥w ∥ = 0. Hence we are done.
Now suppose w ̸= 0. Then by (ii) of the previous proposition,

∥(⟨w , v ⟩/⟨w , w ⟩) w ∥ = ∥Pw (v )∥ ≤ ∥v ∥,

that is, |⟨w , v ⟩| ∥w ∥ ≤ ∥v ∥ ⟨w , w ⟩ = ∥v ∥∥w ∥² .
Hence |⟨v , w ⟩| ≤ ∥v ∥∥w ∥.
(ii) Since ⟨v , w ⟩ + ⟨w , v ⟩ = 2 Re ⟨v , w ⟩, we see that

∥v + w ∥² = ⟨v + w , v + w ⟩ = ∥v ∥² + ∥w ∥² + 2 Re ⟨v , w ⟩
          ≤ ∥v ∥² + ∥w ∥² + 2 |⟨v , w ⟩|
          ≤ ∥v ∥² + ∥w ∥² + 2 ∥v ∥∥w ∥ (by (i) above)
          = (∥v ∥ + ∥w ∥)² .

Thus ∥v + w ∥ ≤ ∥v ∥ + ∥w ∥.
We observe that the norm function ∥ · ∥ : V → R satisfies the
following three crucial properties:
(i) ∥v ∥ ≥ 0 for all v ∈ V and ∥v ∥ = 0 ⇐⇒ v = 0,
(ii) ∥αv ∥ = |α|∥v ∥ for all α ∈ K and v ∈ V ,
(iii) ∥v + w ∥ ≤ ∥v ∥ + ∥w ∥ for all v , w ∈ V .
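
A quick numerical sanity check of the Schwarz and triangle inequalities, added to these notes, for arbitrary random vectors in K5×1 with K = C and the usual inner product:

import numpy as np

rng = np.random.default_rng(1)
v = rng.standard_normal(5) + 1j * rng.standard_normal(5)
w = rng.standard_normal(5) + 1j * rng.standard_normal(5)

nv, nw = np.linalg.norm(v), np.linalg.norm(w)
print(abs(np.vdot(v, w)) <= nv * nw)          # True  (Schwarz inequality)
print(np.linalg.norm(v + w) <= nv + nw)       # True  (triangle inequality)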

Let V be an inner product space.
The Pythagoras Theorem says that if v , w ∈ V and v ⊥ w ,
then ∥v + w ∥² = ∥v ∥² + ∥w ∥². The proof is exactly as before.
Let E be a subset of V . Define

E ⊥ := {w ∈ V : w ⊥ v for all v ∈ E }.

It is easy to see that E ⊥ is a subspace of V .


The set E is said to be orthogonal if any two (distinct)
elements of E are orthogonal (to each other), that is, v ⊥ w
for all v , w in E with v ̸= w . An orthogonal set whose
elements are unit vectors is called an orthonormal set.
If E is orthogonal and does not contain 0, then E is linearly
independent. For example, let V := C ([−π, π]) and
E := {cos nt : n ∈ N} ∪ {sin nt : n ∈ N}. Since E is
orthogonal and 0 ̸∈ E , the set E is linearly independent.
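
A numerical spot check of this orthogonality, added to these notes, with the usual inner product on C ([−π, π]) approximated by quadrature:

import numpy as np

def inner(f, g, n=20001):
    t = np.linspace(-np.pi, np.pi, n)
    return np.trapz(f(t) * g(t), t)

print(inner(lambda t: np.cos(2*t), lambda t: np.cos(3*t)))   # ~ 0
print(inner(lambda t: np.sin(1*t), lambda t: np.sin(4*t)))   # ~ 0
print(inner(lambda t: np.cos(2*t), lambda t: np.sin(2*t)))   # ~ 0
print(inner(lambda t: np.cos(2*t), lambda t: np.cos(2*t)))   # ~ pi, so cos 2t is not a unit vector
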
If we are given a countable linearly independent subset of V ,
then we can construct an orthogonal subset of V not
containing 0, retaining the span of the elements so
constructed at every step. This can be done in a variety of
ways. The most well-known among them is the Gram-Schmidt
Orthogonalization Process (G-S OP). We outline it below.
Let (vn ) be a sequence of linearly independent elements in V .
Define w1 := v1 . Let j ∈ N, and suppose we have found
w1 , . . . , wj in V such that the set {w1 , . . . , wj } is orthogonal,
and also span{w1 , . . . , wj } = span{v1 , . . . , vj }.
Define wj+1 := vj+1 − Pw1 (vj+1 ) − · · · − Pwj (vj+1 )
            = vj+1 − (⟨w1 , vj+1 ⟩/⟨w1 , w1 ⟩) w1 − · · · − (⟨wj , vj+1 ⟩/⟨wj , wj ⟩) wj .
Then span{w1 , . . . , wj+1 } = span{v1 , . . . , vj+1 } since
wj+1 ∈ span{w1 , . . . , wj , vj+1 } = span{v1 , . . . , vj , vj+1 } and
vj+1 ∈ span{w1 , . . . , wj , wj+1 }.


To show that the set {w1 , . . . , wj+1 } is orthogonal, it is
enough to show that wj+1 ∈ {w1 , . . . , wj }⊥ .
Let i ∈ {1, . . . , j}. Then

⟨wi , wj+1 ⟩ = ⟨wi , vj+1 − Pw1 (vj+1 ) − · · · − Pwj (vj+1 )⟩
            = ⟨wi , vj+1 ⟩ − ⟨wi , Pw1 (vj+1 )⟩ − · · · − ⟨wi , Pwj (vj+1 )⟩
            = ⟨wi , vj+1 ⟩ − ⟨wi , Pwi (vj+1 )⟩ (since Pwk (vj+1 ) is a multiple of wk and wi ⊥ wk for k ̸= i)
            = ⟨wi , vj+1 − Pwi (vj+1 )⟩
            = 0 (by the crucial property of the projection).

Note that since the set {v1 , v2 , . . .} is linearly independent, all
vectors constructed in the G-S OP are nonzero: Clearly,
w1 = v1 ̸= 0. Also, if wj+1 = 0 for some j ∈ N, then vj+1
would belong to span{w1 , . . . , wj } = span{v1 , . . . , vj }, which is
not possible. This completes the construction of the G-S OP.
Now let uj := wj /∥wj ∥ for j ∈ N. Then (u1 , u2 , . . .) is an
ordered orthonormal set such that for each j ∈ N,

span{v1 , . . . , vj } = span{w1 , . . . , wj } = span{u1 , . . . , uj }.
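
Here is a short sketch of the G-S OP, added to these notes, for column vectors in Kn×1 with the usual inner product; the input vectors are arbitrary, and the Gram matrix of the resulting uj should be the identity.

import numpy as np

def inner(x, y):
    return np.vdot(x, y)      # conjugates the first argument

def gram_schmidt(vs):
    # orthogonalize linearly independent v1, v2, ... and then normalize
    ws, us = [], []
    for v in vs:
        w = v - sum((inner(wi, v) / inner(wi, wi)) * wi for wi in ws)
        ws.append(w)
        us.append(w / np.sqrt(inner(w, w).real))
    return us

vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])]
us = gram_schmidt(vs)
print(np.round([[inner(ui, uj) for uj in us] for ui in us], 10))   # the 3 x 3 identity matrix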

Example
Let V be the set of all real-valued polynomial functions on
[−1, 1] along with the inner product defined by

⟨p, q⟩ := ∫_{−1}^{1} p(t) q(t) dt for p, q ∈ V .

For j = 0, 1, 2, . . ., let pj (t) := t^j , t ∈ [−1, 1]. Let us
orthogonalize the set {p0 , p1 , p2 , p3 }. Define q0 := p0 , and

q1 := p1 − (⟨q0 , p1 ⟩/⟨q0 , q0 ⟩) q0 = p1 − ( (1/2) ∫_{−1}^{1} t dt ) p0 = p1 .
Next, define

q2 := p2 − (⟨q0 , p2 ⟩/⟨q0 , q0 ⟩) q0 − (⟨q1 , p2 ⟩/⟨q1 , q1 ⟩) q1
    = p2 − ( (1/2) ∫_{−1}^{1} t² dt ) q0 − ( (3/2) ∫_{−1}^{1} t³ dt ) q1
    = p2 − (1/3) p0 ,

and similarly,

q3 := p3 − (⟨q0 , p3 ⟩/⟨q0 , q0 ⟩) q0 − (⟨q1 , p3 ⟩/⟨q1 , q1 ⟩) q1 − (⟨q2 , p3 ⟩/⟨q2 , q2 ⟩) q2
    = p3 − (3/5) p1 .
Further, ∥q0 ∥ = √2, ∥q1 ∥ = √2/√3, ∥q2 ∥ = 2√2/(3√5) and
∥q3 ∥ = 2√2/(5√7).
Hence we obtain the following orthonormal subset of V having
the same span as span{p0 , p1 , p2 , p3 }, namely all polynomial
functions of degree at most 3:

u0 (t) := √2/2,
u1 (t) := (√6/2) t,
u2 (t) := (√10/4)(3t² − 1),
u3 (t) := (√14/4)(5t³ − 3t).
The sequence of orthonormal polynomials thus obtained by
orthonormalizing the monomials by the G-S OP is known as
the sequence of Legendre polynomials. These polynomials
are used extensively while considering solutions of differential
equations.
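
The computation above can be checked mechanically; the following sketch, added to these notes, runs the G-S OP on the coefficient arrays of 1, t, t², t³ using numpy's polynomial utilities and recovers u0 , . . . , u3 .

import numpy as np
from numpy.polynomial import polynomial as P

def inner(p, q):
    # <p, q> = integral over [-1, 1] of p(t) q(t) dt, for coefficient arrays p and q
    r = P.polyint(P.polymul(p, q))
    return P.polyval(1.0, r) - P.polyval(-1.0, r)

def orthonormalize(ps):
    us = []
    for p in ps:
        w = p
        for u in us:
            w = P.polysub(w, inner(u, w) * u)     # subtract the projection onto u
        us.append(w / np.sqrt(inner(w, w)))
    return us

monomials = [np.array([1.0]), np.array([0.0, 1.0]),
             np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, 0.0, 1.0])]
for u in orthonormalize(monomials):
    print(np.round(u, 6))
# the third line, for example, is ~ [-0.790569, 0, 2.371708], i.e. (sqrt(10)/4)(3t^2 - 1)
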
Let V be a finite dimensional inner product space. An
orthonormal basis for V is a basis for V which is an
orthonormal subset of V .
We have proved the following results for subspaces of Kn×1 .
Their proofs remain valid for any inner product space.
If u1 , . . . , uk is an orthonormal set in V , then we can extend it
to an orthonormal basis for V . As a consequence, every nonzero
subspace of V has an orthonormal basis.
The G-S OP enables us to improve the quality of a given basis
for V by orthonormalizing it. For instance, if {u1 , . . . , un } is
an orthonormal basis for V , and v ∈ V , then it is extremely
easy to write v as a linear combination of u1 , . . . , un ; in fact

v = ⟨u1 , v ⟩u1 + · · · + ⟨un , v ⟩un .
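
A tiny sketch of this expansion, added to these notes, with an arbitrarily chosen orthonormal basis of R3×1 :

import numpy as np

u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
u3 = np.array([0.0, 0.0, 1.0])

v = np.array([3.0, 1.0, 2.0])
coeffs = [np.vdot(u, v) for u in (u1, u2, u3)]      # the coefficients <u_j, v>
print(np.allclose(sum(c * u for c, u in zip(coeffs, (u1, u2, u3))), v))   # True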

Orthogonal Projections
Let W be a subspace of a finite dimensional inner product
space V . The Orthogonal Projection Theorem says that
for every v ∈ V , there are unique w ∈ W and w̃ ∈ W ⊥ such
that v = w + w̃ , that is, V = W ⊕ W ⊥ . The map
PW : V → V given by PW (v ) = w is linear and satisfies
(PW )² = PW .
In fact, if u1 , . . . , uk is an orthonormal basis for W , then
PW (v ) = ⟨u1 , v ⟩u1 + · · · + ⟨uk , v ⟩uk and w̃ = v − w .
The linear map PW : V → V is called the orthogonal
projection of V onto the subspace W .
Given v ∈ V , its orthogonal projection PW (v ) into W is the
unique best approximation to v from W .
Further, PW (v ) is the unique element of W such that
v − PW (v ) is orthogonal to W .
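
Here is a short sketch of PW , added to these notes, for a subspace W of R3×1 ; an orthonormal basis of W is obtained from a QR factorization rather than by hand, and the decomposition v = w + w̃ with w ∈ W and w̃ ∈ W ⊥ is checked.

import numpy as np

def project(v, onb):
    # P_W(v) = <u1, v> u1 + ... + <uk, v> uk for an orthonormal basis of W
    return sum(np.vdot(u, v) * u for u in onb)

A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])   # the columns span W
Q, _ = np.linalg.qr(A)                               # orthonormal basis of W in the columns of Q
onb = [Q[:, 0], Q[:, 1]]

v = np.array([1.0, 2.0, 3.0])
w = project(v, onb)
w_tilde = v - w
print(np.round([np.vdot(u, w_tilde) for u in onb], 10))   # [0, 0]: w_tilde lies in W-perp
print(np.allclose(project(w, onb), w))                    # True: w lies in W, and P_W(P_W(v)) = P_W(v)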


Linear Maps on Inner Product Spaces

Let V be a vector space of dimension n, and let
E := (v1 , . . . , vn ) be an ordered basis for V . Also, let W be an
inner product space of dimension m, and let F := (u1 , . . . , um )
be an ordered orthonormal basis for W . Let T : V → W be a
linear map. Then

T (v1 ) = ⟨u1 , T (v1 )⟩u1 + · · · + ⟨uj , T (v1 )⟩uj + · · · + ⟨um , T (v1 )⟩um ,
   ⋮
T (vk ) = ⟨u1 , T (vk )⟩u1 + · · · + ⟨uj , T (vk )⟩uj + · · · + ⟨um , T (vk )⟩um ,
   ⋮
T (vn ) = ⟨u1 , T (vn )⟩u1 + · · · + ⟨uj , T (vn )⟩uj + · · · + ⟨um , T (vn )⟩um .

Hence the m × n matrix of the linear map T with respect to
the basis E of V and the basis F of W is

M_E^F (T ) = [⟨uj , T (vk )⟩].

Now suppose V is an inner product space of dimension n, and
let E := (u1 , . . . , un ) be an ordered orthonormal basis for V .
Let T : V → V be a linear operator. Then the matrix of this
map with respect to the orthonormal basis E of V is
M_E^E (T ) = [⟨uj , T (uk )⟩].
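
As an illustration, added to these notes, the matrix [⟨uj , T (vk )⟩] can be computed directly; the map T , the basis E of R3×1 and the orthonormal basis F of R2×1 below are arbitrary choices.

import numpy as np

def T(v):
    # an arbitrary linear map T : R^3 -> R^2, chosen only for illustration
    return np.array([v[0] + 2.0*v[1], v[2] - v[1]])

E = [np.array([1.0, 0.0, 0.0]), np.array([1.0, 1.0, 0.0]), np.array([1.0, 1.0, 1.0])]
F = [np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2)]

M = np.array([[np.vdot(uj, T(vk)) for vk in E] for uj in F])   # entry (j, k) is <u_j, T(v_k)>
print(np.round(M, 6))                                          # the 2 x 3 matrix of T w.r.t. E and F
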
We say that the linear operator T : V → V is Hermitian if

⟨T (u), v ⟩ = ⟨u, T (v )⟩ for all u, v ∈ V ,

and it is called skew-Hermitian if ⟨T (u), v ⟩ = −⟨u, T (v )⟩
for all u, v ∈ V .
Note that for A ∈ Kn×n and x, y ∈ Kn×1 ,
⟨A x, y⟩ = (A x)∗ y = x∗ (A∗ y) = ⟨x, A∗ y⟩.
Hence the matrix A is self-adjoint, that is, A∗ = A, if and only if
⟨A x, y⟩ = ⟨x, A y⟩ for all x, y ∈ Kn×1 . So the following result is natural.
Proposition
Let V be a finite dimensional inner product space, and let
T : V → V be a linear map. Then T is Hermitian if and only
if the matrix of T with respect to every ordered orthonormal
basis of V is self-adjoint.
Proof. Let E := (u1 , . . . , un ) be an ordered orthonormal basis
for V , and let A := [ajk ] denote the matrix of T with respect
to E . Then ajk = ⟨uj , T (uk )⟩ for all j, k = 1, . . . , n.
Suppose T is Hermitian. Then
ajk = ⟨uj , T (uk )⟩ = ⟨T (uj ), uk ⟩ = ākj for all j, k = 1, . . . , n,
since ⟨T (uj ), uk ⟩ is the complex conjugate of ⟨uk , T (uj )⟩ = akj . Hence A is self-adjoint.


Conversely, suppose A is self-adjoint, that is, A∗ = A. Then ajk = ākj ,
that is, ⟨T (uj ), uk ⟩ = ⟨uj , T (uk )⟩ for all j, k = 1, . . . , n.
Let u, v ∈ V , and let u = ∑_{j=1}^n αj uj and v = ∑_{k=1}^n βk uk ,
where α1 , . . . , αn , β1 , . . . , βn ∈ K. Then

⟨T (u), v ⟩ = ⟨ ∑_{j=1}^n αj T (uj ), ∑_{k=1}^n βk uk ⟩ = ∑_{j,k} ᾱj βk ⟨T (uj ), uk ⟩,
⟨u, T (v )⟩ = ⟨ ∑_{j=1}^n αj uj , ∑_{k=1}^n βk T (uk ) ⟩ = ∑_{j,k} ᾱj βk ⟨uj , T (uk )⟩,

Since ⟨T (uj ), uk ⟩ = ⟨uj , T (uk )⟩ for all j, k = 1, . . . , n, it follows that
⟨T (u), v ⟩ = ⟨u, T (v )⟩ for all u, v ∈ V . Thus T is Hermitian,
as desired.
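
A numerical illustration, added to these notes, of the matrix identity ⟨A x, y⟩ = ⟨x, A∗ y⟩ and of the self-adjoint case; the random matrix and vectors are arbitrary.

import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = B + B.conj().T                       # self-adjoint by construction: A* = A

x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

print(np.isclose(np.vdot(A @ x, y), np.vdot(x, A.conj().T @ y)))   # True: <Ax, y> = <x, A*y> always
print(np.isclose(np.vdot(A @ x, y), np.vdot(x, A @ y)))            # True here, since A* = A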

Using the above result, it is possible to prove the spectral
theorem for a Hermitian operator on a finite dimensional inner
product space V . Moreover, one can define the adjoint T ∗ of
a linear operator T on V , and prove a spectral theorem for an
operator T which commutes with its adjoint T ∗, that is, for a
normal operator on V . However, we shall stop here! Finally!

THE END

The following books can be used as references.
(1) Advanced Engineering Mathematics (Tenth Edition)
written by Erwin Kreyszig, and published by
John Wiley & Sons, Hoboken, N.J. (2011).
(2) Elementary Linear Algebra with Applications
(Tenth Edition) written by H. Anton, and published by
John Wiley & Sons, Hoboken, N.J. (2010).
Acknowledgement
I have made use of the lecture notes of similar courses
prepared by Murali Srinivasan and Jugal Verma (Spring
Semester of 2013 – 2014) and Akhil Ranjan (Spring Semester
of 2016 – 2017) at the Indian Institute of Technology Bombay.

A Chinese Poem
There once lived a man
who wanted to learn how to kill dragons.

He enrolled in an Institute of Technology
and spent hours learning how to kill dragons.

After four long years, he graduated ‘summa cum laude’
and was ready to practice the art he had learned.

Alas! He soon found that there were no dragons to kill.

Then he joined the same Institute of Technology
and started teaching how to kill dragons!
