
Chapter 5: Vector Spaces

1 A nonempty set V of objects (called elements or vectors) is called a vector


space over the scalars F (F = R or C) if the following axioms are satisfied.
2 Closure axioms: For every pair of elements x, y ∈ V there is a unique
element x + y ∈ V called the sum of x and y .
3 For every x ∈ V and every scalar α ∈ F there is a unique element αx ∈ V
called the product of α and x.
4 Axioms for vector addition: x + y = y + x for all x, y ∈ V .
5 x + (y + z) = (x + y ) + z for all x, y , z ∈ V .
6 There exists 0 in V such that x + 0 = 0 + x = x for all x ∈ V .
7 For x ∈ V there exists an element written as −x such that x + (−x) = 0.

1 / 20
Vector Spaces: Definition

1 Axioms for scalar multiplication:


2 (associativity) For all α, β ∈ F, x ∈ V ,

α(βx) = (αβ)x.

3 (distributive law for addition in V ) For all x, y ∈ V and α ∈ F,

α(x + y ) = αx + αy .

4 (distributive law for addition in F) For all α, β ∈ F and x ∈ V ,

(α + β)x = αx + βx

5 (existence of identity for multiplication) For all x ∈ V , 1x = x.


6 When F = R, V is called a real vector space.
7 When F = C, V is called a complex vector space.

2 / 20
Examples of vector spaces:

1 In the examples below we leave the verification of the axioms for vector
addition and scalar multiplication as exercises.
2 Let V = R, F = R with ordinary addition and multiplication as vector
addition and scalar multiplication. Then V is a real vector space.
3 Let V = C, F = C with ordinary addition and multiplication as vector
addition and scalar multiplication. Then V is a complex vector space.
4 Let V = C and F = R with ordinary addition and multiplication as vector
addition and scalar multiplication. Then V is a real vector space.
5 Let V = Rn = {(a1 , a2 , . . . , an )|a1 , . . . , an ∈ R} and F = R with addition of
row vectors as vector addition and multiplication of a row vector by a real
number as scalar multiplication. Then Rn is a real vector space.
6 We can similarly define a real vector space of real column vectors.
7 Depending on the context Rn could refer to either the set of all row vectors
or all column vectors with n real components.
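As an aside, the addition and scalar-multiplication axioms for Rn can be spot-checked numerically. A minimal NumPy sketch (the vectors and scalars below are arbitrary choices, not from the slides; a numeric check illustrates the axioms but is of course not a proof):

```python
import numpy as np

# Spot-check the vector space axioms for R^4 on concrete data.
rng = np.random.default_rng(0)
x, y, z = rng.standard_normal((3, 4))   # three vectors in R^4
a, b = 2.0, -3.0                        # scalars in F = R

assert np.allclose(x + y, y + x)                # commutativity
assert np.allclose(x + (y + z), (x + y) + z)    # associativity of +
assert np.allclose(a * (x + y), a * x + a * y)  # distributivity over V
assert np.allclose((a + b) * x, a * x + b * x)  # distributivity over F
assert np.allclose(a * (b * x), (a * b) * x)    # scalar associativity
assert np.allclose(1.0 * x, x)                  # identity scalar
print("all sampled axioms hold")
```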

3 / 20
Vector Spaces: Examples
1 Let V = Cn = {(a1 , a2 , . . . , an )|a1 , . . . , an ∈ C} and F = C with addition of
row vectors as vector addition and multiplication of a row vector by a
complex number as scalar multiplication. Then V is a complex vector space.
2 We can similarly define a complex vector space of column vectors with n
complex components.
3 Depending on the context Cn could refer to either row vectors or column
vectors with n complex components.
4 Let a < b be real numbers and set V = {f : [a, b] −→ R}, F = R.
5 If f , g ∈ V then we set (f + g )(x) = f (x) + g (x) for all x ∈ [a, b].
6 If α ∈ R and f ∈ V then (αf )(x) = αf (x) for all x ∈ [a, b].
7 V is a real vector space denoted by R[a,b] .
8 Let t be an indeterminate. The set
Pn (R) = {a0 + a1 t + · · · + an t n |a0 , a1 , . . . , an ∈ R}
is a real vector space under usual addition of polynomials and multiplication
of polynomials with real numbers.
4 / 20
Vector Spaces: Examples

1 C [a, b] := {f : [a, b] −→ R | f is continuous on [a, b]} is a real vector space


under addition of functions and scalar multiplication.
2 V = {f : [a, b] −→ R | f is differentiable at x0 }, for a fixed x0 ∈ [a, b], is a real
vector space under addition and scalar multiplication of functions.
3 The set of all solutions to the differential equation y'' + ay' + by = 0, where
a, b ∈ R, forms a real vector space.
4 Let V = Mm×n (R) denote the set of all m × n matrices with real entries.
Then V is a real vector space under usual matrix addition and multiplication
of a matrix by a real number.
5 The above examples indicate that the notion of a vector space is quite
general.
6 A result proved for vector spaces will simultaneously apply to all the above
different examples.

5 / 20
Subspace of a Vector Space
1 Definition. Let V be a vector space over F.
2 A nonempty subset W of V is called a subspace of V if
3 (a) 0 ∈ W (b) If u, v ∈ W then αu + βv ∈ W for all α, β ∈ F.
4 Definition. Let V be a vector space over F.
5 Let x1 , . . . , xn be vectors in V and let c1 , . . . , cn ∈ F.
6 The vector c1 x1 + c2 x2 + · · · + cn xn ∈ V is called a linear combination of the
xi 's and the ci are called the coefficients of the xi in this linear combination.
7 Definition. Let S be a subset of a vector space V over F.
The linear span of S is the set of all vectors in V expressible as linear
combinations of finite subsets of S, i.e.,

L(S) = { c1 x1 + c2 x2 + · · · + cn xn | n ≥ 1, x1 , x2 , . . . , xn ∈ S and c1 , c2 , . . . , cn ∈ F }.

9 We say that S spans L(S), or that L(S) is the span of S.
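Deciding whether a given vector lies in the span L(S) of a finite S ⊂ Rn amounts to solving a linear system for the coefficients ci . A small NumPy sketch (the set S, the test vectors, and the helper name `in_span` are ours, for illustration):

```python
import numpy as np

# v ∈ L(S) iff the least-squares problem min ||Mc - v|| has residual 0,
# where the columns of M are the vectors of S.
S = [np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])]
M = np.column_stack(S)

def in_span(v, M, tol=1e-10):
    c, *_ = np.linalg.lstsq(M, v, rcond=None)   # best coefficients c
    return np.linalg.norm(M @ c - v) < tol      # exact combination?

assert in_span(np.array([2.0, 3.0, 5.0]), M)      # = 2*s1 + 3*s2
assert not in_span(np.array([0.0, 0.0, 1.0]), M)  # forces c1 = c2 = 0
```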


6 / 20
Subspace of a Vector Space: Linear Span

1 Proposition. Let S be a subset of a vector space V . Then L(S) is the


smallest subspace of V containing S.
2 Proof. First, L(S) is a subspace: 0 ∈ L(S), and a linear combination of two
linear combinations of elements of S is again a linear combination of elements of S.
3 Moreover, if S ⊂ W ⊂ V and W is a subspace of V , then every linear
combination of elements of S lies in W , so L(S) ⊂ W .
4 Let A be an m × n matrix over F. The row space of A, denoted R(A), is the
subspace of Fn spanned by the row vectors of A.
5 The column space of A, denoted C(A), is the subspace of Fm spanned
by the column vectors of A.
6 The null space of A denoted N (A), is defined by

N (A) = {x ∈ Fn : Ax = 0}.

7 The null space of A is the set of all solutions of the homogeneous linear
equations Ax = 0 and so N (A) is a subspace of Fn .
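The closure property that makes N (A) a subspace can be illustrated numerically. A sketch with an arbitrarily chosen rank-1 matrix (not from the slides):

```python
import numpy as np

# Solutions of Ax = 0 are closed under linear combinations:
# this is exactly why N(A) is a subspace of F^n.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank 1, so N(A) has dimension 2
x = np.array([-2.0, 1.0, 0.0])    # Ax = 0 by direct computation
y = np.array([-3.0, 0.0, 1.0])    # Ay = 0 likewise
assert np.allclose(A @ x, 0) and np.allclose(A @ y, 0)

for a, b in [(1.0, 1.0), (2.5, -4.0), (0.0, 7.0)]:
    assert np.allclose(A @ (a * x + b * y), 0)   # αx + βy ∈ N(A)
```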

7 / 20
Linear Span
1 Different sets may span the same subspace. For example,
L({e1 , e2 }) = L({e1 , e1 + e2 }) = R2 .
2 The vector space Pn (R) is spanned by {1, t, t 2 , . . . , t n } and also by
{1, (1 + t), . . . , (1 + t)n }.
3 We have introduced the notion of linear span of a subset S of a vector space.
This raises some natural questions:
4 Which vector spaces can be spanned by a finite number of elements?
5 If V is a vector space, S ⊂ V and V = L(S), then what is the minimum
number of elements S can have?
6 To answer these questions we use the notions of linear dependence and
independence, basis and dimension of a vector space.
7 Definition. Let V be a vector space. A subset S ⊂ V is called linearly
dependent if there exist distinct v1 , v2 , . . . , vn ∈ S and scalars α1 , α2 , . . . , αn
not all zero such that
α1 v1 + α2 v2 + . . . + αn vn = 0.
8 / 20
Linearly Dependent and Independent subsets
1 Definition. A set S is called linearly independent (L.I.) if it is not linearly
dependent, i.e., for all n ≥ 1 and for all distinct v1 , v2 , . . . , vn ∈ S and scalars
α1 , α2 , . . . , αn
α1 v1 + α2 v2 + · · · + αn vn = 0 =⇒ αi = 0, for all i.
2 Convention. The empty set is linearly independent.
3 Proposition. (a) Any subset of V containing a linearly dependent set is
linearly dependent.
(b) Any subset of a linearly independent set in V is linearly independent.
(c) Let |S| ≥ 2. Then S is linearly dependent ⇐⇒ either 0 ∈ S or a vector
in S is a linear combination of other vectors in S.
(d) If S = {v } then S is linearly independent ⇐⇒ v ≠ 0.
4 Example. Consider the vector space Rn and let S = {e1 , e2 , . . . , en }. Then S
is linearly independent. Indeed, if for some scalars α1 , α2 , . . . , αn
α1 e1 + α2 e2 + . . . + αn en = 0
then (α1 , α2 , . . . , αn ) = 0. So each αj = 0 and hence S is a linearly
independent set.
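For finitely many vectors in Rn , linear independence can be tested by comparing the rank of the matrix they form against their count. A NumPy sketch (the helper name is ours):

```python
import numpy as np

# k vectors in R^n are linearly independent iff the k × n matrix
# with those vectors as rows has rank k.
def is_linearly_independent(vectors):
    M = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(M) == len(vectors)

e = np.eye(3)   # rows are the standard basis e1, e2, e3 of R^3
assert is_linearly_independent([e[0], e[1], e[2]])
assert not is_linearly_independent([e[0], e[1], e[0] + e[1]])
```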
9 / 20
L.D. and L.I. subsets : Remarks and Examples

1 Example. Let S denote the subset of R4 consisting of the row vectors


       
2 [1 0 0 0], [1 1 0 0], [1 1 1 0] and [1 1 1 1].
3 Then S is linearly independent. To see this, let α1 , α2 , α3 , α4 ∈ R and

α1 [1 0 0 0] + α2 [1 1 0 0] + α3 [1 1 1 0] + α4 [1 1 1 1] = 0.

4 Then α1 + α2 + α3 + α4 = 0, α2 + α3 + α4 = 0, α3 + α4 = 0 and α4 = 0,
that is, α4 = α3 = α2 = α1 = 0.
5 Example. Let V be the vector space of all continuous functions from R to
R. Let S = {1, cos² t, sin² t}.
6 Then the relation cos² t + sin² t − 1 = 0 shows that S is linearly dependent.
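The dependence relation above can be verified numerically at sample points; a short sketch:

```python
import numpy as np

# The coefficients (-1, 1, 1) witness the linear dependence of
# {1, cos²t, sin²t}: the combination vanishes for every t.
t = np.linspace(-5.0, 5.0, 200)
combo = -1.0 * np.ones_like(t) + 1.0 * np.cos(t) ** 2 + 1.0 * np.sin(t) ** 2
assert np.allclose(combo, 0.0)
```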

10 / 20
Examples of Linearly dependent and linearly independent
subsets
1 Example. Let α1 < α2 < . . . < αn be real numbers. Let
V = {f : R −→ R | f is continuous}.
2 Consider the set S = {e α1 x , e α2 x , . . . , e αn x }.
3 We show that S is linearly independent using induction on n.
4 Let n = 1 and βe α1 x = 0. Since e α1 x ≠ 0 for any x, we get β = 0.
5 Now assume that the assertion is true for n − 1. We prove it for n. Let
β1 , . . . , βn ∈ R so that

β1 e α1 x + . . . + βn e αn x = 0.

6 Multiplying by e −αn x gives β1 e (α1 −αn )x + · · · + βn−1 e (αn−1 −αn )x + βn = 0.
7 Since αi − αn < 0 for i < n, letting x −→ ∞ makes every exponential term
vanish, so βn = 0.
8 Now apply the induction hypothesis to get β1 = . . . = βn−1 = 0.
11 / 20
Examples of Linearly dependent and linearly independent
subsets
1 Example. Let P denote the vector space of all polynomials p(t) with real
coefficients. Then the set S = {1, t, t 2 , . . .} is linearly independent. Suppose
that 0 ≤ n1 < n2 < . . . < nr and

α1 t n1 + α2 t n2 + . . . + αr t nr = 0

2 for certain real numbers α1 , α2 , . . . , αr . Differentiate n1 times and set t = 0:
every term with exponent larger than n1 still carries a factor of t, so we get
α1 n1 ! = 0, hence α1 = 0.
Continuing this way we see that all of α1 , α2 , . . . , αr are zero.
3 Bases and dimension of a vector space. A vector space may be realized as
linear span of several sets of different sizes.
4 We shall now study properties of the smallest sets whose linear span is a
given vector space.
5 Definition. A subset S of a vector space V is called a basis of V if elements
of S are linearly independent and V = L(S).
6 A vector space V possessing a finite basis is called finite dimensional.
7 Otherwise V is called infinite dimensional.
12 / 20
Bases and Dimension
1 Proposition. Let {v1 , . . . , vn } be a basis of a finite dimensional vector space
V . Then every v ∈ V can be written as
v = a1 v1 + · · · + an vn for unique scalars a1 , . . . , an .
2 Proof. Suppose also v = b1 v1 + b2 v2 + · · · + bn vn for scalars b1 , b2 , . . . , bn ∈ F.
Then 0 = v − v = (a1 − b1 )v1 + (a2 − b2 )v2 + · · · + (an − bn )vn .
By the linear independence of v1 , v2 , . . . , vn , aj − bj = 0 for all j.
3 Hence a1 , a2 , . . . , an are uniquely determined.
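The unique coordinates of v with respect to a basis of Rn can be computed by solving Ba = v, with the basis vectors as the columns of B. A sketch with the basis {e1 , e1 + e2 } of R2 (the numbers are our own example):

```python
import numpy as np

# Columns of B are the basis vectors e1 and e1 + e2 of R^2;
# B is invertible, so the coordinate vector a is unique.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])
a = np.linalg.solve(B, v)           # unique coordinates of v
assert np.allclose(B @ a, v)
assert np.allclose(a, [1.0, 2.0])   # v = 1*(e1) + 2*(e1 + e2)
```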
4 Theorem. All bases of a finite dimensional vector space have the same
number of elements.
5 For this we prove the following result.
6 Lemma. Let S = {v1 , v2 , . . . , vk } be a subset of a vector space V . Then any
k + 1 elements in L(S) are linearly dependent.
7 Proof. Let T = {u1 , . . . , uk+1 } ⊆ L(S). Write

ui = ∑_{j=1}^{k} aij vj , i = 1, . . . , k + 1.
8 Consider the (k + 1) × k matrix A = (aij ).
13 / 20
Bases and Dimension
1 Since A has more rows than columns there exists a nonzero row vector
c = [c1 , . . . , ck+1 ] such that cA = 0, i.e., for j = 1, . . . , k,

∑_{i=1}^{k+1} ci aij = 0.

2 Therefore

∑_{i=1}^{k+1} ci ui = ∑_{i=1}^{k+1} ci ( ∑_{j=1}^{k} aij vj ) = ∑_{j=1}^{k} ( ∑_{i=1}^{k+1} ci aij ) vj = 0,

3 This shows that u1 , u2 , . . . , uk+1 are linearly dependent.


4 Theorem. Any two bases of a finite dimensional vector space have the same
number of elements.
5 Proof. Let S and T be two bases and suppose |S| < |T |. Since T ⊂ L(S) = V ,
the Lemma shows that any |S| + 1 elements of T are linearly dependent, so T
is linearly dependent. This contradicts T being a basis.
6 Definition. The number of elements in a basis of a finite-dimensional vector
space V is called the dimension of V . It is denoted by dim V .
14 / 20
Bases and Dimension: Examples

1 Examples: The set {e1 , e2 , . . . , en } in Rn is a basis.


2 The columns of A ∈ Fn×n form a basis of Fn ⇐⇒ A is invertible.
3 Pn (R) = {a0 + a1 t + . . . + an t n | a0 , a1 , . . . , an ∈ R} is spanned by
S = {1, t, t 2 , . . . , t n }. Since S is LI, dim Pn (R) = n + 1.
4 Let Eij denote the m × n matrix with 1 in the (i, j)th position and 0 elsewhere.
If A = (aij ) ∈ Fm×n then A = ∑_{i=1}^{m} ∑_{j=1}^{n} aij Eij .
5 It is easy to see that the mn matrices Eij are linearly independent. Hence
Fm×n is an mn-dimensional vector space.
6 What is the dimension of Mn×n (C) as a real vector space?
7 Proposition. Let S be a linearly independent subset of a finite dimensional
vector space V . Then S can be enlarged to a basis of V .
8 Proof. Suppose that dim V = n and S has fewer than n elements.
9 Let v ∈ V \ L(S). Then S ∪ {v } is a linearly independent subset of V .
10 Continuing this way we can enlarge S to a basis of V .
15 / 20
Gauss elimination, row space, and column space
1 Proposition. Let V = L(u1 , u2 , . . . , un ) and dim V = d > 0. Then we can
choose linearly independent vectors uj1 , uj2 , . . . , ujd so that
V = L(uj1 , uj2 , . . . , ujd ).
2 Proof. Since d ≥ 1, some ui is nonzero; let uj1 be such a vector and set
W = L(uj1 ).
3 If d = 1, then we are done. If d ≥ 2, pick uj2 ∉ L(uj1 ).
4 Then uj1 , uj2 are linearly independent. If d = 2, then V = L(uj1 , uj2 ).
5 Continue this way to get d vectors in {u1 , u2 , . . . , un } whose linear span is V .
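The selection procedure in this proof can be sketched as a greedy scan that keeps a vector exactly when it increases the rank of the kept set (the function name and example vectors are ours):

```python
import numpy as np

# Keep u_i exactly when it is not in the span of the vectors kept so far,
# detected here by a rank increase. The kept vectors span L(u1, ..., un).
def spanning_li_subset(vectors):
    kept = []
    for u in vectors:
        candidate = kept + [u]
        if np.linalg.matrix_rank(np.array(candidate)) == len(candidate):
            kept.append(u)
    return kept

u = [np.array([1.0, 0.0, 0.0]),
     np.array([2.0, 0.0, 0.0]),     # dependent on the first
     np.array([0.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 0.0])]     # dependent on the first and third
basis = spanning_li_subset(u)
assert len(basis) == 2              # here dim L(u1, ..., u4) = 2
```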
6 Proposition. Let A ∈ Fm×n and E ∈ Fm×m be invertible. Then
(1) R(A) = R(EA). Hence dim R(A) = dim R(EA).
(2) Let 1 ≤ i1 < i2 < · · · < ik ≤ n. The columns {i1 , . . . , ik } of A are linearly
independent ⇐⇒ the columns {i1 , . . . , ik } of EA are linearly independent.
(3) dim C(A) = dim C(EA).
7 Proof. (1) Note that R(EA) ⊆ R(A) since every row of EA is a linear
combination of the rows of A. Similarly,

R(A) = R(E −1 (EA)) ⊆ R(EA).


16 / 20
Row rank and column rank are equal for any matrix
1 To prove (2), observe that

α1 (EA)i1 + α2 (EA)i2 + · · · + αk (EA)ik = 0


⇐⇒ E (α1 Ai1 + α2 Ai2 + · · · + αk Aik ) = 0
−1
⇐⇒ E (E (α1 Ai1 + α2 Ai2 + · · · + αk Aik )) = 0
⇐⇒ α1 Ai1 + α2 Ai2 + · · · + αk Aik = 0

2 This proves (2). By (2), a set of columns of A is linearly independent exactly
when the corresponding columns of EA are, so (3) follows:
dim C(A) = dim C(EA).


3 Theorem. Let A be an m × n matrix. Then dim R(A) = dim C(A).
4 Proof. Apply row operations to reduce A to its row canonical form (RCF) U.
5 Therefore A = EU, where E is a product of elementary matrices.
6 Let the first k rows of U be nonzero. Then U has k pivotal columns.
7 Then the first k rows of U are a basis of R(A).
8 Suppose that j1 , . . . , jk are the pivotal columns of U.
9 Then columns j1 , . . . , jk of A form a basis of C(A).
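The equality dim R(A) = dim C(A) can be sanity-checked numerically, since the row rank of A is the column rank of its transpose (the random matrix and forced dependency are our own example):

```python
import numpy as np

# Row rank of A = column rank of A^T, so the theorem predicts
# equal ranks for A and A^T.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 6))
A[3] = A[0] + 2 * A[1]     # force a row dependency, so rank <= 3
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)
```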
17 / 20
Row and column spaces of a matrix

1 Example: Let A be a 4 × 6 matrix whose RCF is

        [ 1 2 3 0 5 0 ]
        [ 0 0 0 1 7 0 ]
    U = [ 0 0 0 0 0 1 ]
        [ 0 0 0 0 0 0 ]
2 {A1 , A4 , A6 } is a basis of C(A) and the nonzero rows of U form a basis of R(A).
3 Definition. The rank of a matrix A, denoted by rank (A), is
dim R(A) = dim C(A). The nullity of A is dim N (A).
4 The Rank-Nullity Theorem: Let A ∈ Fm×n . Then
rank A + nullity A = n.
5 Proof. Let V = Fn . Let B = {v1 , v2 , . . . , vk } be a basis of N (A).
6 Extend B to a basis C = {v1 , v2 , . . . , vk , w1 , w2 , . . . , wn−k } of V .
7 We show that D = {A(w1 ), A(w2 ), . . . , A(wn−k )} is a basis of C(A).

18 / 20
The Rank and Nullity of a Matrix
1 Any v ∈ V can be expressed uniquely as

v = α1 v1 + α2 v2 + · · · + αk vk + β1 w1 + · · · + βn−k wn−k .
=⇒ Av = α1 A(v1 ) + · · · + αk A(vk ) + β1 A(w1 ) + · · · + βn−k A(wn−k )
= β1 A(w1 ) + · · · + βn−k A(wn−k ), since A(vi ) = 0 for each vi ∈ N (A).
2 Hence D spans C(A). It remains to show that D is linearly independent.
3 Suppose β1 A(w1 ) + · · · + βn−k A(wn−k ) = 0.
4 Then A(β1 w1 + · · · + βn−k wn−k ) = 0 =⇒ β1 w1 + · · · + βn−k wn−k ∈ N (A).
5 Therefore there are scalars α1 , α2 , . . . , αk such that

α1 v1 + α2 v2 + · · · + αk vk = β1 w1 + β2 w2 + · · · + βn−k wn−k .
6 Rearranging, α1 v1 + · · · + αk vk − β1 w1 − · · · − βn−k wn−k = 0, so by the linear
independence of {v1 , v2 , . . . , vk , w1 , w2 , . . . , wn−k } we conclude that
β1 = β2 = · · · = βn−k = 0.
7 Therefore D is a basis of C(A). Hence

rank A + nullity A = n.
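The Rank-Nullity Theorem can be checked on a concrete matrix, with a null-space basis read off from the SVD (a standard numerical approach, not the method of the proof; the matrix is our own example):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],
              [1.0, 0.0, 1.0, 0.0]])
m, n = A.shape
rank = np.linalg.matrix_rank(A)     # here rank = 2 (row 2 = 2 * row 1)

# Null-space basis from the SVD: the rows of Vt beyond the rank span N(A).
U, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]
assert all(np.allclose(A @ v, 0) for v in null_basis)
assert rank + len(null_basis) == n  # rank A + nullity A = n
```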
19 / 20
Rank in terms of determinants
1 Definition. An r × r submatrix of A is called a minor of order r of A.
2 Theorem. A matrix A has rank r ≥ 1 ⇐⇒ det M ≠ 0 for some order r
minor M of A and det N = 0 for all order r + 1 minors N of A.
3 Proof. Let rank A = r ≥ 1. Then some r columns of A are L. I.
4 Let B be the m × r matrix consisting of these r columns of A.
5 Then rank (B) = r, and since row rank equals column rank, some r rows of B
are linearly independent. Let C be the r × r matrix having these r rows of B.
6 Then C is invertible (its rows are linearly independent), hence det(C ) ≠ 0.
7 Let N be an (r + 1) × (r + 1) minor of A.
8 Without loss of generality we may take N to consist of the first r + 1 rows
and columns of A, since interchanging rows or columns does not change the
rank of a matrix.
9 Suppose det(N) ≠ 0. Then the r + 1 rows of N, and hence the first r + 1
rows of A, are linearly independent, a contradiction.
10 The converse is left as an exercise.
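The determinant criterion suggests a brute-force rank computation: search for the largest order with a nonzero minor. Exponentially slow, for illustration only (the function name and tolerance are ours):

```python
import numpy as np
from itertools import combinations

# rank(A) = largest r such that some r × r submatrix (a minor of
# order r) has nonzero determinant.
def rank_via_minors(A, tol=1e-10):
    m, n = A.shape
    for r in range(min(m, n), 0, -1):
        for rows in combinations(range(m), r):
            for cols in combinations(range(n), r):
                if abs(np.linalg.det(A[np.ix_(rows, cols)])) > tol:
                    return r
    return 0

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],    # = 2 * row 1, so rank < 3
              [0.0, 1.0, 1.0]])
assert rank_via_minors(A) == np.linalg.matrix_rank(A) == 2
```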
20 / 20