INNER PRODUCT SPACE (1)
December 4, 2024
1 Introduction
2 Orthogonality
3 Gram-Schmidt orthogonalization
4 Orthogonal Complement
5 Orthogonal matrices
Inner product
Let F be the field of real numbers or the field of complex numbers, and V
be a vector space over F . An inner product on V is a function that assigns
to each ordered pair of vectors α, β ∈ V a scalar (α|β) ∈ F in such a way
that the following properties hold for all α, β, γ ∈ V and all scalars c:
(α + β|γ) = (α|γ) + (β|γ) (Linearity in the first argument)
(cα|β) = c(α|β) (Homogeneity in the first argument)
(β|α) is the complex conjugate of (α|β) (Conjugate symmetry; over R this reduces to plain symmetry)
(α|α) > 0 if α ̸= 0 (Positive definiteness)
Example. The standard dot product ⟨u, v⟩ = u1 v1 + u2 v2 on R² is an inner product:
1. Linearity: ⟨u + v, w⟩ = (u1 + v1)w1 + (u2 + v2)w2 = u1 w1 + v1 w1 + u2 w2 + v2 w2 = ⟨u, w⟩ + ⟨v, w⟩.
Similarly, ⟨cu, v⟩ = (cu1)v1 + (cu2)v2 = c⟨u, v⟩.
2. Symmetry: ⟨u, v⟩ = u1 v1 + u2 v2 = v1 u1 + v2 u2 = ⟨v, u⟩.
3. Positive Definiteness: ⟨u, u⟩ = u1² + u2² > 0 whenever u ≠ 0.
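The axiom checks above can be spot-checked numerically. The following sketch (the helper `dot` and the sample vectors are illustrative choices, not part of the notes) verifies each property for the standard dot product on R²:

```python
# A small numerical check (not a proof) that the standard dot product
# on R^2 satisfies the inner product axioms, for sample vectors.
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

u, v, w = (1.0, 3.0), (-2.0, 5.0), (4.0, -1.0)
c = 2.5

# Linearity in the first argument: <u+v, w> = <u, w> + <v, w>
lhs = dot((u[0] + v[0], u[1] + v[1]), w)
assert abs(lhs - (dot(u, w) + dot(v, w))) < 1e-12

# Homogeneity: <c u, v> = c <u, v>
assert abs(dot((c * u[0], c * u[1]), v) - c * dot(u, v)) < 1e-12

# Symmetry (real field, so conjugation is trivial): <u, v> = <v, u>
assert dot(u, v) == dot(v, u)

# Positive definiteness: <u, u> > 0 for u != 0
assert dot(u, u) > 0
```

A check on sample vectors cannot establish the axioms for all inputs; it only illustrates them.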
Orthogonal Vectors
Let α and β be vectors in an inner product space V . Then α is orthogonal
to β if (α|β) = 0
Orthogonal Set
If S is a set of vectors in V , S is called an Orthogonal Set provided all
pairs of distinct vectors in S are orthogonal.
Orthonormal Set
An Orthonormal Set is an orthogonal set S with an additional property
that ||α|| = 1 for every α in S.
Example. The vectors u = (1, −2, 1) and v = (2, 1, 2) are orthogonal, since
u · v = 1 · 2 + (−2) · 1 + 1 · 2 = 2 − 2 + 2 = 0.
Example. The standard basis vectors v1 = (1, 0, 0), v2 = (0, 1, 0), v3 = (0, 0, 1) form an orthogonal set. To verify:
v1 · v2 = 1 · 0 + 0 · 1 + 0 · 0 = 0
v1 · v3 = 1 · 0 + 0 · 0 + 0 · 1 = 0
v2 · v3 = 0 · 0 + 1 · 0 + 0 · 1 = 0
Thus, {v1 , v2 , v3 } is an orthogonal set.
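The pairwise verification above generalizes to any finite set of vectors. A minimal sketch (`is_orthogonal_set` is an illustrative helper, not from the notes):

```python
# Checks that every pair of distinct vectors in a set is orthogonal,
# mirroring the verification for {v1, v2, v3} above.
from itertools import combinations

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def is_orthogonal_set(vectors, tol=1e-12):
    # An orthogonal set: all pairs of distinct vectors have dot product 0.
    return all(abs(dot(u, v)) <= tol for u, v in combinations(vectors, 2))

standard_basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(is_orthogonal_set(standard_basis))  # True
```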
Theorem 1
An orthogonal set of non-zero vectors is linearly independent.
Proof. Suppose c1 v1 + c2 v2 + · · · + cn vn = 0. Taking the inner product of both sides with vi gives
(c1 v1 + c2 v2 + · · · + cn vn) · vi = 0 · vi.
By orthogonality, every term cj (vj · vi) with j ≠ i vanishes, leaving
ci ∥vi∥² = 0.
Since vi ≠ 0, we have ∥vi∥² > 0, and hence
ci = 0.
As i was arbitrary, all coefficients vanish, so the set is linearly independent.
Theorem 2
Let V be an inner product space and v1 , . . . , vn be any linearly
independent vectors in V. Then one may construct orthogonal vectors
u1 , . . . , un in V such that the set {u1 , . . . , un } is a basis for the subspace
spanned by v1 , . . . , vn .
Proof sketch (Gram-Schmidt orthogonalization). Set u1 = v1, and for k = 2, . . . , n define
uk = vk − Σⱼ₌₁ᵏ⁻¹ ((vk|uj)/∥uj∥²) uj,
that is, subtract from vk its projections onto the previously constructed vectors.
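The Gram-Schmidt construction of Theorem 2 can be sketched numerically; `gram_schmidt` and the example vectors below are illustrative choices:

```python
# A sketch of Gram-Schmidt: u1 = v1, and each subsequent u_k is v_k
# minus its projections onto the already-constructed u_1, ..., u_{k-1}.
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def gram_schmidt(vectors):
    """Return orthogonal vectors spanning the same subspace."""
    ortho = []
    for v in vectors:
        u = list(v)
        for w in ortho:
            coeff = dot(v, w) / dot(w, w)  # projection coefficient (v|w)/||w||^2
            u = [ui - coeff * wi for ui, wi in zip(u, w)]
        ortho.append(u)
    return ortho

u1, u2 = gram_schmidt([(1, 1, 0), (1, 0, 1)])
print(dot(u1, u2))  # 0.0
```

The inputs must be linearly independent, as in the theorem; otherwise some u_k is the zero vector and the division by ∥uj∥² can still proceed but the result is no longer a basis.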
Corollary
Every finite-dimensional inner product space has an orthonormal basis.
Orthogonal Complement
Let S be a nonempty subset of an inner product space V . We denote by
S ⊥ , the set of all vectors of V that are orthogonal to every vector of S,
called the orthogonal complement of S in V . In notation,
S⊥ = {v ∈ V | ⟨v, u⟩ = 0 for every u ∈ S}.
Proposition
Let S be a nonempty subset of an inner product space V . Then the
orthogonal complement S ⊥ is a subspace of V .
Proof. 1. Non-emptiness: ⟨0, v⟩ = 0 for every v ∈ S, so 0 ∈ S⊥.
2. Closure under Addition: Let x, y ∈ S⊥. For any v ∈ S,
⟨x + y, v⟩ = ⟨x, v⟩ + ⟨y, v⟩ = 0 + 0 = 0.
Therefore, x + y ∈ S⊥.
3. Closure under Scalar Multiplication: Let x ∈ S ⊥ and a ∈ R (or C,
depending on the field of V ). We need to show that ax ∈ S ⊥ . For any
v ∈ S:
⟨ax, v⟩ = a⟨x, v⟩.
Since ⟨x, v⟩ = 0, it follows that:
⟨ax, v⟩ = a · 0 = 0.
Thus, ax ∈ S ⊥ .
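The two closure properties can be illustrated numerically for S = {u}; the vectors below are sample choices, not from the notes:

```python
# For S = {u}, vectors x and y orthogonal to u stay orthogonal to u
# under addition and scalar multiplication, as the proposition asserts.
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

u = (1, 1, 1)
x, y = (1, -1, 0), (0, 1, -1)   # both orthogonal to u
assert dot(x, u) == 0 and dot(y, u) == 0

# Closure under addition: x + y is still orthogonal to u.
assert dot([xi + yi for xi, yi in zip(x, y)], u) == 0

# Closure under scalar multiplication: a*x is still orthogonal to u.
a = 3.0
assert dot([a * xi for xi in x], u) == 0
```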
Definition
A square matrix A is said to be an orthogonal matrix if it has orthonormal
columns.
Note: For square orthogonal matrices, the transpose is the inverse, i.e.
A−1 = AT
=⇒ AAT = AT A = I
Writing q1, . . . , qn for the columns of A: when row i of AT multiplies column
j of A with i ≠ j, the result is qiT qj = 0. On the diagonal, where i = j, we
have qiT qi = 1; that is the normalization of the columns to unit vectors of
length 1.
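The identity AAT = AT A = I can be checked on a concrete orthogonal matrix; the 2×2 rotation matrix below is an illustrative example:

```python
# Verifies Q^T Q = I for a rotation matrix, whose columns are orthonormal.
import math

theta = math.pi / 6
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

# Entry (i, j) of Q^T Q is the dot product of column i with column j:
# 0 off the diagonal (orthogonality), 1 on the diagonal (unit length).
n = len(Q)
QtQ = [[sum(Q[k][i] * Q[k][j] for k in range(n)) for j in range(n)]
       for i in range(n)]

assert all(abs(QtQ[i][j] - (1.0 if i == j else 0.0)) < 1e-12
           for i in range(n) for j in range(n))
print("Q^T Q = I verified")
```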