Linear Mapping g4

The document discusses linear mappings and transformations, defining them as mathematical operations that convert input values into output values using linear functions. It explains the properties of linear transformations, including zero and identity transformations, kernel and range spaces, and provides examples of linear transformations such as differentiation. Additionally, it emphasizes the importance of linear mappings in machine learning and their applications in various contexts.

Linear Mapping
Objectives:
a. Define mappings and functions in the context of linear transformations
b. Explain the properties of mappings and functions in the context of linear transformations
Linear Mapping
Linear mapping is a mathematical operation that transforms a set of input values into a set of output values using a linear function. In machine learning, linear mapping is often used as a preprocessing step to transform the input data into a more suitable format for analysis. Linear mapping can also be used as a model in itself, such as in linear regression or linear classifiers.
LINEAR MAPPING
The linear mapping function can be represented as follows:
y = Wx + b
where x is the input vector, W is the weight matrix, b is the bias vector, and y is the output vector. The weight matrix and bias vector are learned during the training process.
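As a sketch, the mapping y = Wx + b can be computed with NumPy; the matrix and vector values below are arbitrary placeholders, not from the original slides:

```python
import numpy as np

# Arbitrary example values: W maps a 3-dimensional input to a 2-dimensional output.
W = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])   # weight matrix (2 x 3)
b = np.array([0.5, -1.0])         # bias vector (length 2)
x = np.array([1.0, 2.0, 3.0])     # input vector (length 3)

y = W @ x + b                     # the mapping y = Wx + b
print(y)                          # [7.5 8. ]
```

Strictly speaking, with b ≠ 0 this is an affine map; it satisfies the linear-map conditions below only when b = 0.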
LINEAR MAPPING
Let V and W be vector spaces over a field K. A function f: V → W is called a linear map if, for any vectors u, v ∈ V and any scalar c ∈ K, the following conditions hold:
• Additivity: f(u + v) = f(u) + f(v)
• Homogeneity (compatibility with scalar multiplication): f(cu) = c·f(u)
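Both conditions can be checked numerically for a map defined by a matrix; the matrix and vectors below are arbitrary examples:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])        # f(v) = A v is a linear map R^2 -> R^2

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])
c = 4.0

f = lambda x: A @ x

# Additivity: f(u + v) == f(u) + f(v)
assert np.allclose(f(u + v), f(u) + f(v))
# Homogeneity: f(c u) == c * f(u)
assert np.allclose(f(c * u), c * f(u))
```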
ZERO/IDENTITY TRANSFORMATION
A linear transformation T: V → V from a vector space into itself is called a linear operator.
• Zero-Transformation: A transformation T: V → W is called the zero transformation if T(v) = 0 for all v ∈ V.
• Identity-Transformation: A transformation T: V → V is called the identity transformation if T(v) = v for all v ∈ V.
PROPERTIES OF LINEAR TRANSFORMATION
Let T: V → W be a linear transformation, where u, v ∈ V. Then the following properties are true:
• T(0) = 0
• T(−v) = −T(v)
• T(u − v) = T(u) − T(v)
LINEAR TRANSFORMATION OF MATRIX
Let A be an m×n matrix. The transformation T: Rⁿ → Rᵐ defined by T(v) = Av is a linear transformation.
ZERO AND IDENTITY MATRIX OPERATION
• The m×n zero matrix corresponds to the zero transformation from Rⁿ → Rᵐ.
• The n×n identity matrix Iₙ corresponds to the identity transformation from Rⁿ → Rⁿ.
• An m×n matrix has m rows and n columns. The individual numbers are called elements of the matrix. Elements are denoted by two subscripts, the first denoting the row and the second the column.
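The two special cases can be verified directly; the test vector below is an arbitrary example:

```python
import numpy as np

v = np.array([1.0, -2.0, 3.0])

Z = np.zeros((3, 3))   # zero matrix: sends every vector to the zero vector
I = np.eye(3)          # identity matrix I_n: sends every vector to itself

assert np.allclose(Z @ v, np.zeros(3))   # zero transformation
assert np.allclose(I @ v, v)             # identity transformation
```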
Example: Let's consider the linear transformation from R² → R³ such that:
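The slide's worked matrix appears only as an image in the original; as an illustrative stand-in, any 3×2 matrix A defines such a transformation T(v) = Av from R² into R³. The matrix below is a hypothetical example:

```python
import numpy as np

# Hypothetical 3x2 matrix: T(v) = A v maps R^2 into R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

v = np.array([2.0, 5.0])
print(A @ v)   # [2. 5. 7.] -- a vector in R^3
```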
KERNEL/RANGE SPACE
Kernel Space: Let T: V → W be a linear transformation. The set of all v ∈ V such that T(v) = 0 is the kernel space of T. It is also known as the null space of T.
• The kernel space of the zero transformation T: V → W is V.
• The kernel space of the identity transformation T: V → V is {0}.
The dimension of the kernel space is known as the nullity, null(T).
KERNEL/RANGE SPACE
Range Space: Let T: V → W be a linear transformation. The set of all vectors T(v) for v ∈ V is the range space of T.
• The range space is always non-empty for a linear transformation given by a matrix, because T(0) = 0, so it contains at least the zero vector.
The dimension of the range space is known as the rank, rank(T).
• The sum of rank and nullity is the dimension of the domain: null(T) + rank(T) = dim(V) = n.
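The rank–nullity identity can be checked numerically: the rank is the dimension of the range space, and the nullity follows as n minus the rank. The matrix below is an arbitrary example with one dependent row:

```python
import numpy as np

# A 3x4 matrix: T maps R^4 -> R^3, so dim(V) = n = 4.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 2.0, 1.0, 2.0]])   # third row = first row + second row

rank = np.linalg.matrix_rank(A)        # dimension of the range space
nullity = A.shape[1] - rank            # dimension of the kernel

assert rank + nullity == A.shape[1]    # null(T) + rank(T) = dim(V) = n
print(rank, nullity)                   # 2 2
```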
LINEAR TRANSFORMATION AS ROTATION
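The rotation examples on this slide survive only as images. As a stand-in, the standard 2-D rotation by an angle θ is the linear map given by the matrix [[cos θ, −sin θ], [sin θ, cos θ]]:

```python
import numpy as np

def rotation(theta):
    """Standard 2-D rotation matrix: rotates vectors by theta radians counterclockwise."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

R = rotation(np.pi / 2)      # 90-degree rotation
v = np.array([1.0, 0.0])
print(R @ v)                 # approximately (0, 1)
```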
LINEAR TRANSFORMATION AS PROJECTION

If a vector is given by v = (x, y, z). Then, T⋅𝑣 (𝑥, 𝑦, 0).


That is the orthogonal projection of the original vector.
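This projection corresponds to the diagonal matrix with entries (1, 1, 0); the test vector below is an arbitrary example:

```python
import numpy as np

P = np.diag([1.0, 1.0, 0.0])     # orthogonal projection onto the xy-plane
v = np.array([3.0, -4.0, 7.0])

print(P @ v)                     # [ 3. -4.  0.] -- the z-component is dropped
assert np.allclose(P @ (P @ v), P @ v)   # projections are idempotent: P(Pv) = Pv
```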
DIFFERENTIATION AS LINEAR TRANSFORMATION
Let T: P(F) → P(F) be the differentiation transformation such that T(p(z)) = p′(z).
Then for two polynomials p(z), q(z) ∈ P(F), we have:
T(p(z) + q(z)) = (p(z) + q(z))′ = p′(z) + q′(z) = T(p(z)) + T(q(z))
Similarly, for a scalar a ∈ F we have:
T(a·p(z)) = (a·p(z))′ = a·p′(z) = a·T(p(z))
The two equations above show that differentiation is a linear transformation.
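Both identities can be checked on polynomial coefficient vectors with `numpy.polynomial`; the polynomials below are arbitrary examples:

```python
import numpy as np
from numpy.polynomial import polynomial as Poly

p = np.array([1.0, 2.0, 3.0])         # p(z) = 1 + 2z + 3z^2
q = np.array([0.0, -1.0, 0.0, 4.0])   # q(z) = -z + 4z^3
a = 5.0

# polyder differentiates a coefficient vector; polyadd adds two polynomials.
# Additivity: (p + q)' == p' + q'
lhs = Poly.polyder(Poly.polyadd(p, q))
rhs = Poly.polyadd(Poly.polyder(p), Poly.polyder(q))
assert np.allclose(lhs, rhs)

# Homogeneity: (a p)' == a p'
assert np.allclose(Poly.polyder(a * p), a * Poly.polyder(p))
```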
THANK YOU FOR LISTENING!!!