Linear Algebra Question Bank 2025-26

The document is a question bank for the Applied Linear Algebra course (MAT3002) for the Fall Semester 2025-26. It covers various topics including systems of linear equations, column space, linear independence, linear transformations, orthogonal projections, QR decomposition, eigenvalues and eigenvectors, and null space conditions. Each section contains multiple problems that require solving or verifying mathematical concepts related to linear algebra.


Question Bank

Applied Linear Algebra (MAT3002)


Fall Semester 2025–26

1. Systems of Linear Equations

1.1. Find the general solution of the underdetermined system:

       x +  y +  z +  t =  6,
      2x −  y + 3z + 2t = 14,
       x − 3y + 2z −  t = −2.

1.2. Determine the rank and a parametric form of all solutions:

       x + 2y + 3z +  t = 4,
      2x + 5y + 3z −  t = 7,
       x +  y +  z + 2t = 2.

1.3. For the parameter k, discuss consistency and solve (if consistent):

       x + 2y + z + t = 1,
      2x + 4y + kz + (k − 2)t = 2,
       x + 3y + (k − 1)z + 2t = 1.

1.4. Reduce the augmented matrix to RREF and obtain the general solution:

      [ 1  −1   2   0 |  3 ]
      [ 2   1   0   1 | −1 ]
      [ 3   0   1  −2 |  5 ]

1.5. For parameter k, solve and discuss consistency:

       x + 2y + z = 1,
      2x + 4y + kz = 2,
       x + 3y + (k − 1)z = 1.
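Hand solutions to the systems above can be checked symbolically; a minimal sketch with sympy (assuming it is available), applied to Problem 1.1 with t left as the free parameter:

```python
import sympy as sp

x, y, z, t = sp.symbols('x y z t')
# System from Problem 1.1 (3 equations, 4 unknowns, so one free parameter).
eqs = [
    sp.Eq(x + y + z + t, 6),
    sp.Eq(2*x - y + 3*z + 2*t, 14),
    sp.Eq(x - 3*y + 2*z - t, -2),
]
# Solve for x, y, z in terms of the free parameter t.
sol = sp.solve(eqs, [x, y, z], dict=True)
print(sol)
```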

2. Column Space and Dependence


 
2.1. Is (2, 1, 5) a linear combination of the columns of

      A = [ 1   0   2 ]
          [ 3  −1   1 ]
          [ 2   4   0 ] ?

2.2. Determine whether {(1, 2, 3), (2, 4, 6), (3, 5, 8)} is linearly dependent; if so, exhibit a nontrivial relation.
 
2.3. For

      A = [ 1  2  1 ]
          [ 0  1  2 ]
          [ 2  5  3 ],

find bases for Col(A) and Nul(A).

 
2.4. Check whether (1, 2/3, 5/3) lies in the column space of

      A = [ 3  −1   2 ]
          [ 2   1   3 ]
          [ 7   1   8 ].

2.5. Let u = (1, 0, 1), v = (0, 1, 1), w = (1, 1, 2). Test if w ∈ span{u, v}.
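Span-membership questions like 2.1, 2.4, and 2.5 can be tested numerically by least squares: the residual is (near) zero exactly when the vector lies in the span. A numpy sketch for Problem 2.5:

```python
import numpy as np

# Problem 2.5: is w in span{u, v}?  Solve [u v] c = w in the least-squares
# sense and inspect the residual.
u = np.array([1.0, 0.0, 1.0])
v = np.array([0.0, 1.0, 1.0])
w = np.array([1.0, 1.0, 2.0])
M = np.column_stack([u, v])
c, *_ = np.linalg.lstsq(M, w, rcond=None)
residual = np.linalg.norm(M @ c - w)
print(c, residual)  # c ≈ (1, 1), residual ≈ 0, so w = u + v
```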

3. Linear Independence and Basis

3.1. Verify whether B₁ = {(x − 1), (x − 1)², (x − 1)³} is linearly independent. If yes, does it form a basis for S₁ = { p ∈ P₃ : p(1) = 0 }?

3.2. Verify whether B₂ = {x, x² + x, x³ − 2x} is linearly independent. If yes, does it form a basis for S₂ = { p ∈ P₃ : p(0) = 0 }?

3.3. Verify whether B₃ = {1 + x², 1 − x³, x² + x³} is linearly independent. If yes, does it form a basis for S₃ = { a + bx + cx² + dx³ ∈ P₃ : b = 0 } (i.e., no x-term)?

3.4. Verify whether B₄ = {1 − x, x² − x, x³ − x} is linearly independent. If yes, does it form a basis for S₄ = { p ∈ P₃ : p(1) = 0 }?

3.5. Verify whether B₅ = {2 − x + 3x², 1 + 2x − x², x + x² + x³} is linearly independent. If yes, does it form a basis for S₅ = { a + bx + cx² + dx³ ∈ P₃ : a + b + c + d = 0 }?
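Each independence test in this section reduces to the rank of a coefficient matrix once the polynomials are written in the monomial basis. A sympy sketch for Problem 3.2 (the row layout is my own choice):

```python
import sympy as sp

# Problem 3.2: test independence of {x, x^2 + x, x^3 - 2x}.
# Each column holds a polynomial's coordinates in the basis {1, x, x^2, x^3}.
M = sp.Matrix([
    [0, 0,  0],   # constant coefficients
    [1, 1, -2],   # x coefficients
    [0, 1,  0],   # x^2 coefficients
    [0, 0,  1],   # x^3 coefficients
])
print(M.rank())  # rank equals the number of polynomials, so the set is independent
```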

4. Linear Transformations and Change of Basis

4.1. Let T : P₃ → P₂ be defined by

      T(a + bx + cx² + dx³) = ax² + bx.

Find the matrix of T with respect to

      B₃ = { 1 + x³, 1 + x + x², 1 + x² + x³, x }  and  B₂ = { 1 − x + x², x, x² }.

4.2. T : P₂ → P₂, T(p) = p′. Find the matrix in B = {1, x, x²}.

4.3. T : R² → R² reflection about y = x. Find the matrix and verify T² = I.

4.4. T : R³ → R³, rotation by θ about the z-axis. Write the standard matrix.

4.5. Find the change-of-basis matrix from B = {1, x, 1 + x} to C = {1, x, x²} in P₂ (interpret 1 + x appropriately).

5. Coordinate Vectors

5.1. Express p(x) = 2 − 3x + x² in B = {1 + x, x + x², x²}.

5.2. Coordinates of p(x) = x² − 2x + 1 in B = {x², x − 1, 1 + x²}.

5.3. For p(x) = 1 + 2x + 3x², find [p]_B where B = {1, 1 + x, x²}.

5.4. Find the transition matrix P_{B→C} for B = {1, x, x²}, C = {1 − x, 1 + x, x²}, and compute [p]_C for p(x) = x.
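Coordinate-vector problems like 5.3 amount to solving a small linear system whose columns are the basis vectors' monomial coordinates. A sympy sketch (the layout of M is my own):

```python
import sympy as sp

# Problem 5.3: coordinates of p = 1 + 2x + 3x^2 in B = {1, 1 + x, x^2}.
# Columns of M are the basis polynomials' coefficients in {1, x, x^2}.
M = sp.Matrix([[1, 1, 0],
               [0, 1, 0],
               [0, 0, 1]])
p = sp.Matrix([1, 2, 3])
coords = M.solve(p)   # solve M [p]_B = p
print(coords.T)       # expected (-1, 2, 3)
```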

6. Orthogonal Projection

6.1. Project v = (3, −1, 2) onto span{(1, 0, 1), (0, 1, 1)}; write v = u + w with u ⊥ w.

6.2. Project v = (4, 1, −1) onto the line span{(2, −1, 2)}; write v = u + w with u ⊥ w.

6.3. Project v = (3, −2, 1) onto span{(1, 0, 1), (0, 1, 1)}.

6.4. Project v = (4, 1, −1) onto the line through a = (2, −1, 2) and decompose.

6.5. Find the distance from v = (1, 2, 3) to the plane x + y + z = 0 via orthogonal
projection.

6.6. Orthogonally project (1, 0, 1, 0) onto span{(1, 1, 0, 0), (0, 1, 1, 0)} in R⁴.
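The line-projection recipe used throughout this section can be sketched in numpy; here applied to Problem 6.2:

```python
import numpy as np

# Problem 6.2: project v onto the line spanned by a, then split v = u + w.
a = np.array([2.0, -1.0, 2.0])
v = np.array([4.0, 1.0, -1.0])
u = (v @ a) / (a @ a) * a   # projection of v onto span{a}
w = v - u                   # component orthogonal to a
print(u, w, u @ w)          # u ⊥ w, so u·w ≈ 0
```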

7. Gram–Schmidt Orthogonalization

7.1. Apply Gram–Schmidt to {(1, 1, 0), (1, 0, 1), (0, 1, 1)}.

7.2. Apply Gram–Schmidt to {1, x, x²} on [−1, 1].

7.3. Apply Gram–Schmidt to {1, t, t²} with ⟨f, g⟩ = ∫₀¹ f(t)g(t) dt.

7.4. Orthonormalize {(1, 1, 1), (1, 0, 0), (0, 1, 0)} in R³.

 
7.5. Use Gram–Schmidt to produce Q in the QR factorization of

      A = [ 1  1 ]
          [ 1  0 ]
          [ 0  1 ].
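A minimal classical Gram–Schmidt implementation for checking Problem 7.1 (the function name is my own; assumes the input vectors are independent):

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal list spanning the same space (assumes independence)."""
    basis = []
    for v in vectors:
        for q in basis:
            v = v - (v @ q) * q   # remove the component along q
        basis.append(v / np.linalg.norm(v))
    return basis

# Problem 7.1
Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0]),
                  np.array([0.0, 1.0, 1.0])])
print(np.round(np.column_stack(Q), 4))
```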

8. Range and Null Space Orthogonality


 
8.1. For

      B = [ 1  2  −1 ]
          [ 2  4  −2 ]
          [ 0  1   3 ],

compute bases for Ran(Bᵀ) and ker(B) (via RREF), and verify that every vector in Ran(Bᵀ) is orthogonal to every vector in ker(B); hence Ran(Bᵀ) ⊥ ker(B).

 
8.2. For

      C = [ 1  1   1  1 ]
          [ 2  0  −1  3 ],

find bases for Ran(Cᵀ) and ker(C) and check their orthogonality by dot products. Conclude that Ran(Cᵀ) ⊥ ker(C).

 
8.3. For

      A = [ 0  −2   5 ]
          [ 1   4  −7 ]
          [ 3  −1   6 ],

show Ran(Aᵀ) ⊥ ker(A).

8.4. Prove ker(Aᵀ) = (Ran(A))⊥ for any matrix A.

 
8.5. For

      A = [ 1  2  3 ]
          [ 4  5  6 ],

find orthonormal bases of Ran(A) and ker(Aᵀ).

8.6. Show ker(A) ⊥ Ran(Aᵀ) for any real m × n matrix A via the inner-product identity.

 
8.7. Compute dim ker(A) and dim Ran(A) for

      A = [ 1  1  1 ]
          [ 1  1  1 ]
          [ 1  1  1 ]

and verify rank–nullity.
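The orthogonality claims in this section can be spot-checked symbolically; a sympy sketch for Problem 8.1:

```python
import sympy as sp

# Problem 8.1: verify Ran(B^T) ⊥ ker(B) by pairwise dot products.
B = sp.Matrix([[1, 2, -1],
               [2, 4, -2],
               [0, 1,  3]])
row_space = B.T.columnspace()   # basis of Ran(B^T) (= row space of B)
null_space = B.nullspace()      # basis of ker(B)
checks = [r.dot(n) for r in row_space for n in null_space]
print(checks)  # every dot product should be 0
```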

9. QR Decomposition
 
9.1. QR of

      A = [ 2  −1  0 ]
          [ 0   2  1 ]
          [ 1   0  2 ]
          [ 2   1  1 ].

 
9.2. QR of

      B = [ 2  1  0 ]
          [ 1  2  1 ]
          [ 0  1  2 ].

 
9.3. QR of

      [ 1   1 ]
      [ 1  −1 ]
      [ 1   1 ].

 
9.4. QR of

      [ 2  −1 ]
      [ 2   1 ]
      [ 1   0 ].

   
9.5. Use QR to solve least squares for

      A = [ 1  1 ]
          [ 1  2 ]
          [ 1  3 ],   b = (1, 2, 2).

9.6. Show that if A has independent columns then Q in A = QR has orthonormal columns and R is invertible upper triangular.
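Hand-computed QR factorizations can be compared against numpy's (which may differ by column signs); a sketch for the matrix of Problem 9.3:

```python
import numpy as np

# Problem 9.3: reduced QR via numpy; Q is 3x2 with orthonormal columns,
# R is 2x2 upper triangular, and QR reproduces A.
A = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [1.0,  1.0]])
Q, R = np.linalg.qr(A)
print(np.round(Q, 4))
print(np.round(R, 4))
```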

10. Eigenvalues and Eigenvectors


 
10.1. Eigenpairs of

      A = [ 2  1 ]
          [ 0  2 ]

and B = A − 2I; relate spectra and comment on algebraic vs. geometric multiplicity.
 
10.2. Let

      A = [ 3  −1 ]
          [ c   3 ].

Compute eigenpairs of A and B = A − 3I; explain the spectral shift and whether eigenvectors change.
 
10.3. Eigenpairs of

      A = [ 2  0 ]
          [ 0  3 ]

and of B = A − I; comment on the eigenvalue shift.

 
10.4. For

      A = [ 4  1 ]
          [ 2  3 ],

find a diagonalization A = PDP⁻¹.



10.5. Find eigenvalues of

      A = [ 0  1  0 ]
          [ 0  0  1 ]
          [ 0  0  0 ]

and discuss diagonalizability.

10.6. If A is symmetric, prove eigenvectors for distinct eigenvalues are orthogonal; illustrate with a 3 × 3 example.
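Spectral-shift claims like the one in Problem 10.1 are easy to sanity-check numerically; a numpy sketch:

```python
import numpy as np

# Problem 10.1: eigenvalues of A and of the shifted matrix B = A - 2I.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
B = A - 2 * np.eye(2)
print(np.linalg.eigvals(A))  # repeated eigenvalue 2
print(np.linalg.eigvals(B))  # spectrum shifted by -2; eigenvectors unchanged
```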

11. Null Space Conditions
 
11.1. For

      A = [ 1  2  1 ]
          [ 0  1  2 ]
          [ 1  1  1 ],

find all x with Ax ⊥ Col(A).

 
11.2. For

      A = [ 1   1   1 ]
          [ 1  −1   1 ]
          [ 1   1  −1 ],

solve AᵀAx = 0.

11.3. Describe ker(AᵀA) in terms of ker(A); prove equality.

 
11.4. For

      A = [ 1  2 ]
          [ 2  4 ]
          [ 3  6 ],

compute ker(A) and ker(Aᵀ).
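The identity of Problem 11.3 can be spot-checked on the matrix of Problem 11.4; a sympy sketch:

```python
import sympy as sp

# Problems 11.3 / 11.4: ker(A^T A) should coincide with ker(A).
A = sp.Matrix([[1, 2],
               [2, 4],
               [3, 6]])
nA = A.nullspace()
nAtA = (A.T * A).nullspace()
print(nA, nAtA)  # the same one-dimensional space
```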

12. Matrix Representation of Linear Operators


12.1. T : P₂ → P₃, T(p) = ∫₀ˣ p(t) dt. Find the matrix in standard bases.

12.2. D : P₃ → P₂, D(p) = p′(x). Find the matrix in standard bases.

12.3. S : P₂ → P₂, S(p) = x p(x). Find the matrix in B = {1, x, x²}.

12.4. Find the matrix of the composition D ◦ T : P₂ → P₂, where T is the integration operator of 12.1 and D is differentiation.
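The operator matrices in this section come from applying the operator to each basis element and recording coordinates in the target basis. A sympy sketch for Problem 12.1 (the helper `coords` is my own):

```python
import sympy as sp

x, t = sp.symbols('x t')

# Problem 12.1: matrix of T(p) = ∫_0^x p(t) dt from basis {1, t, t^2} of P2
# to {1, x, x^2, x^3} of P3. Column j holds the coordinates of T(basis_j).
def coords(q):
    """Coordinates of polynomial q in the basis {1, x, x^2, x^3}."""
    return [q.coeff(x, k) for k in range(4)]

M = sp.Matrix([coords(sp.integrate(p, (t, 0, x))) for p in [1, t, t**2]]).T
print(M)  # rows: (0 0 0), (1 0 0), (0 1/2 0), (0 0 1/3)
```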

13. Inner Product and Angles Between Functions

13.1. Angle between f(x) = x and g(x) = x² over [0, 1].

13.2. Compute cos θ between f(x) = eˣ and g(x) = 1 + x on [0, 1].

13.3. With weight w(x) = 1 + x² on [−1, 1], compute ⟨x, 1 − x⟩_w = ∫₋₁¹ x(1 − x)w(x) dx and the angle.

13.4. Orthogonalize {1, x} on [0, 2] and normalize under the L² inner product.
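Function angles follow directly from the inner-product definition; a sympy sketch for Problem 13.1 (the helper `ip` is my own):

```python
import sympy as sp

# Problem 13.1: angle between f(x) = x and g(x) = x^2 under the L2 inner
# product on [0, 1].
x = sp.symbols('x')

def ip(f, g):
    """L2 inner product on [0, 1]."""
    return sp.integrate(f * g, (x, 0, 1))

f, g = x, x**2
cos_theta = ip(f, g) / sp.sqrt(ip(f, f) * ip(g, g))
print(sp.simplify(cos_theta))  # sqrt(15)/4
```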

14. Least Squares Approximation






14.1. Solve in the least-squares sense:

       x +  y = 2,
      2x + 3y = 5,
      3x + 4y = 6.

   
14.2. Least squares for

      A = [ 1  1 ]
          [ 1  2 ]
          [ 1  3 ],   b = (1, 2, 2)

(via normal equations).

14.3. Fit y ≈ α + βx to points (0, 1), (1, 2), (2, 2) and report (α̂, β̂).

14.4. Show that the least-squares solution x̂ satisfies Aᵀ(Ax̂ − b) = 0; verify on a 3 × 2 example.
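A numpy sketch for Problem 14.3, solving via both the normal equations and `lstsq`:

```python
import numpy as np

# Problem 14.3: least-squares line y ≈ α + βx through (0,1), (1,2), (2,2).
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])   # columns: intercept, slope
y = np.array([1.0, 2.0, 2.0])
coef_normal = np.linalg.solve(X.T @ X, X.T @ y)   # A^T A x = A^T b
coef_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef_normal, coef_lstsq)  # both ≈ (7/6, 1/2)
```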
