Linear Algebra Formula Sheet
1. Vectors & Basic Operations
- Vector Addition: \vec{u} + \vec{v} = (u_1 + v_1, \dots, u_n + v_n)
- Scalar Multiplication: \alpha \vec{v} = (\alpha v_1, \dots, \alpha v_n)
- Dot Product: \vec{u} \cdot \vec{v} = \sum u_i v_i = \|\vec{u}\| \|\vec{v}\| \cos\theta
- Cross Product (3D): \vec{u} \times \vec{v} = (u_2v_3 - u_3v_2,\ u_3v_1 - u_1v_3,\ u_1v_2 - u_2v_1)
- Magnitude (Norm): \|\vec{v}\| = \sqrt{\vec{v} \cdot \vec{v}}
- Angle between vectors: \cos \theta = \frac{\vec{u} \cdot \vec{v}}{\|\vec{u}\|\|\vec{v}\|}
- Projection: \mathrm{proj}_{\vec{u}}\vec{v} = \left( \frac{\vec{u} \cdot \vec{v}}{\|\vec{u}\|^2} \right) \vec{u}
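A minimal NumPy sketch of the vector formulas above (NumPy and the example vectors are assumptions, not part of the sheet):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

u + v                       # vector addition
2.5 * v                     # scalar multiplication
dot = u @ v                 # dot product, sum(u_i * v_i)
cross = np.cross(u, v)      # cross product (3D only)
norm_u = np.linalg.norm(u)  # magnitude ||u|| = sqrt(u . u)
cos_theta = dot / (norm_u * np.linalg.norm(v))  # angle between u and v
proj = (dot / norm_u**2) * u                    # projection of v onto u
```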
2. Matrices
- Matrix Addition: A + B = [a_{ij} + b_{ij}]
- Scalar Multiplication: \alpha A = [\alpha a_{ij}]
- Matrix Product: AB = [\sum_k a_{ik}b_{kj}]
- Transpose: (A^T)_{ij} = A_{ji}
- Identity Matrix: I_n \text{ such that } AI = IA = A
- Diagonal Matrix: a_{ij} = 0 \text{ for } i \neq j
- Trace: \mathrm{tr}(A) = \sum_i a_{ii}
- Symmetric Matrix: A = A^T
- Skew-Symmetric: A = -A^T
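A quick NumPy sketch of these matrix operations (the example matrices are illustrative assumptions):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

A + B                 # entrywise addition
3.0 * A               # scalar multiplication
A @ B                 # matrix product, (AB)_ij = sum_k a_ik b_kj
A.T                   # transpose
np.eye(2)             # identity matrix I_2
np.trace(A)           # trace = sum of diagonal entries
np.allclose(A, A.T)   # True iff A is symmetric
np.allclose(A, -A.T)  # True iff A is skew-symmetric
```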
3. Determinants
- 2x2 Matrix: \det \begin{bmatrix} a & b \\ c & d \end{bmatrix} = ad - bc
- 3x3 Matrix: Rule of Sarrus or cofactor expansion
- Properties: \det(AB) = \det(A)\det(B), \quad \det(A^T) = \det(A), \quad A \text{ singular} \iff \det(A) = 0
- Cofactor: C_{ij} = (-1)^{i+j} \det(M_{ij})
- Adjugate: \text{adj}(A) = C^T
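A short NumPy check of the determinant properties above; the adjugate is recovered indirectly, assuming det(A) ≠ 0:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[2.0, 0.0], [1.0, 1.0]])

np.linalg.det(A)                                   # ad - bc = -2
np.isclose(np.linalg.det(A @ B),
           np.linalg.det(A) * np.linalg.det(B))    # det(AB) = det(A)det(B)
np.isclose(np.linalg.det(A.T), np.linalg.det(A))   # det(A^T) = det(A)

# adj(A) via the identity A^{-1} = adj(A)/det(A), valid when det(A) != 0
adj_A = np.linalg.det(A) * np.linalg.inv(A)
```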
4. Inverses
- Inverse (2x2): \text{For } A = \begin{bmatrix} a & b \\ c & d \end{bmatrix},\ A^{-1} = \frac{1}{\det(A)} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}
- General: A^{-1} = \frac{1}{\det(A)} \cdot \text{adj}(A)
- Property: A A^{-1} = A^{-1} A = I
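A NumPy sketch of the inverse formulas, assuming the example matrix is invertible:

```python
import numpy as np

A = np.array([[4.0, 7.0], [2.0, 6.0]])

A_inv = np.linalg.inv(A)           # general inverse (requires det(A) != 0)
np.allclose(A @ A_inv, np.eye(2))  # A A^{-1} = I
np.allclose(A_inv @ A, np.eye(2))  # A^{-1} A = I

# 2x2 formula: swap diagonal entries, negate off-diagonal, divide by det(A)
a, b, c, d = A.ravel()
A_inv_2x2 = np.array([[d, -b], [-c, a]]) / (a * d - b * c)
```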
5. Systems of Linear Equations
- Matrix Form: A\vec{x} = \vec{b}
- Gaussian Elimination: Row reduce to echelon form
- Gauss-Jordan Elimination: Reduce to RREF
- Cramer's Rule: x_i = \frac{\det(A_i)}{\det(A)} \text{ if } \det(A) \neq 0, \text{ where } A_i \text{ is } A \text{ with column } i \text{ replaced by } \vec{b}
- Consistency: \text{rank}(A) = \text{rank}([A | \vec{b}])
- Unique Solution: \text{rank}(A) = \text{rank}([A \mid \vec{b}]) = n \text{ (number of unknowns)}
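A NumPy sketch of solving A x = b, with Cramer's rule worked out for an assumed 2x2 system:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)   # solves A x = b (unique solution when rank(A) = n)

# Cramer's rule: x_i = det(A_i)/det(A), A_i = A with column i replaced by b
x_cramer = np.array([
    np.linalg.det(np.column_stack([b, A[:, 1]])),
    np.linalg.det(np.column_stack([A[:, 0], b])),
]) / np.linalg.det(A)

# Consistency check: rank(A) == rank([A | b])
consistent = np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.column_stack([A, b]))
```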
6. Vector Spaces
- Span: Set of all linear combinations
- Linear Independence: c_1 \vec{v}_1 + \dots + c_n \vec{v}_n = 0 \Rightarrow c_i = 0
- Basis: Linearly independent set spanning a space
- Dimension: Number of vectors in a basis
- Null Space: \{ \vec{x} : A\vec{x} = 0 \}
- Column Space: \text{Span of columns of } A
- Rank: Dimension of column space
- Rank-Nullity Theorem: \text{rank}(A) + \text{nullity}(A) = n \text{ (number of columns of } A)
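A sketch of rank, null space, and the rank-nullity theorem; it assumes SciPy (for the null-space basis) in addition to NumPy:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

rank = np.linalg.matrix_rank(A)       # dimension of the column space
N = null_space(A)                     # columns form a basis of {x : A x = 0}
nullity = N.shape[1]
assert rank + nullity == A.shape[1]   # rank + nullity = n (number of columns)
```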
7. Linear Transformations
- T is Linear: T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v}), T(c\vec{v}) = cT(\vec{v})
- Matrix of T: T(\vec{x}) = A\vec{x}
- Kernel: \ker(T) = \{ \vec{x} : T(\vec{x}) = 0 \}
- Image: \text{Im}(T) = \{ T(\vec{x}) : \vec{x} \in \text{domain of } T \}
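A small NumPy sketch checking the linearity conditions for a map T(x) = A x; the example matrix is an assumption:

```python
import numpy as np

# T(x) = A x; its kernel and image are the null space and column space of A.
A = np.array([[1.0, 0.0], [0.0, 0.0]])   # projection onto the x-axis

def T(x):
    return A @ x

u, v = np.array([1.0, 2.0]), np.array([3.0, 4.0])
np.allclose(T(u + v), T(u) + T(v))   # additivity
np.allclose(T(2.0 * u), 2.0 * T(u))  # homogeneity
```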
8. Eigenvalues & Eigenvectors
- Characteristic Equation: \det(A - \lambda I) = 0; \text{ its roots are the eigenvalues}
- Eigenvalue \lambda, eigenvector \vec{v}: A\vec{v} = \lambda \vec{v}, \ \vec{v} \neq \vec{0}
- Diagonalization: A = PDP^{-1}, \text{ where the columns of } P \text{ are eigenvectors and } D \text{ is diagonal with the eigenvalues}
- Power of A: A^k = PD^kP^{-1}
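A NumPy sketch of eigendecomposition and matrix powers, assuming the example matrix is diagonalizable:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

eigvals, P = np.linalg.eig(A)   # A v = lambda v for each column v of P
D = np.diag(eigvals)

np.allclose(A, P @ D @ np.linalg.inv(P))   # A = P D P^{-1}
np.allclose(np.linalg.matrix_power(A, 3),
            P @ np.linalg.matrix_power(D, 3) @ np.linalg.inv(P))  # A^k = P D^k P^{-1}
```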
9. Orthogonality & Inner Product Spaces
- Inner Product: \langle \vec{u}, \vec{v} \rangle = \vec{u}^T \vec{v}
- Orthogonal Vectors: \langle \vec{u}, \vec{v} \rangle = 0
- Orthonormal Set: Orthogonal + Unit Norm
- Gram-Schmidt: Converts a basis into an orthonormal basis
- Projection onto subspace W: \text{proj}_W(\vec{v}) = A(A^TA)^{-1}A^T\vec{v}, \text{ where the columns of } A \text{ form a basis for } W
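A NumPy sketch of orthonormalization (via QR, which yields an orthonormal basis for the same column space that Gram-Schmidt would) and projection onto a subspace; the example matrix is an assumption:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = np.linalg.qr(A)   # columns of Q: orthonormal basis of the column space

v = np.array([1.0, 2.0, 3.0])
proj_v = A @ np.linalg.solve(A.T @ A, A.T @ v)   # proj_W(v) = A (A^T A)^{-1} A^T v
np.allclose(proj_v, Q @ (Q.T @ v))               # same projection via the orthonormal basis
```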
10. Special Matrices & Advanced Concepts
- Orthogonal Matrix: Q^T = Q^{-1} \Rightarrow Q^T Q = I
- Symmetric Matrix: A = A^T
- Positive Definite: A = A^T \text{ and } \vec{x}^T A \vec{x} > 0 \ \forall \vec{x} \neq \vec{0}
- SVD: A = U \Sigma V^T
- Pseudoinverse (full column rank): A^+ = (A^TA)^{-1}A^T
- Jordan Form: A = PJP^{-1}, \text{ where } J \text{ is block diagonal with Jordan blocks}
- Spectral Theorem: A = A^T \Rightarrow A = Q\Lambda Q^T \text{ with } Q \text{ orthogonal and } \Lambda \text{ diagonal}
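A NumPy sketch of the spectral theorem, positive definiteness, SVD, and the pseudoinverse, using an assumed symmetric example matrix:

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 3.0]])   # symmetric, positive definite

# Spectral theorem: symmetric A = Q Lambda Q^T with Q orthogonal
lam, Q = np.linalg.eigh(A)
np.allclose(A, Q @ np.diag(lam) @ Q.T)
np.allclose(Q.T @ Q, np.eye(2))          # Q is orthogonal

# Positive definite iff all eigenvalues > 0 (for symmetric A)
np.all(lam > 0)

# SVD and pseudoinverse
U, S, Vt = np.linalg.svd(A)              # A = U Sigma V^T
A_pinv = np.linalg.pinv(A)               # equals (A^T A)^{-1} A^T when A has full column rank
```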