Eigenspace and Eigenspectrum Values in a Matrix

Last Updated : 15 Jan, 2020

Prerequisites: Mathematics | Eigen Values and Eigen Vectors, Matrix Multiplication, Null Space and Nullity of a Matrix

For a given matrix A, the set of all eigenvectors of A associated with an eigenvalue \lambda (together with the zero vector) spans a subspace, which is called the eigenspace of A with respect to \lambda and is denoted by E_\lambda. The set of all eigenvalues of A is called the eigenspectrum, or simply the spectrum, of A.

If \lambda is an eigenvalue of A, then the corresponding eigenspace E_\lambda is the solution space of the homogeneous system of linear equations (A - \lambda I)x = 0.

Geometrically, an eigenvector corresponding to a nonzero eigenvalue points in a direction that is stretched by the linear mapping, and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction of the stretching is flipped.

Below are some useful properties of eigenvalues and eigenvectors, in addition to those already listed in the article Mathematics | Eigen Values and Eigen Vectors:

A matrix A and its transpose A^{T} possess the same eigenvalues but not necessarily the same eigenvectors.

The eigenspace E_\lambda is the null space of A - \lambda I, since Ax = \lambda x \Longleftrightarrow Ax - \lambda x = 0 \Longleftrightarrow (A - \lambda I)x = 0 \Longleftrightarrow x \in \ker(A - \lambda I). Note: ker stands for kernel, which is another name for the null space.

Computing Eigenvalues, Eigenvectors, and Eigenspaces

Consider the 2 x 2 matrix

A = \begin{bmatrix} 4 & 2 \\ 1 & 3 \end{bmatrix}

Step 1: Characteristic polynomial and eigenvalues.
The characteristic polynomial is given by

det(A - \lambda I) = \det\left(\begin{bmatrix} 4 & 2 \\ 1 & 3 \end{bmatrix} - \begin{bmatrix} \lambda & 0 \\ 0 & \lambda \end{bmatrix}\right) = \begin{vmatrix} 4-\lambda & 2 \\ 1 & 3-\lambda \end{vmatrix} = (4-\lambda)(3-\lambda) - 2 \cdot 1

Expanding and factorizing the characteristic polynomial gives (2-\lambda)(5-\lambda), so the eigenvalues are \lambda_1 = 2 and \lambda_2 = 5.

Step 2: Eigenvectors and eigenspaces

We find the eigenvectors corresponding to these eigenvalues by looking at vectors x such that

\begin{bmatrix} 4-\lambda & 2 \\ 1 & 3-\lambda \end{bmatrix} x = 0

For \lambda = 5 we obtain

\begin{bmatrix} 4-5 & 2 \\ 1 & 3-5 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} -1 & 2 \\ 1 & -2 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = 0

Solving this homogeneous system of equations (both rows reduce to the single condition x_1 = 2x_2) gives the solution space

E_5 = span\left(\begin{bmatrix} 2 \\ 1 \end{bmatrix}\right)

This eigenspace is one-dimensional, as it possesses a single basis vector.

Similarly, we find an eigenvector for \lambda = 2 by solving the homogeneous system of equations

\begin{bmatrix} 4-2 & 2 \\ 1 & 3-2 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 2 & 2 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = 0

This means any vector x = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} with x_2 = -x_1, such as \begin{bmatrix} 1 \\ -1 \end{bmatrix}, is an eigenvector with eigenvalue 2. The corresponding eigenspace is

E_2 = span\left(\begin{bmatrix} 1 \\ -1 \end{bmatrix}\right)

The two eigenspaces E_5 and E_2 in the above example are one-dimensional, as each is spanned by a single vector. In other cases, however, an eigenvalue may have multiple linearly independent eigenvectors, and its eigenspace may have dimension greater than one.
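The two steps above can be checked numerically. The following sketch uses NumPy's np.linalg.eig on the same matrix A; it verifies that the eigenvalues are 2 and 5, that each returned eigenvector v satisfies (A - \lambda I)v = 0, and that the eigenvector for \lambda = 5 is proportional to the basis vector [2, 1] found by hand (np.linalg.eig returns unit-norm eigenvectors, so we rescale before comparing).

```python
import numpy as np

# The matrix from the worked example.
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors
# as the columns of the second array (normalized to unit length).
eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.sort(eigenvalues))  # eigenvalues 2 and 5, matching the factorization

# Every eigenvector v lies in the null space of A - lambda*I.
for lam, v in zip(eigenvalues, eigenvectors.T):
    residual = (A - lam * np.eye(2)) @ v
    assert np.allclose(residual, 0.0)

# The eigenvector for lambda = 5, rescaled so its second entry is 1,
# recovers the basis vector [2, 1] of E_5 computed by hand.
v5 = eigenvectors[:, np.argmax(eigenvalues)]
print(v5 / v5[1])
```

Note that np.linalg.eig does not guarantee any particular ordering of the eigenvalues, which is why the code sorts them and selects the eigenvector for \lambda = 5 via argmax rather than assuming a fixed column position.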