ML | Face Recognition Using Eigenfaces (PCA Algorithm)

In 1991, Turk and Pentland proposed an approach to face recognition that uses dimensionality reduction and linear algebra concepts to recognize faces. The approach is computationally inexpensive and easy to implement, and was therefore used at the time in various applications such as handwriting recognition, lip-reading, and medical image analysis. PCA (Principal Component Analysis) is a dimensionality reduction technique proposed by Pearson in 1901. It uses eigenvalues and eigenvectors to reduce dimensionality and project training samples onto a small feature space. Let's look at the algorithm in more detail (from a face recognition perspective).

Training Algorithm:

1. Consider a set of M training images, each of dimension N x N.

[Image: training images with true labels, from the LFW (Labeled Faces in the Wild) dataset]

2. Convert each image into a vector of size N^2:
x_{1}, x_{2}, x_{3}, \ldots, x_{M}

3. Calculate the average of all these face vectors and subtract it from each vector:
\psi = \dfrac{1}{M}\sum_{i=1}^{M} x_i, \qquad a_{i} = x_{i} - \psi

[Image: the average face]

4. Stack the mean-subtracted face vectors as the columns of a matrix A of size N^2 x M:
A = \begin{bmatrix} a_{1} & a_{2} & a_{3} & \ldots & a_{M} \end{bmatrix}

5. We could form the covariance matrix by multiplying A with A^T. Since A has dimensions N^2 x M and A^T has dimensions M x N^2, this product is an N^2 x N^2 matrix, which would give us N^2 eigenvectors of size N^2 -- not computationally efficient to calculate. So instead we compute the smaller matrix by multiplying A^T and A, which gives an M x M matrix with M eigenvectors of size M (assuming M << N^2):
Cov = A^{T}A

6. Calculate the eigenvalues and eigenvectors of this matrix, and relate them to those of C' = AA^{T}:
A^{T}A\nu_{i} = \lambda_{i}\nu_{i} \implies AA^{T}(A\nu_{i}) = \lambda_{i}(A\nu_{i}) \implies C'u_{i} = \lambda_{i}u_{i}, \quad \text{where } C' = AA^{T} \text{ and } u_{i} = A\nu_{i}
From the above it can be concluded that A^{T}A and C' share their (nonzero) eigenvalues, and that their eigenvectors are related by u_{i} = A\nu_{i}. Thus, the M eigenvalues (and eigenvectors) of A^{T}A give the M largest eigenvalues (and eigenvectors) of C'.

7. Map the eigenvectors of the reduced covariance matrix into C' using u_{i} = A\nu_{i}, and select the K eigenvectors of C' corresponding to the K largest eigenvalues (where K < M). These eigenvectors have size N^2.

8. Using the eigenvectors from the previous step, take each normalized training face (face minus average face) x_{i} - \psi and represent it as a linear combination of the best K eigenvectors:
x_{i} - \psi = \sum_{j=1}^{K} w_{j} u_{j}
These u_{j} are called eigenfaces.

[Image: the eigenfaces]

9. Take the coefficients of the eigenfaces and represent each training face x_{i} as a vector of those coefficients (see the sketch after this list):
\Omega_{i} = \begin{bmatrix} w_{1}^{i} & w_{2}^{i} & w_{3}^{i} & \ldots & w_{K}^{i} \end{bmatrix}^{T}

[Image: a training face expressed as a linear combination of eigenfaces]
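A minimal NumPy sketch of the training steps above. The array shapes mirror the notation in the article; the synthetic random faces data, the image size, and the choice K = 10 are illustrative assumptions, not values from the article.

import numpy as np

# Stand-in training data: M grayscale images of size N x N.
# (Synthetic random data so the sketch runs end to end; in practice,
# load real face images, e.g. from the LFW dataset.)
M, N = 20, 32
rng = np.random.default_rng(0)
faces = rng.random((M, N, N))

# Step 2: flatten each image into a vector of length N^2; columns are x_1 ... x_M.
X = faces.reshape(M, -1).T                 # shape (N^2, M)

# Step 3-4: average face psi and mean-subtracted matrix A (columns a_i = x_i - psi).
psi = X.mean(axis=1, keepdims=True)        # shape (N^2, 1)
A = X - psi                                # shape (N^2, M)

# Step 5-6: work with the small M x M matrix A^T A instead of the huge N^2 x N^2 AA^T.
eigvals, V = np.linalg.eigh(A.T @ A)       # symmetric matrix; eigenvalues ascending

# Step 7: keep the K eigenvectors with the largest eigenvalues, map them into
# C' = AA^T via u_i = A v_i, and normalize each eigenface to unit length.
K = 10
top = np.argsort(eigvals)[::-1][:K]
U = A @ V[:, top]                          # eigenfaces, shape (N^2, K)
U /= np.linalg.norm(U, axis=0)

# Steps 8-9: coefficient vector Omega_i for every training face,
# with weights w_j^i = u_j^T (x_i - psi).
Omega_train = U.T @ A                      # shape (K, M); column i is Omega_i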
Testing/Detection Algorithm:

[Image: test images with true labels]

1. Given an unknown face y, first preprocess it so that the face is centered in the image and has the same dimensions as the training faces.

2. Subtract the average face \psi from the face:
\phi = y - \psi

[Image: test images minus the average face]

3. Project the normalized vector onto the eigenspace to obtain it as a linear combination of eigenfaces:
\phi = \sum_{i=1}^{K} w_{i} u_{i}

4. From the above projection, generate the vector of coefficients:
\Omega = \begin{bmatrix} w_{1} & w_{2} & w_{3} & \ldots & w_{K} \end{bmatrix}^{T}

5. Compute the distance between this vector and each training coefficient vector, and take the minimum:
e_r = \min_{l} \left\| \Omega - \Omega_{l} \right\|

6. If e_r is below a tolerance level T_r, the face is recognized as face l from the training set; otherwise, the face does not match any face in the training set. (A NumPy sketch of these recognition steps appears at the end of the article.)

[Image: test images with predicted labels]

Advantages:

- Easy to implement and computationally inexpensive.
- No knowledge of the image (such as facial features) is required, except its identity label.

Limitations:

- A properly centered face is required for training/testing.
- The algorithm is sensitive to lighting, shadows, and the scale of the face in the image.
- A frontal view of the face is required for the algorithm to work properly.

Reference:

Turk, M. and Pentland, A. (1991). "Eigenfaces for Recognition". Journal of Cognitive Neuroscience, 3(1), 71-86.
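Continuing the training sketch above (this reuses psi, U, Omega_train, rng, and N from that code), here is a minimal sketch of the recognition steps. The synthetic test image and the tolerance value T_r are illustrative assumptions.

# Unknown face y: an N x N image, flattened to a column vector of length N^2.
y = rng.random((N, N)).reshape(-1, 1)

# Steps 2-4: subtract the average face and project onto the eigenspace
# to get the coefficient vector Omega (w_i = u_i^T phi).
phi = y - psi
Omega = U.T @ phi                          # shape (K, 1)

# Step 5: distance to every training coefficient vector Omega_l; nearest wins.
dists = np.linalg.norm(Omega_train - Omega, axis=0)
l = int(np.argmin(dists))
e_r = dists[l]

# Step 6: accept the match only if the minimum distance is within tolerance.
T_r = 10.0                                 # tolerance (assumed value for the sketch)
if e_r < T_r:
    print(f"Recognized as training face {l} (distance {e_r:.3f})")
else:
    print("Face does not match any face in the training set")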