Why & When to use Eigenvalues & Eigenvectors?

In this post, you will learn why and when to use eigenvalues and eigenvectors. As a data scientist / machine learning engineer, one must have a good understanding of these concepts, as they are used in one of the most popular dimensionality reduction techniques: Principal Component Analysis (PCA). In PCA, these concepts help reduce the dimensionality of the data (countering the curse of dimensionality), resulting in a simpler model which is computationally efficient and provides greater generalization accuracy. In this post, the following topics will be covered:

  • Background – Why use Eigenvalues & Eigenvectors?
  • What are Eigenvalues & Eigenvectors?
  • When to use Eigenvalues & Eigenvectors?

Background – Why use Eigenvalues & Eigenvectors?

In simple words, eigenvectors and eigenvalues are used to determine a set of important variables (in the form of vectors) along with the scale along different dimensions (key dimensions based on variance), so that the data can be analysed in a better manner. Let’s take a look at the following picture:

Fig: Tiger explained using Eigenvectors & Eigenvalues

When you look at the above picture (data) and identify it as a tiger, what key information (dimensions / principal components) do you use to call it out as a tiger? Is it not the body, face, legs, and so on? These principal components / dimensions can be seen as eigenvectors, each with its own elements. For example, the body has elements such as color, build, and shape; the face has elements such as nose, eyes, and color. The overall data (image) can be seen as a transformation matrix. When the data (transformation matrix) acts on the eigenvectors (principal components), the result is the same eigenvectors multiplied by a scale factor (the eigenvalue). And, accordingly, you can identify the image as a tiger.

The solution to real-world problems often depends upon processing large volumes of data representing different variables or dimensions. Take, for example, the problem of predicting stock prices, a typical machine learning / predictive analytics problem. Here the dependent variable is the stock price, and there is a large number of independent variables on which the stock price depends. Training one or more machine learning models on this large number of independent variables (also called features) will be computationally intensive, and such models turn out to be complex.

Can we use the information stored in these variables to extract a smaller set of variables (features), train the models on it, and make predictions, while ensuring that most of the information contained in the original variables is retained? This would result in simpler and computationally efficient models. This is where eigenvalues and eigenvectors come into the picture.

Feature extraction algorithms such as Principal Component Analysis (PCA) depend on the concepts of eigenvalues and eigenvectors to reduce the dimensionality of data (features) or to compress the data (data compression) into a set of principal components while retaining most of the original information. In PCA, the eigenvalues and eigenvectors of the features' covariance matrix are computed and then processed to determine the top k eigenvectors based on the corresponding eigenvalues. Thereafter, a projection matrix is created from these eigenvectors and used to transform the original features into another feature subspace. With this smaller set of features, one or more computationally efficient models can be trained with reduced generalization error. Thus, it can be said that eigenvalues and eigenvectors are key to training computationally efficient and high-performing machine learning models, and data scientists must understand these concepts well.
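
To make the PCA steps above concrete, here is a minimal NumPy sketch of PCA via eigendecomposition of the covariance matrix; the data array X and the choice of k = 2 components are hypothetical placeholders, not from the original post.

import numpy as np

# Hypothetical data: 100 samples, 5 features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))

# 1. Center the features and compute their covariance matrix
X_centered = X - X.mean(axis=0)
cov = np.cov(X_centered, rowvar=False)

# 2. Eigen-decompose the covariance matrix (eigh suits symmetric matrices)
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# 3. Keep the top-k eigenvectors, ranked by descending eigenvalue
k = 2
order = np.argsort(eigenvalues)[::-1][:k]
projection_matrix = eigenvectors[:, order]    # shape (5, k)

# 4. Project the original features onto the smaller subspace
X_reduced = X_centered @ projection_matrix    # shape (100, k)

The same result can be obtained with scikit-learn's PCA class; the point of this sketch is only to show where the eigenvalues and eigenvectors enter the procedure.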

Finding the eigenvalues and eigenvectors of a matrix is useful in several fields, such as the following, wherever a large volume of multi-dimensional data needs to be transformed into another subspace of smaller dimension while retaining most of the information stored in the original data. The primary goal is computational efficiency.

  • Machine learning (dimensionality reduction / PCA, facial recognition)
  • Designing communication systems
  • Designing bridges (vibration analysis, stability analysis)
  • Quantum computing
  • Electrical & mechanical engineering
  • Determining oil reserves by oil companies
  • Construction design
  • Stability analysis of systems

What are Eigenvalues & Eigenvectors?

Eigenvectors are the vectors which, when multiplied by a matrix (a linear transformation), result in another vector with the same direction but scaled (hence a scalar multiple) in the forward or reverse direction; the magnitude of that scalar multiple is termed the eigenvalue. In simpler words, the eigenvalue can be seen as the scaling factor of the eigenvector. Here is the formula for what is called the eigenequation:

\( Ax = \lambda x \)

In the above equation, the matrix A acts on the vector x and the outcome is another vector Ax with the same direction as the original vector x, but stretched or shrunk in the forward or reverse direction by the magnitude of the scalar multiple \(\lambda\). The vector x is called an eigenvector of A, and \(\lambda\) is called its eigenvalue. Let’s understand pictorially what happens when a matrix A acts on a vector x. Note that, in general, the new vector Ax has a different direction than the vector x.

Fig 1. Matrix A acts on x resulting in another vector Ax

When the multiplication of a matrix with a vector results in another vector in the same / opposite direction, but scaled forward / backward by the magnitude of a scalar multiple or eigenvalue (\(\lambda\)), then that vector is called an eigenvector of the matrix. Here is the diagram representing the eigenvector x of matrix A, because the vector Ax is in the same / opposite direction as x.

Fig 2. x is eigenvector of A
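
A quick numerical check of this behaviour; the matrix A below is an arbitrary 2 x 2 example chosen for illustration, and NumPy's np.linalg.eig does the work:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
x = eigenvectors[:, 0]   # first eigenvector (a column of the result)
lam = eigenvalues[0]     # its corresponding eigenvalue

# Ax equals lambda * x: same direction, scaled by the eigenvalue
print(np.allclose(A @ x, lam * x))   # True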

Here is further information on the values an eigenvalue can take: a positive eigenvalue means Ax points in the same direction as x; a negative eigenvalue means Ax points in the opposite direction; a zero eigenvalue means Ax collapses to the zero vector; and, for a general real matrix, eigenvalues may even be complex.

Many disciplines traditionally represent vectors as matrices with a single column rather than as matrices with a single row. For that reason, the word “eigenvector” in the context of matrices almost always refers to a right eigenvector, namely a column vector.
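
NumPy follows the same convention: in np.linalg.eig, the column v[:, i] of the returned eigenvector matrix is the right (column) eigenvector for the eigenvalue w[i]. A minimal illustration:

import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

w, v = np.linalg.eig(A)
for i in range(len(w)):
    # each eigenvector is read off as a column, not a row
    print(w[i], v[:, i])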

How to Calculate Eigenvector & Eigenvalue?

Here are the steps to calculate the eigenvalues and eigenvectors of a square matrix A.

  • Calculate the eigenvalues; an \(n \times n\) matrix has n eigenvalues, counted with multiplicity
  • Determine the corresponding eigenvector for each eigenvalue

To calculate the eigenvalues, start from the eigenequation and rearrange it:

\( Ax = \lambda x \\ Ax - \lambda x = 0 \\ (A - \lambda I)x = 0 \)

For a non-zero eigenvector x to exist, the matrix \(A - \lambda I\) must be singular, i.e. its determinant must be zero. The eigenvalues are therefore determined by solving the characteristic equation:

\( \det(A - \lambda I) = 0 \)

In the above equation, I is the identity matrix and \(\lambda\) is an eigenvalue. Once the eigenvalues are determined, the eigenvectors are found by solving \((A - \lambda I)x = 0\) for each eigenvalue.
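
As a worked illustration of these two steps, here is a small SymPy sketch; the 2 x 2 matrix A is chosen purely for demonstration.

import sympy as sp

A = sp.Matrix([[2, 1],
               [1, 2]])
lam = sp.symbols('lambda')

# Characteristic equation: det(A - lambda*I) = 0
char_eq = (A - lam * sp.eye(2)).det()
print(sp.expand(char_eq))        # lambda**2 - 4*lambda + 3
eigenvalues = sp.solve(char_eq, lam)
print(eigenvalues)               # [1, 3]

# Eigenvectors: solve (A - lambda*I)x = 0 for each eigenvalue
for ev in eigenvalues:
    eigvec = (A - ev * sp.eye(2)).nullspace()[0]
    print(ev, eigvec.T)          # lambda=1 -> [-1, 1], lambda=3 -> [1, 1]

For purely numerical work, np.linalg.eig performs both steps at once.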

When to use Eigenvalues & Eigenvectors?

Whenever there is a complex system with a large number of dimensions and a large amount of data, the concepts of eigenvectors and eigenvalues help transform the data into a set of its most important dimensions (principal components), so that it can be processed faster.

Conclusions

Here are some learnings from this post:

  • An eigenvector is a vector which, when multiplied by a transformation matrix, results in the same vector scaled by a scalar multiple, pointing in the same (or opposite) direction. This scalar multiple is known as the eigenvalue.
  • Eigenvectors and eigenvalues are key concepts in feature extraction techniques such as Principal Component Analysis (PCA), an algorithm used to reduce dimensionality while training a machine learning model.
  • Eigenvalue and eigenvector concepts are used in several fields, including machine learning, quantum computing, communication system design, construction design, and electrical & mechanical engineering.