
Eigenvalues and Eigenvectors for GATE Exam

Module - 2 Linear Algebra, Calculus and Optimization

Eigenvalues and eigenvectors are fundamental concepts in linear algebra that play a crucial role in understanding and solving a wide range of problems in various fields. These concepts provide a powerful tool for analyzing linear transformations and matrices. Eigenvalues and eigenvectors have significant importance in disciplines such as linear algebra, physics, computer science, and data analysis.

Eigenvalues and eigenvectors are like the building blocks of linear algebra, enabling us to break down complex operations into simpler components. They allow us to discover essential characteristics of matrices and transformations, leading to insights and solutions in diverse applications.

Definition and Concept

Eigenvalues and eigenvectors are properties of square matrices. Let's delve into their definitions and concepts:

Define Eigenvalues and Eigenvectors:

Eigenvalue (λ): An eigenvalue of a square matrix represents how much an associated eigenvector is stretched or compressed during a linear transformation. It is a scalar value that scales the eigenvector while leaving its direction unchanged.

Eigenvector (v): An eigenvector is a non-zero vector that remains in the same direction after the application of a square matrix. When multiplied by the matrix, it results in a scaled version of itself.

Explain the Concept Using Mathematical Notation: Mathematically, eigenvalues and eigenvectors are expressed as follows:

For a square matrix A, an eigenvector v and its corresponding eigenvalue λ satisfy the equation: Av=λv

Here's a detailed breakdown: A is the square matrix. λ is the eigenvalue. v is the eigenvector.

This equation essentially states that when matrix A acts on vector v, the result is a scaled version of v, with the scaling factor being λ.

Eigenvalues and eigenvectors are specifically associated with square matrices, which have the same number of rows and columns.

Suppose we have a square matrix of size n×n. An eigenvector v is a non-zero vector of size n×1 (a column vector). An eigenvalue λ is a scalar. The equation Av=λv can be written more explicitly for each component:

a11 v1 + a12 v2 + ... + a1n vn = λ v1
a21 v1 + a22 v2 + ... + a2n vn = λ v2
...
an1 v1 + an2 v2 + ... + ann vn = λ vn

In this system of equations, the matrix A operates on the vector v, resulting in a scaled version of v represented by the eigenvalue λ. Each equation corresponds to one component of the vector.
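The relation Av = λv is easy to check numerically. A minimal sketch with NumPy (assumed installed), using an arbitrarily chosen 2×2 matrix:

```python
import numpy as np

# An arbitrary 2x2 matrix chosen for illustration
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify A v = lambda v for every eigenpair
for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    assert np.allclose(A @ v, lam * v)
```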

Finding Eigenvalues and Eigenvectors

Let A be an n×n matrix.

  1. First, find the eigenvalues λ of A by solving the equation det(λI−A)=0.
  2. For each λ, find the basic eigenvectors X≠0 by finding the basic solutions to (λI−A)X=0.

To verify your work, make sure that AX=λX for each λ and associated eigenvector X.
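For a 2×2 matrix, the equation det(λI−A)=0 is simply the quadratic λ² − trace(A)·λ + det(A) = 0, so step 1 can be carried out directly with the quadratic formula. A minimal sketch (the helper name is ours):

```python
import math

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] by solving det(λI - A) = 0."""
    trace = a + d          # coefficient of -λ in the quadratic
    det = a * d - b * c    # constant term
    # Quadratic formula; assumes real eigenvalues (discriminant >= 0)
    disc = math.sqrt(trace ** 2 - 4 * det)
    return (trace + disc) / 2, (trace - disc) / 2

print(eigenvalues_2x2(4, 1, 2, 3))  # (5.0, 2.0)
```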

We will explore these steps further in the following example.


  1. Find the eigenvalues and eigenvectors for the matrix

     A = [ -5  2 ]
         [ -7  4 ]

First we find the eigenvalues of A by solving the equation det(λI−A)=0

This gives

det( λ [ 1  0 ]  -  [ -5  2 ] )  =  det [ λ+5   -2  ]  =  0
       [ 0  1 ]     [ -7  4 ]           [  7   λ-4 ]

Computing the determinant as usual, the result is

(λ+5)(λ-4) + 14 = λ² + λ - 6 = 0

Solving this equation, we find that

λ1 = 2,  λ2 = -3

Now we need to find the basic eigenvectors for each λ. First we will find the eigenvectors for λ1 = 2.

We wish to find all vectors X≠0 such that AX=2X. These are the solutions to (2I−A)X=0:

[ 7  -2 ] [ x ]  =  [ 0 ]
[ 7  -2 ] [ y ]     [ 0 ]

The augmented matrix for this system and corresponding reduced row-echelon form are given by

[ 7  -2 | 0 ]   →   [ 1  -2/7 | 0 ]
[ 7  -2 | 0 ]       [ 0   0   | 0 ]

The solution is any vector of the form s(2/7, 1); multiplying by 7 gives the simpler basic eigenvector X1 = (2, 7). To verify, compute AX1 = (4, 14) = 2X1. This is what we wanted, so we know this basic eigenvector is correct.
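The verification step AX = λX can also be run numerically with NumPy (assumed installed), taking the matrix and basic eigenvector from the example above:

```python
import numpy as np

# Matrix and basic eigenvector from the worked example above
A = np.array([[-5.0, 2.0],
              [-7.0, 4.0]])
X = np.array([2.0, 7.0])   # basic eigenvector found for λ = 2

# AX should be exactly 2X
assert np.allclose(A @ X, 2 * X)   # A X = (4, 14) = 2 (2, 7)
```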

Properties of Eigenvalues and Eigenvectors:

  1. Non-Uniqueness of Eigenvectors: For a given eigenvalue, there can be multiple linearly independent eigenvectors associated with it. In other words, the eigenvector corresponding to an eigenvalue is not unique; it can be scaled by any nonzero scalar. This means that if v is an eigenvector of a matrix A corresponding to eigenvalue λ, then any scalar multiple cv is also an eigenvector corresponding to the same eigenvalue.
  2. Real or Complex Eigenvalues: Eigenvalues can be either real or complex numbers, depending on the matrix. Real symmetric matrices always have real eigenvalues. However, non-symmetric matrices can have complex eigenvalues. In the case of complex eigenvalues, they come in complex conjugate pairs.
  3. Sum and Product of Eigenvalues:
    • The sum of all eigenvalues of a matrix A is equal to the trace of the matrix, which is the sum of the elements on its main diagonal. Mathematically, λ1 + λ2 + ... + λn = trace(A), where λi are the eigenvalues.
    • The product of all eigenvalues of a matrix A is equal to the determinant of the matrix. Mathematically, λ1 · λ2 · ... · λn = det(A).
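Both identities can be checked numerically with NumPy (assumed installed) on any square matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigenvalues = np.linalg.eigvals(A)

# Sum of eigenvalues equals the trace of A
assert np.isclose(eigenvalues.sum(), np.trace(A))

# Product of eigenvalues equals the determinant of A
assert np.isclose(eigenvalues.prod(), np.linalg.det(A))
```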

These properties are fundamental to the theory of eigenvalues and eigenvectors and have applications in various fields, including linear algebra, physics, engineering, and data analysis. They provide insight into the behavior of matrices and their transformations.

Applications of Eigenvalues and Eigenvectors:

Linear Transformations

Eigenvalues and eigenvectors play a fundamental role in understanding linear transformations. They provide insights into how a transformation affects vectors in space.

Stretching, Compressing, and Shearing Transformations:

  • Eigenvalues represent the scaling factor by which a vector is stretched or compressed along its corresponding eigenvector. If an eigenvalue is greater than 1, it indicates stretching, while an eigenvalue between 0 and 1 implies compression.
  • Eigenvectors represent the direction of stretching or compression. They remain unchanged in direction after the transformation.
  • For example, consider a 2D stretch along the x-axis with matrix diag(2, 1). Its eigenvalues are 2 and 1: the eigenvector along the y-axis remains unchanged, while the eigenvector along the x-axis is stretched by a factor of 2. A pure shear along the x-axis, by contrast, has the single repeated eigenvalue 1, and only vectors on the x-axis keep their direction.
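A quick NumPy check of both cases (library assumed installed): a diagonal stretch has distinct eigenvalues along the axes, while a pure shear has the repeated eigenvalue 1:

```python
import numpy as np

# Stretch by 2 along x, leave y alone: eigenvalues 2 and 1,
# with eigenvectors along the x- and y-axes
stretch = np.array([[2.0, 0.0],
                    [0.0, 1.0]])
assert np.allclose(sorted(np.linalg.eigvals(stretch)), [1.0, 2.0])

# Shear along x: the only eigenvalue is 1 (repeated), and only
# vectors on the x-axis keep their direction
shear = np.array([[1.0, 1.0],
                  [0.0, 1.0]])
assert np.allclose(np.linalg.eigvals(shear), [1.0, 1.0])
```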

Stability Analysis

Eigenvalues are used extensively in stability analysis of dynamic systems in physics, engineering, and control theory. They help determine the stability of equilibrium points in systems of differential equations.

Example: Electrical Circuit Stability:

  • In electrical engineering, eigenvalues are used to analyze the stability of electronic circuits. The circuit equations can be represented as a system of differential equations, and the eigenvalues of the associated matrix provide information about the circuit's stability.
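As a sketch (NumPy assumed installed; the state matrix is a made-up example), a continuous-time linear system x' = Ax is asymptotically stable exactly when every eigenvalue of A has a negative real part:

```python
import numpy as np

# Hypothetical state matrix of a damped second-order system x' = A x
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

eigenvalues = np.linalg.eigvals(A)   # -1 and -2 for this matrix

# Stable equilibrium at the origin: all real parts negative
is_stable = bool(np.all(eigenvalues.real < 0))
print(is_stable)  # True
```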

Principal Component Analysis (PCA)

PCA is a technique used for dimensionality reduction and feature extraction in data analysis.

Importance of Eigenvectors in PCA:

  • In PCA, eigenvectors of the covariance matrix represent the principal components of the data.
  • These eigenvectors show the directions along which the data varies the most.
  • By choosing the top eigenvalues and their corresponding eigenvectors, you can reduce the dimensionality of data while preserving as much variance as possible.

Real-World Example: Image Compression:

  • PCA is applied in image compression to reduce the size of images while retaining their essential features. Eigenvectors help identify the most important image components.
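A minimal PCA sketch with NumPy (assumed installed), on synthetic data whose variance is concentrated along the x-direction:

```python
import numpy as np

rng = np.random.default_rng(42)
# 200 points, spread ~3 along x and ~0.5 along y
data = rng.normal(size=(200, 2)) * np.array([3.0, 0.5])

# Eigen-decompose the covariance matrix of the centered data
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)   # ascending eigenvalues

# The top principal component is the eigenvector with the largest
# eigenvalue -- here it should point (almost) along the x-axis
top = eigenvectors[:, -1]

# Project onto the top component: 2-D data reduced to 1-D
reduced = centered @ top
print(reduced.shape)  # (200,)
```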

Quantum Mechanics

Eigenvalues and eigenvectors are central to quantum mechanics, particularly in understanding observable quantities.

Role of Eigenvectors in Quantum Mechanics:

  • In quantum mechanics, wave functions are represented as vectors, and operators (e.g., position, momentum, energy) are represented as matrices.
  • Eigenvectors of these operators correspond to possible states of a quantum system.
  • Eigenvalues represent the possible values of observable quantities when measured.

Example: Spin Angular Momentum:

  • In the context of electron spin, the spin operators have eigenvectors corresponding to different spin states (e.g., "up" and "down") with associated eigenvalues representing the quantized values of spin angular momentum.
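As an illustration (NumPy assumed installed), the spin operator along z, in units of ħ/2, is the Pauli matrix σz; its eigenvalues are the two possible measurement outcomes and its eigenvectors are the "up" and "down" states:

```python
import numpy as np

# Pauli z matrix: spin measurement along z, in units of ħ/2
sigma_z = np.array([[1.0, 0.0],
                    [0.0, -1.0]])

# eigh returns eigenvalues in ascending order
eigenvalues, eigenvectors = np.linalg.eigh(sigma_z)

assert np.allclose(eigenvalues, [-1.0, 1.0])   # "down" and "up" outcomes
# The corresponding eigenvectors are the basis states (0, 1) and (1, 0)
```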

Google PageRank Algorithm

Google's PageRank algorithm uses the concept of eigenvectors to rank web pages based on their importance and relevance.

Significance of Eigenvalues in PageRank:

  • In PageRank, web pages are represented as nodes in a graph, and hyperlinks as edges.
  • The PageRank matrix, representing the probability of moving from one page to another, is a stochastic matrix.
  • The principal eigenvector of this matrix provides the PageRank scores for each page.
  • High PageRank corresponds to important web pages.
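A toy power-iteration sketch with NumPy (assumed installed; the 3-page link graph is made up): repeated multiplication by the damped transition matrix converges to its principal eigenvector, whose entries are the PageRank scores:

```python
import numpy as np

# Column-stochastic link matrix: entry (i, j) is the probability of
# moving from page j to page i (made-up 3-page web)
M = np.array([[0.0, 0.5, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 0.5, 0.0]])

d = 0.85                  # damping factor used by PageRank
n = M.shape[0]
G = d * M + (1 - d) / n   # damped "Google matrix" (teleport term broadcast)

# Power iteration: converges to the principal eigenvector of G
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = G @ rank

# Scores sum to 1; page 0 collects the most link weight here
print(np.round(rank, 3))
```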

Example: Search Engine Ranking:

  • When you perform a web search, Google's algorithm uses eigenvectors to determine the ranking of search results, ensuring that more relevant and reputable pages appear at the top.

These applications illustrate the versatility and importance of eigenvalues and eigenvectors in various fields, from linear algebra to physics, engineering, data analysis, and web search algorithms.


Eigenvalues and eigenvectors are fundamental mathematical concepts with wide-ranging applications across various fields. They provide valuable insights into the behavior of linear transformations, the stability of dynamic systems, dimensionality reduction in data analysis, quantum mechanics, and even web page ranking algorithms like Google's PageRank.

Key Takeaways:

  1. Eigenvalues and Eigenvectors: Eigenvalues represent scaling factors, while eigenvectors represent the direction of stretching or compression in linear transformations.
  2. Linear Transformations: Eigenvalues and eigenvectors help understand how linear transformations affect vectors and provide a basis for analyzing shear, stretch, and compression.
  3. Stability Analysis: Eigenvalues are crucial for stability analysis in dynamic systems, allowing engineers and physicists to determine the stability of equilibrium points.
  4. Principal Component Analysis (PCA): Eigenvectors play a vital role in dimensionality reduction, feature extraction, and variance preservation in data analysis.
  5. Google PageRank Algorithm: Eigenvalues and eigenvectors are central to web page ranking, ensuring that relevant and reputable pages are displayed prominently in search results.
  6. Versatile Mathematical Concepts: Eigenvalues and eigenvectors are versatile mathematical tools used across disciplines to solve complex problems and gain insights into various phenomena.

Eigenvalues and eigenvectors are powerful mathematical tools that offer deep insights into the behavior of linear systems and have a profound impact on a wide range of scientific and technological advancements.

Practice Questions

Question 1:

For a square matrix A, an eigenvector v and its corresponding eigenvalue λ satisfy which of the following equations?

A) Av = λv 

B) Av = v/λ 

C) A/v = λ 

D) Aλ = v

Answer: A) Av = λv

Question 2:

What is the fundamental property of an eigenvector?

A) It always has a length of 1. 

B) It remains in the same direction after a linear transformation. 

C) It is unique for each eigenvalue. 

D) It is always orthogonal to all other eigenvectors.

Answer: B) It remains in the same direction after a linear transformation.

Question 3:

Which of the following statements is true regarding eigenvalues and eigenvectors?

A) Eigenvalues can be complex, while eigenvectors are always real. 

B) Eigenvectors can be complex, while eigenvalues are always real. 

C) Both eigenvalues and eigenvectors are always real. 

D) Both eigenvalues and eigenvectors can be complex.

Answer: D) Both eigenvalues and eigenvectors can be complex.

Question 4:

What is the sum of all eigenvalues of a matrix A equal to?

A) The determinant of A 

B) The transpose of A 

C) The trace of A 

D) The inverse of A

Answer: C) The trace of A

Question 5:

In the context of principal component analysis (PCA), what do the eigenvectors of the covariance matrix represent?

A) The variance of the data 

B) The principal components of the data 

C) The mean of the data 

D) The data points themselves

Answer: B) The principal components of the data
