Eigenvalues

Overview

Eigenvalues and eigenvectors are fundamental concepts in linear algebra, providing deep insights into the properties of linear transformations and matrices. They are crucial in many areas of science and engineering, including stability analysis, quantum mechanics, facial recognition, and dimensionality reduction techniques like Principal Component Analysis (PCA).

An eigenvector of a square matrix is a non-zero vector that, when the matrix is multiplied by it, yields a scalar multiple of the original vector. This scalar is known as the eigenvalue corresponding to that eigenvector.

Core Concepts

  • Definition

    Let \( A \) be an \( n \times n \) square matrix. A non-zero vector \( \mathbf{v} \) is an eigenvector of \( A \) if there exists a scalar \( \lambda \) (lambda) such that:

    $$A\mathbf{v} = \lambda\mathbf{v}$$

    The scalar \( \lambda \) is called the eigenvalue associated with the eigenvector \( \mathbf{v} \).

    This equation means that the linear transformation represented by matrix \( A \) only stretches or shrinks the eigenvector \( \mathbf{v} \) by the factor \( \lambda \), without changing its direction (unless \( \lambda \) is negative, in which case the direction is reversed).
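
    A minimal NumPy sketch of this idea (the matrix and vectors are illustrative): multiplying an eigenvector by \( A \) only rescales it, while a non-eigenvector is rotated as well.

    import numpy as np

    # Illustrative diagonal matrix: scales x by 2 and y by 3
    A = np.array([[2.0, 0.0],
                  [0.0, 3.0]])

    v = np.array([1.0, 0.0])  # an eigenvector of A (eigenvalue 2)
    print(A @ v)              # [2. 0.] = 2 * v: same direction, just scaled

    w = np.array([1.0, 1.0])  # not an eigenvector
    print(A @ w)              # [2. 3.] is not a scalar multiple of w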
  • Finding Eigenvalues and Eigenvectors

    To find eigenvalues and eigenvectors, we rewrite the defining equation as:

    $$(A - \lambda I)\mathbf{v} = \mathbf{0}$$

    Where \( I \) is the \( n \times n \) identity matrix and \( \mathbf{0} \) is the zero vector.

    For a non-zero vector \( \mathbf{v} \) to be a solution, the matrix \( (A - \lambda I) \) must be singular, which means its determinant must be zero:

    $$\det(A - \lambda I) = 0$$

    This equation is called the characteristic equation of matrix \( A \), and \( \det(A - \lambda I) \), expanded in \( \lambda \), is its characteristic polynomial. Solving this equation for \( \lambda \) yields the eigenvalues.

    Once an eigenvalue \( \lambda \) is found, it is substituted back into \( (A - \lambda I)\mathbf{v} = \mathbf{0} \), and this system of linear equations is solved to find the corresponding eigenvector(s) \( \mathbf{v} \). Note that if \( \mathbf{v} \) is an eigenvector, then any non-zero scalar multiple of \( \mathbf{v} \) is also an eigenvector for the same eigenvalue.
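
    As a quick sketch (the \( 2 \times 2 \) matrix is illustrative), NumPy's np.poly returns the characteristic polynomial coefficients of a matrix, and np.roots solves for its roots, i.e. the eigenvalues:

    import numpy as np

    # det(A - lambda*I) = (4 - l)(1 - l) + 2 = l^2 - 5l + 6 = (l - 2)(l - 3)
    A = np.array([[4.0, -2.0],
                  [1.0,  1.0]])

    coeffs = np.poly(A)      # characteristic polynomial coefficients: [ 1. -5.  6.]
    print(np.roots(coeffs))  # its roots are the eigenvalues: [3. 2.]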
  • Properties of Eigenvalues and Eigenvectors

    • A square matrix \( A \) of size \( n \times n \) can have at most \( n \) distinct eigenvalues.
    • Eigenvectors corresponding to distinct eigenvalues are linearly independent.
    • The sum of the eigenvalues of a matrix is equal to its trace (the sum of the diagonal elements).
    • The product of the eigenvalues of a matrix is equal to its determinant (this identity and the trace identity are checked numerically after this list).
    • Symmetric matrices always have real eigenvalues, and their eigenvectors corresponding to distinct eigenvalues are orthogonal.
    • If a matrix is triangular (upper or lower), its eigenvalues are the entries on its main diagonal.
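
    A quick numeric check of the trace and determinant identities (the \( 3 \times 3 \) matrix is illustrative):

    import numpy as np

    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])

    eigvals = np.linalg.eigvals(A)

    print(np.isclose(eigvals.sum(), np.trace(A)))          # True: sum equals trace
    print(np.isclose(np.prod(eigvals), np.linalg.det(A)))  # True: product equals determinant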
  • Eigendecomposition

    If an \( n \times n \) matrix \( A \) has \( n \) linearly independent eigenvectors, then it can be factorized in a process called eigendecomposition (or spectral decomposition).

    The decomposition is given by:

    $$A = Q\Lambda Q^{-1}$$

    Where:

    • \( Q \) is an \( n \times n \) matrix whose columns are the eigenvectors of \( A \).
    • \( \Lambda \) (Lambda) is an \( n \times n \) diagonal matrix whose diagonal elements are the corresponding eigenvalues of \( A \) (i.e., \( \Lambda_{ii} = \lambda_i \)).

    If \( A \) is a symmetric matrix, its eigenvectors are orthogonal (or can be chosen to be orthogonal). In this case, \( Q \) becomes an orthogonal matrix, meaning \( Q^{-1} = Q^T \) (its inverse is its transpose). The eigendecomposition for a symmetric matrix is then:

    $$A = Q\Lambda Q^T$$
  • Significance

    Eigendecomposition is powerful because it decouples the matrix transformation into simpler components:

    • \( Q^{-1} \) transforms vectors into a basis formed by the eigenvectors.
    • \( \Lambda \) scales these components along the eigenvector directions.
    • \( Q \) transforms the scaled components back to the original basis.

    This is very useful for understanding the behavior of \( A^k \) (matrix powers), solving systems of linear differential equations, and in statistical methods like PCA.
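
    For instance, since \( A^k = Q\Lambda^k Q^{-1} \) and powering a diagonal matrix is elementwise, matrix powers become simple. A minimal sketch (the matrix is illustrative):

    import numpy as np

    A = np.array([[4.0, -2.0],
                  [1.0,  1.0]])
    eigvals, Q = np.linalg.eig(A)

    k = 5
    A_k = Q @ np.diag(eigvals ** k) @ np.linalg.inv(Q)  # A^k = Q Lambda^k Q^{-1}

    print(np.allclose(A_k, np.linalg.matrix_power(A, k)))  # True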

  • Principal Component Analysis (PCA)

    In PCA, eigendecomposition of the covariance matrix of a dataset is used to find the principal components. Eigenvectors represent the directions of maximum variance, and eigenvalues represent the magnitude of this variance.
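
    A compact sketch (the synthetic data and mixing matrix are illustrative): eigendecomposition of the covariance matrix yields the principal axes and the variance explained along each.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2)) @ np.array([[2.0, 0.0],
                                              [1.2, 0.5]])  # correlated 2-D data

    Xc = X - X.mean(axis=0)                 # center the data
    cov = np.cov(Xc, rowvar=False)          # 2x2 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: covariance is symmetric

    order = np.argsort(eigvals)[::-1]       # sort by decreasing variance
    print("Explained variances:", eigvals[order])
    print("Principal axes (columns):\n", eigvecs[:, order])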

  • Quantum Mechanics

    In quantum mechanics, physical observables are represented by operators (matrices), and their eigenvalues represent the possible measurable values of these observables. The eigenvectors (eigenstates) represent the state of the system corresponding to those values.
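
    A small sketch using the Pauli-Z matrix as an illustrative observable: its eigenvalues \( \pm 1 \) are the two possible measurement outcomes, and its eigenvectors are the corresponding eigenstates.

    import numpy as np

    sigma_z = np.array([[1, 0],
                        [0, -1]], dtype=complex)  # Pauli-Z observable

    eigvals, eigvecs = np.linalg.eigh(sigma_z)  # Hermitian, so eigh applies
    print(eigvals)   # [-1.  1.]: the measurable values
    print(eigvecs)   # columns are the eigenstates (here, the standard basis vectors)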

  • Stability Analysis of Dynamical Systems

    Eigenvalues of a system's matrix determine the stability of equilibrium points in linear dynamical systems. For example, in continuous systems, if all eigenvalues have negative real parts, the system is stable.
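
    A minimal sketch for a continuous linear system \( \dot{\mathbf{x}} = A\mathbf{x} \) (the system matrix is illustrative):

    import numpy as np

    A = np.array([[-1.0,  2.0],
                  [-2.0, -1.0]])

    eigvals = np.linalg.eigvals(A)
    print(eigvals)                   # complex pair -1 ± 2j
    print(np.all(eigvals.real < 0))  # True: all real parts negative, so stable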

  • Graph Theory

    Eigenvalues of graph matrices (like the adjacency matrix or Laplacian matrix) provide information about the structural properties of graphs, used in spectral clustering and community detection.
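
    A short sketch (the 3-node path graph is illustrative): the graph Laplacian \( L = D - A \) is symmetric, and the multiplicity of its zero eigenvalue equals the number of connected components.

    import numpy as np

    # Adjacency matrix of the path graph 0 - 1 - 2
    A = np.array([[0, 1, 0],
                  [1, 0, 1],
                  [0, 1, 0]], dtype=float)
    D = np.diag(A.sum(axis=1))  # degree matrix
    L = D - A                   # graph Laplacian

    print(np.linalg.eigvalsh(L))  # [0. 1. 3.]: one zero eigenvalue, one connected component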

Implementation

  • Finding Eigenvalues/Eigenvectors with NumPy

    
    import numpy as np
    
    # Define a square matrix
    A = np.array([[4, -2],
                  [1,  1]])
    
    print("Matrix A:\n", A)
    
    # Compute eigenvalues and eigenvectors
    eigenvalues, eigenvectors = np.linalg.eig(A)
    
    print("\nEigenvalues:\n", eigenvalues)
    print("\nEigenvectors (each column is an eigenvector):\n", eigenvectors)
    
    # Verify A*v = lambda*v for the first eigenvalue and eigenvector
    lambda1 = eigenvalues[0]
    v1 = eigenvectors[:, 0] # First column
    
    print("\nVerification for first eigenpair:")
    print("A @ v1 (A * v1):\n", A @ v1)
    print("lambda1 * v1:\n", lambda1 * v1)
    
    # Eigendecomposition for a symmetric matrix
    B = np.array([[3, 1],
                  [1, 2]])
    print("\nSymmetric Matrix B:\n", B)
    eig_vals_B, eig_vecs_B = np.linalg.eig(B)
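    # Note: for symmetric matrices, np.linalg.eigh is preferred; it exploits
    # symmetry and returns real eigenvalues in ascending order.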
    
    # Q is the matrix of eigenvectors
    Q = eig_vecs_B
    # Lambda is the diagonal matrix of eigenvalues
    Lambda = np.diag(eig_vals_B)
    # For a symmetric matrix Q_inv is Q.T
    Q_inv = Q.T # Or np.linalg.inv(Q)
    
    # Reconstruct B: B = Q @ Lambda @ Q_inv
    B_reconstructed = Q @ Lambda @ Q_inv
    print("\nReconstructed B from Q @ Lambda @ Q.T:\n", B_reconstructed)
    print("Original B and Reconstructed B are close:", np.allclose(B, B_reconstructed))

Interview Examples

  • What is the geometric interpretation of an eigenvector? How does a linear transformation affect its eigenvectors?

  • Can a matrix have an eigenvalue of zero? What does it signify? Discuss the implications of a zero eigenvalue.

Practice Questions

1. How would you implement eigenvalue computation in a production environment? (Hard)

   Hint: Consider scalability and efficiency

2. What are the practical applications of eigenvalues? (Medium)

   Hint: Consider both academic and industry use cases

3. Explain the core concepts of eigenvalues. (Easy)

   Hint: Think about the fundamental principles