
Lecture 2: Eigenvalues, Eigenvectors & Eigenvalue Decomposition (EVD)

1. Introduction

Eigenvalues and eigenvectors are fundamental concepts in linear algebra that describe how a linear transformation acts on vectors. Given a square matrix A, a nonzero vector v is an eigenvector of A if multiplying A by v results in a scaled version of v:

A·v = λ·v

Here, λ is called the eigenvalue, and it tells us how much the eigenvector is stretched or shrunk. Eigenvectors give us special directions where transformations act as simple scalings.
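A quick numerical check of this definition (an illustrative sketch using NumPy, not part of the original notes):

```python
import numpy as np

# A symmetric 2x2 matrix with eigenvector [1, 1] and eigenvalue 3
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])

Av = A @ v        # multiplying A by v just scales v by 3
print(Av)         # [3. 3.]
```

Since A·v = 3·v, the direction of v is unchanged; only its length is scaled by the eigenvalue λ = 3.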

2. Eigenvalue Decomposition (EVD)

If a square matrix A has a full set of linearly independent eigenvectors, it can be decomposed as:

A = VΛV⁻¹

where:

- V is the matrix of eigenvectors (columns are eigenvectors).
- Λ is the diagonal matrix of eigenvalues.
- This decomposition is useful in machine learning, physics, and engineering.
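The decomposition can be computed in a few lines (a minimal sketch, assuming NumPy; `np.linalg.eig` returns the eigenvalues and the matrix V of eigenvectors):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

eigvals, V = np.linalg.eig(A)   # eigenvalues and eigenvector matrix V
Lambda = np.diag(eigvals)       # diagonal matrix Λ of eigenvalues

# Reconstruct A as V Λ V⁻¹
A_reconstructed = V @ Lambda @ np.linalg.inv(V)
print(np.allclose(A, A_reconstructed))  # True
```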

3. Example: 2×2 Matrix

Consider the matrix:

    A = [4   2]
        [1   3]
    

Step 1: Find the characteristic equation

det(A - λI) = 0

    |4-λ    2 |
    | 1   3-λ|  = (4-λ)(3-λ) - (2)(1) = λ² - 7λ + 10 = 0
    

Step 2: Solve for eigenvalues

    λ² - 7λ + 10 = 0 → (λ-5)(λ-2) = 0
    → Eigenvalues: λ₁ = 5, λ₂ = 2
    
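The quadratic can be double-checked numerically, for instance with NumPy's polynomial root finder (an illustrative check, not part of the original notes):

```python
import numpy as np

# Roots of λ² - 7λ + 10, coefficients in descending order of degree
roots = np.roots([1.0, -7.0, 10.0])
print(np.sort(roots))  # [2. 5.]
```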

Step 3: Find eigenvectors

For λ₁ = 5, solve (A - 5I)v = 0:

    [-1   2][x]   [0]
    [ 1  -2][y] = [0]  →  x = 2y  →  v₁ = [2, 1]ᵀ

For λ₂ = 2, solve (A - 2I)v = 0:

    [ 2   2][x]   [0]
    [ 1   1][y] = [0]  →  x = -y  →  v₂ = [-1, 1]ᵀ

Step 4: Form EVD

    V = [2  -1]
        [1   1]

    Λ = [5  0]
        [0  2]

    A = VΛV⁻¹
    
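These factors can be verified numerically (a quick check, assuming NumPy):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
V = np.array([[2.0, -1.0],
              [1.0,  1.0]])
Lambda = np.diag([5.0, 2.0])

# V Λ V⁻¹ should reproduce A
print(np.allclose(A, V @ Lambda @ np.linalg.inv(V)))  # True
```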

4. Applications in Machine Learning
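One standard application (not detailed in this version of the notes) is Principal Component Analysis (PCA), which finds the directions of greatest variance in data by taking the EVD of the covariance matrix. A minimal sketch, assuming NumPy and synthetic data:

```python
import numpy as np

# Hypothetical correlated 2-D data for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0],
                                          [1.0, 0.5]])

# PCA via EVD of the covariance matrix
X_centered = X - X.mean(axis=0)
cov = np.cov(X_centered, rowvar=False)
eigvals, eigvecs = np.linalg.eig(cov)

# Sort components by explained variance (largest eigenvalue first)
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]
projected = X_centered @ components  # data in principal-component coordinates
```

The eigenvectors of the covariance matrix are the principal axes, and each eigenvalue is the variance of the data along its axis.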

5. Try It Yourself: Eigenvalue & Eigenvector Calculator (2×2 Matrix)

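The interactive calculator is not available in this text version; a small Python equivalent (a sketch, assuming NumPy; the function name `eigen_2x2` is ours) computes the eigenvalues and eigenvectors of any 2×2 matrix:

```python
import numpy as np

def eigen_2x2(a, b, c, d):
    """Return the eigenvalues and eigenvectors of [[a, b], [c, d]]."""
    A = np.array([[a, b], [c, d]], dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are eigenvectors
    return eigvals, eigvecs

# Using the worked example from Section 3
vals, vecs = eigen_2x2(4, 2, 1, 3)
print(vals)  # eigenvalues 5 and 2 (order may vary)
```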