Eigenvalues and eigenvectors are fundamental concepts in linear algebra that describe how a linear transformation acts on vectors. Given a square matrix A, a nonzero vector v is an eigenvector of A if multiplying A by v yields a scaled copy of v:
A·v = λ·v
Here, λ is called the eigenvalue, and it tells us how much the eigenvector is stretched or shrunk. Eigenvectors give us special directions where transformations act as simple scalings.
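This relationship is easy to check numerically. The sketch below uses NumPy's `np.linalg.eig` on an arbitrary sample 2×2 matrix and verifies that A·v = λ·v holds for one eigenpair:

```python
import numpy as np

# An arbitrary 2x2 sample matrix for illustration.
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify A·v = λ·v for the first eigenpair.
v = eigenvectors[:, 0]
lam = eigenvalues[0]
print(np.allclose(A @ v, lam * v))  # True
```

Each column of `eigenvectors` is normalized to unit length, but any nonzero scalar multiple of an eigenvector is also an eigenvector.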
A square matrix A with a full set of linearly independent eigenvectors can be decomposed as:
A = VΛV⁻¹
where:
- V is the matrix whose columns are the eigenvectors of A.
- Λ is the diagonal matrix of the corresponding eigenvalues.
This eigendecomposition is widely used in machine learning, physics, and engineering.
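The decomposition can be built and checked directly with NumPy; this is a minimal sketch, assuming an arbitrary sample matrix:

```python
import numpy as np

# An arbitrary sample matrix (assumed for illustration).
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# Eigenvalues and eigenvector matrix V (eigenvectors as columns).
eigvals, V = np.linalg.eig(A)
Lam = np.diag(eigvals)  # diagonal matrix of eigenvalues

# Reconstruct A from its eigendecomposition: A = V Λ V⁻¹.
A_reconstructed = V @ Lam @ np.linalg.inv(V)
print(np.allclose(A, A_reconstructed))  # True
```

Note that not every square matrix admits this decomposition; it requires the eigenvectors to span the full space, which is what makes V invertible.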
Consider the matrix:
A = [4 2]
[1 3]
The eigenvalues are the roots of the characteristic equation det(A − λI) = 0:
det(A − λI) = (4−λ)(3−λ) − 2·1 = λ² − 7λ + 10 = 0
(λ − 5)(λ − 2) = 0 → Eigenvalues: λ₁ = 5, λ₂ = 2
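As a quick numerical check, the roots of the characteristic polynomial λ² − 7λ + 10 can be found with NumPy's `np.roots`:

```python
import numpy as np

# Coefficients of λ² − 7λ + 10, highest degree first.
roots = np.roots([1, -7, 10])
print(np.sort(roots))  # ≈ [2. 5.]
```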
Solving (A − λI)v = 0 for each eigenvalue gives the eigenvectors v₁ = (2, 1)ᵀ for λ₁ = 5 and v₂ = (−1, 1)ᵀ for λ₂ = 2, so:
V = [2  -1]
    [1   1]
Λ = [5  0]
    [0  2]
A = VΛV⁻¹
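The hand-computed factors can be verified by multiplying them back together; this sketch checks that V Λ V⁻¹ recovers the original matrix:

```python
import numpy as np

# The matrix from the worked example and its hand-computed factors.
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
V = np.array([[2.0, -1.0],
              [1.0,  1.0]])
Lam = np.diag([5.0, 2.0])

# V Λ V⁻¹ should reproduce A.
print(V @ Lam @ np.linalg.inv(V))  # ≈ [[4. 2.] [1. 3.]]
```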