Lecture 9 — Quadratic Forms

1. Definitions & Basic Properties

A quadratic form on \(\mathbb{R}^n\) is a homogeneous degree-2 polynomial in \(n\) variables:

\[ Q(x) = Q(x_1,\dots,x_n) = \sum_{i,j=1}^n a_{ij} x_i x_j. \]

Every quadratic form can be written using a matrix:

\[ Q(x) = x^T A x, \quad x = \begin{pmatrix}x_1\\\vdots\\x_n\end{pmatrix}, \]

where \(A\) is an \(n\times n\) matrix. Note: \(x^TAx = x^T\left(\frac{A+A^T}{2}\right)x\), so we may (and usually will) assume \(A\) is symmetric.
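The symmetrization step is easy to check numerically; a small sketch using numpy (the example matrix is our own, not from the notes):

```python
import numpy as np

# A non-symmetric matrix defines the same quadratic form as its symmetric part
A = np.array([[1.0, 4.0],
              [0.0, 2.0]])
A_sym = (A + A.T) / 2          # symmetric part (A + A^T)/2

x = np.array([3.0, -1.0])
q1 = x @ A @ x                 # x^T A x
q2 = x @ A_sym @ x             # x^T ((A + A^T)/2) x
print(q1, q2)                  # both print -1.0: the two forms agree
```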

2. Symmetric Matrices & Associated Bilinear Form

If \(A\) is symmetric, define the bilinear form \(B(x,y)=x^T A y\). Then \(Q(x)=B(x,x)\). Key facts:

  • Eigenvalues of a real symmetric matrix are real.
  • There exists an orthogonal matrix \(P\) with \(P^T A P = D\) diagonal (spectral theorem).
  • Diagonal entries of \(D\) are eigenvalues of \(A\).

3. Diagonalization & Principal Axes Theorem

Principal Axes Theorem: For a real symmetric matrix \(A\), there is an orthogonal change of variables \(x = P y\) (so \(P^TP=I\)) such that:

\[ Q(x) = x^T A x = y^T D y = \lambda_1 y_1^2 + \dots + \lambda_n y_n^2, \]

where \(D = \operatorname{diag}(\lambda_1,\dots,\lambda_n)\) and the \(\lambda_i\) are eigenvalues of \(A\). So quadratic forms are reduced to sums of scaled squares in an orthonormal basis.

How to diagonalize in practice:

  1. Find symmetric matrix \(A\) for the form.
  2. Compute eigenvalues and orthonormal eigenvectors of \(A\).
  3. Form orthogonal matrix \(P\) with eigenvectors as columns; then \(P^T A P = D\).
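These steps are exactly what numpy's `eigh` performs for a symmetric matrix; a sketch (the example matrix is ours, not from the notes):

```python
import numpy as np

# Step 1: symmetric matrix of the form Q(x, y) = 2x^2 - 2xy + 2y^2
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

# Steps 2-3: eigh returns ascending eigenvalues and orthonormal eigenvectors
lam, P = np.linalg.eigh(A)     # columns of P are orthonormal eigenvectors
D = P.T @ A @ P                # orthogonal change of basis diagonalizes A

print(lam)                     # eigenvalues 1 and 3
```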

4. Sylvester's Law of Inertia

Sylvester's law of inertia states that under any real non-singular change of variables \(x = S y\) (with \(S\) invertible, not necessarily orthogonal), the numbers of positive, negative and zero coefficients in the diagonalized form are invariant. These counts are called the inertia (or signature) of the quadratic form.

Thus, the numbers:

\[ (n_+, n_-, n_0) \quad \text{(positives, negatives, zeros)} \]

are invariants of the quadratic form (independent of the diagonalizing change-of-basis matrix).
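Sylvester's law can be illustrated numerically: congruence by any invertible \(S\) preserves the counts of positive, negative and zero eigenvalues. A numpy sketch (the matrices are our own examples):

```python
import numpy as np

A = np.diag([2.0, -3.0, 0.0])      # already diagonal: inertia (1, 1, 1)

S = np.array([[1.0, 2.0, 0.0],     # invertible (det = 3), not orthogonal
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

B = S.T @ A @ S                    # the same form after the substitution x = S y
lam = np.linalg.eigvalsh(B)        # eigenvalues of B are 2 - 2*sqrt(10), 0, 2 + 2*sqrt(10)

tol = 1e-10
inertia = (int(np.sum(lam > tol)),
           int(np.sum(lam < -tol)),
           int(np.sum(np.abs(lam) <= tol)))
print(inertia)                     # (1, 1, 1): unchanged by S
```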

5. Positive definite, negative definite, semidefinite and indefinite

Let \(Q(x)=x^T A x\) with \(A\) symmetric. Then:

  • Positive definite: \(Q(x) > 0\) for all \(x \ne 0\) — equivalently all eigenvalues of \(A\) are \(>0\).
  • Positive semidefinite: \(Q(x) \ge 0\) for all \(x\) — eigenvalues \(\ge 0\).
  • Negative definite: \(Q(x) < 0\) for all \(x \ne 0\) — equivalently all eigenvalues of \(A\) are \(<0\).
  • Indefinite: \(Q\) takes both positive and negative values — equivalently \(A\) has eigenvalues of both signs.

Tests for positive-definiteness (symmetric \(A\)):

  1. Check eigenvalues are all positive.
  2. Check all leading principal minors (Sylvester's criterion) are positive.
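Both tests are one-liners in numpy; a sketch on a sample symmetric matrix (the matrix is ours, not from the notes):

```python
import numpy as np

A = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])

# Test 1: all eigenvalues positive
eig_ok = bool(np.all(np.linalg.eigvalsh(A) > 0))

# Test 2 (Sylvester): all leading principal minors positive
minors = [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]
minors_ok = all(m > 0 for m in minors)

print(eig_ok, minors_ok)   # both tests agree: True True (minors are 2, 3, 4)
```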

1. Positive definite

For a symmetric real matrix \(A\in\mathbb{R}^{n\times n}\), the following tests are equivalent and can be used to check positive definiteness.

a. Quadratic Form Test

\(A\) is positive definite iff for every nonzero vector \(x\in\mathbb{R}^n\): \[ x^T A x > 0. \]

Example:
\(A = \begin{bmatrix}2 & 0 \\ 0 & 3\end{bmatrix}\)

Then \(x^T A x = 2x_1^2 + 3x_2^2 > 0\) for any nonzero \(x\), so \(A\) is positive definite.

b. Eigenvalue Test

\(A\) is positive definite iff all eigenvalues of \(A\) are strictly positive.

Example:
\(A = \begin{bmatrix}4 & 1 \\ 1 & 3\end{bmatrix}\)

The characteristic polynomial is \(\lambda^2-7\lambda+11=0\), so the eigenvalues are \(\lambda=\frac{7\pm\sqrt{5}}{2}\approx 4.62\) and \(2.38\) (both > 0); hence \(A\) is positive definite.

c. Sylvester's Criterion (Leading Principal Minors)

A symmetric matrix is positive definite iff all its leading principal minors are positive. That is, the determinants of the top-left \(k\times k\) submatrices must all be > 0 for \(k=1,\dots,n\).

Example:
\(A = \begin{bmatrix}2 & -1 \\ -1 & 2\end{bmatrix}\)

Leading principal minors: \(2 > 0\) and \(\det A = 4 - 1 = 3 > 0\). So by Sylvester's criterion, \(A\) is positive definite.

d. Cholesky Decomposition Test

A symmetric matrix \(A\) is positive definite iff there exists a lower-triangular matrix \(L\) with positive diagonal entries such that \[ A = L L^T. \]

Example:
\(A = \begin{bmatrix}4 & 2 \\ 2 & 3\end{bmatrix}\)

Cholesky decomposition produces \(L = \begin{bmatrix}2 & 0 \\ 1 & \sqrt{2}\end{bmatrix}\) and indeed \(LL^T = A\). Therefore \(A\) is positive definite.
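In numpy this test is direct, since `np.linalg.cholesky` raises an error exactly when the matrix is not positive definite; a sketch (the helper name `is_positive_definite` is ours):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# np.linalg.cholesky returns lower-triangular L with A = L @ L.T;
# it raises LinAlgError precisely when A is not positive definite.
def is_positive_definite(M):
    try:
        np.linalg.cholesky(M)
        return True
    except np.linalg.LinAlgError:
        return False

L = np.linalg.cholesky(A)        # L = [[2, 0], [1, sqrt(2)]]
print(is_positive_definite(A))   # True
```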

2. Positive Semidefinite

A real symmetric matrix \(A\in\mathbb{R}^{n\times n}\) is positive semidefinite (PSD) iff it satisfies any of the equivalent conditions below (the semidefinite analogues of the definite-case tests):

a. Quadratic Form Test

\(A\) is PSD iff for every vector \(x\in\mathbb{R}^n\): \[ x^T A x \ge 0. \]

Example:
\(A = \begin{bmatrix}1 & -1 \\ -1 & 1\end{bmatrix}\)

For \(x=(1,1)^T\), \(x^T A x = 0\); in general \(x^T A x = (x_1-x_2)^2 \ge 0\). So \(A\) is PSD but not positive definite.

b. Eigenvalue Test

\(A\) is PSD iff all eigenvalues of \(A\) are nonnegative (\(\lambda_i \ge 0\)).

Example:
\(A = \begin{bmatrix}1 & 0 \\ 0 & 0\end{bmatrix}\)

Eigenvalues are \(1\) and \(0\) (nonnegative) → PSD.

c. Principal Minors

A symmetric matrix is PSD iff all of its principal minors (determinants of all principal submatrices, not only the leading ones) are \(\ge 0\). Caution: nonnegativity of the leading principal minors alone is necessary but not sufficient; for example, \(\operatorname{diag}(0,-1)\) has leading minors \(0\) and \(0\) yet is not PSD.

Example:
\(A = \begin{bmatrix}1 & -1 & 0 \\ -1 & 1 & 0 \\ 0 & 0 & 0\end{bmatrix}\)

All seven principal minors are \(\ge 0\), so \(A\) is PSD; in practice the eigenvalue check (here \(\lambda = 2, 0, 0\)) is cheaper than enumerating every principal minor.

d. Factorization / Decomposition

A symmetric PSD matrix admits a decomposition \(A = B B^T\) for some (possibly rank-deficient) matrix \(B\). Equivalently, an \(LDL^T\) decomposition exists with diagonal \(D\) having nonnegative entries.

Example:
\(A = \begin{bmatrix}1 & -1 \\ -1 & 1\end{bmatrix} = \begin{bmatrix}1 \\ -1\end{bmatrix}\begin{bmatrix}1 & -1\end{bmatrix}\)

Here \(B=\begin{bmatrix}1 \\ -1\end{bmatrix}\) and \(A=BB^T\); rank is 1, so PSD but not PD.
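The direction \(A = BB^T \Rightarrow\) PSD is easy to verify numerically; a numpy sketch of this example:

```python
import numpy as np

B = np.array([[1.0],
              [-1.0]])
A = B @ B.T                    # Gram-type product: automatically symmetric and PSD

lam = np.linalg.eigvalsh(A)    # eigenvalues 0 and 2
rank = np.linalg.matrix_rank(A)

print(lam.min() >= -1e-12, rank)   # PSD of rank 1, hence not positive definite
```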

Practical checks

Note: all these tests assume the matrix is real symmetric (or complex Hermitian). For a general square matrix, apply them to the symmetric part \(\frac{A+A^T}{2}\), which defines the same quadratic form.

3. Negative Definite and Indefinite

For a real symmetric matrix \(A\), we can classify using quadratic forms, eigenvalues, or principal minors.

a. Negative Definite

\(A\) is negative definite if: \[ x^T A x < 0, \; \forall x \ne 0. \]

Example:
\(A = \begin{bmatrix}-2 & 0 \\ 0 & -3\end{bmatrix}\)

Eigenvalues: -2, -3 (both < 0) → Negative definite.

b. Negative Semidefinite

\(A\) is negative semidefinite (NSD) if: \[ x^T A x \le 0, \; \forall x. \]

Example:
\(A = \begin{bmatrix}0 & 0 \\ 0 & -2\end{bmatrix}\)

Eigenvalues: 0, -2 (both \(\le 0\)) → NSD.

c. Indefinite

\(A\) is indefinite if it has both positive and negative values of \(x^T A x\). Equivalently, eigenvalues are mixed (some positive, some negative).

Example:
\(A = \begin{bmatrix}1 & 0 \\ 0 & -1\end{bmatrix}\)

Eigenvalues: \(1\) and \(-1\) → indefinite. For \(x=(1,0)^T\) the form equals \(1\); for \(x=(0,1)^T\) it equals \(-1\).

Summary Table

| Type | Eigenvalues | Quadratic Form |
| --- | --- | --- |
| Positive definite | All > 0 | Always > 0 |
| Positive semidefinite | All ≥ 0 | Always ≥ 0 |
| Negative definite | All < 0 | Always < 0 |
| Negative semidefinite | All ≤ 0 | Always ≤ 0 |
| Indefinite | Mixed signs | Sometimes +, sometimes − |
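The table translates directly into a small eigenvalue-based classifier; a numpy sketch (the function name `classify` and the tolerance handling are our additions):

```python
import numpy as np

def classify(A, tol=1e-10):
    """Classify a symmetric matrix by the signs of its eigenvalues
    (mirrors the summary table above)."""
    lam = np.linalg.eigvalsh(A)
    if np.all(lam > tol):
        return "positive definite"
    if np.all(lam < -tol):
        return "negative definite"
    if np.all(lam >= -tol):
        return "positive semidefinite"
    if np.all(lam <= tol):
        return "negative semidefinite"
    return "indefinite"

print(classify(np.diag([2.0, 3.0])))    # positive definite
print(classify(np.diag([1.0, 0.0])))    # positive semidefinite
print(classify(np.diag([1.0, -1.0])))   # indefinite
```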

6. Worked Examples (step-by-step)

Example 1 — A simple 2×2 form

Reduce \(Q(x,y)=4x^2+4xy+y^2\) to principal axes and classify.

Solution

Matrix form: \(A=\begin{pmatrix}4 & 2\\2 & 1\end{pmatrix}\) because \(x^TAx = [x\ y]\begin{pmatrix}4 & 2\\2 & 1\end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix} = 4x^2+4xy+y^2.\)

Eigenvalues: solve \(\det(A-\lambda I)=0\):

\[\det\begin{pmatrix}4-\lambda & 2\\2 & 1-\lambda\end{pmatrix}=(4-\lambda)(1-\lambda)-4=\lambda^2-5\lambda=\lambda(\lambda-5)=0.\]

So eigenvalues \(\lambda_1=0,\ \lambda_2=5\). Eigenvectors: for \(\lambda=5\), solving \((A-5I)v=0\) gives \(v=(2,1)^T\) (check: \(Av=(10,5)^T=5v\)). For \(\lambda=0\), eigenvector \(v=(1,-2)^T\). The two are orthogonal, as expected for a symmetric matrix.

Normalizing these eigenvectors gives the orthonormal columns of \(P\). Transform \(x=Py\) to obtain \(Q=0\cdot y_1^2+5y_2^2\). Since one eigenvalue is positive and the other is zero, the form is positive semidefinite (not definite).
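As a spot-check on this example, numpy's `eigh` reproduces the eigendata (a verification sketch, not part of the original solution):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 1.0]])
lam, P = np.linalg.eigh(A)     # ascending order: eigenvalues 0 and 5

# the lambda = 5 principal axis, up to sign, is (2, 1)/sqrt(5)
axis = P[:, 1]
print(lam, np.abs(axis))
```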

Example 2 — Completing the square (alternative)

Reduce \(Q(x,y)=3x^2+4xy+2y^2\).

Solution

Complete the square: \(3x^2+4xy+2y^2 = 3\left(x^2+\frac{4}{3}xy\right)+2y^2 = 3\left(x+\frac{2}{3}y\right)^2 - 3\cdot\frac{4}{9}y^2 + 2y^2.\)

Simplify: \(=3\left(x+\frac{2}{3}y\right)^2 + \left(2 - \frac{4}{3}\right)y^2 = 3\left(x+\frac{2}{3}y\right)^2 + \frac{2}{3}y^2.\)

Both coefficients positive — positive definite.
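A quick numerical spot-check of the completed-square identity (numpy sketch; the sample points are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
x, y = rng.normal(size=(2, 5))              # five random sample points

q = 3*x**2 + 4*x*y + 2*y**2                 # original form
completed = 3*(x + 2*y/3)**2 + (2/3)*y**2   # completed-square form

print(np.allclose(q, completed))            # True: the identity holds
print(bool(np.all(q >= 0)))                 # True: the form is nonnegative
```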

Example 3 — 3×3 diagonalization

Reduce \(Q(x)=x_1^2 + 2x_1x_2 + x_2^2 + x_3^2\).

Solution

Matrix: \(A=\begin{pmatrix}1 & 1 & 0\\1 & 1 & 0\\0 & 0 & 1\end{pmatrix}\). The first 2×2 block has eigenvalues 0 and 2 with eigenvectors \((1,-1,0)^T\) and \((1,1,0)^T\). The \(x_3\) direction has eigenvalue 1. Thus diagonal form: \(2y_1^2 + 1\cdot y_2^2 + 0\cdot y_3^2\) up to ordering.

Example 4 — 2×2: diagonalization with arithmetic

Reduce \(Q(x,y)=x^2+6xy+10y^2\) and classify.

Solution (step-by-step)

Matrix form: \(A=\begin{pmatrix}1 & 3\\3 & 10\end{pmatrix}\) because the cross-term 6xy is split as 3 in each off-diagonal entry.

Characteristic polynomial:

\[\det\begin{pmatrix}1-\lambda & 3\\3 & 10-\lambda\end{pmatrix}=(1-\lambda)(10-\lambda)-9=\lambda^2-11\lambda+1.\]

Solve \(\lambda^2-11\lambda+1=0\):

\[\lambda=\frac{11\pm\sqrt{121-4}}{2}=\frac{11\pm\sqrt{117}}{2}.\]

Both eigenvalues are positive (since \(11-\sqrt{117}>0\) because \(\sqrt{117}\approx10.82\)), hence \(A\) is positive definite.

Eigenvectors: compute for each eigenvalue by solving \((A-\lambda I)v=0\). Normalize to get orthonormal \(P\) and diagonal \(D\). The diagonal form will be \(\lambda_1 y_1^2 + \lambda_2 y_2^2\) with both \(\lambda_i>0\).
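The arithmetic above can be confirmed numerically (a verification sketch, not part of the original solution):

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [3.0, 10.0]])
lam = np.linalg.eigvalsh(A)

# roots of lambda^2 - 11*lambda + 1 = 0
expected = np.array([(11 - np.sqrt(117)) / 2, (11 + np.sqrt(117)) / 2])
print(np.allclose(lam, expected), bool(np.all(lam > 0)))   # True True
```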

Example 5 — 3×3: full eigen-decomposition (explicit)

Take \(A=\begin{pmatrix}1 & 2 & 0\\2 & 1 & 0\\0 & 0 & 3\end{pmatrix}\). We'll find eigenvalues and orthonormal eigenvectors.

Solution

Block structure: top-left 2×2 block is \(B=\begin{pmatrix}1&2\\2&1\end{pmatrix}\). Characteristic polynomial of \(B\):

\[\det(B-\lambda I)=(1-\lambda)^2-4=\lambda^2-2\lambda-3=(\lambda-3)(\lambda+1).\]

So eigenvalues for the block are \(3\) and \(-1\). Eigenvectors for the 2×2 block:

  • For \(\lambda=3\): solve \((B-3I)v=0\) gives \(\begin{pmatrix}-2&2\\2&-2\end{pmatrix}v=0\) so \(v=(1,1)^T\).
  • For \(\lambda=-1\): \(v=(1,-1)^T\).

Extend to 3×3 by adding the \(x_3\) axis (eigenvalue 3). Thus eigenvalues: \(3\) (twice) and \(-1\) (once). One must orthonormalize the eigenvectors: for example take

\[u_1=\frac{1}{\sqrt{2}}(1,1,0)^T,\quad u_2=\frac{1}{\sqrt{2}}(1,-1,0)^T,\quad u_3=(0,0,1)^T.\]

Note: \(u_1\) and \(u_3\) correspond to eigenvalue 3, so there is a two-dimensional eigenspace for \(\lambda=3\). Choose any orthonormal basis of that eigenspace. The diagonal form becomes \(3y_1^2+3y_2^2-1\cdot y_3^2\) (ordering may vary). Signature: (2,1,0) — two positives, one negative.
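The eigenvalues, orthogonality of \(P\), and the signature can all be confirmed numerically (a verification sketch, not part of the original solution):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [2.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])
lam, P = np.linalg.eigh(A)     # ascending order: -1, 3, 3

n_pos = int(np.sum(lam > 0))
n_neg = int(np.sum(lam < 0))
print((n_pos, n_neg, 3 - n_pos - n_neg))   # (2, 1, 0)
print(np.allclose(P.T @ P, np.eye(3)))     # True: the columns of P are orthonormal
```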

7. Practical tips & computational remarks
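One remark worth making explicit: numerically computed eigenvalues of a singular matrix are almost never exactly zero, so semidefiniteness and rank checks need a tolerance. A numpy sketch (the tolerance scaling is our own choice, not from the notes):

```python
import numpy as np

v = np.array([1.0, -1.0, 2.0])
A = np.outer(v, v)                    # exact rank 1, PSD, top eigenvalue ||v||^2 = 6

lam = np.linalg.eigvalsh(A)
tol = 1e-12 * np.max(np.abs(lam))     # tolerance scaled to the largest eigenvalue
is_psd = bool(np.all(lam >= -tol))    # allow tiny negative roundoff
num_rank = int(np.sum(lam > tol))     # numerical rank: eigenvalues above tolerance

print(is_psd, num_rank)               # True 1
```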

8. Exercises (with answer sketches)

  1. Classify the quadratic form \(Q(x,y)=x^2+6xy+10y^2\). (Hint: diagonalize or complete square.)
  2. Show that \(Q(x)=2x_1^2+2x_1x_2+2x_2^2\) is positive semidefinite and find its rank.
  3. Find the signature (\(n_+,n_-,n_0\)) of the quadratic form given by \(A=\begin{pmatrix}0 & 1 & 0\\1 & 0 & 0\\0 & 0 & -2\end{pmatrix}.\)
  4. Use Sylvester's criterion to determine if \(A=\begin{pmatrix}2 & -1 & 0\\-1 & 2 & -1\\0 & -1 & 2\end{pmatrix}\) is positive definite.
Answers / Sketches
  1. Complete the square: \(x^2+6xy+10y^2=(x+3y)^2 + y^2\) → positive definite (both coefficients >0).
  2. Matrix: \(A=\begin{pmatrix}2 & 1\\1 & 2\end{pmatrix}\). Characteristic polynomial: \(\lambda^2-4\lambda+3=(\lambda-1)(\lambda-3)\), so the eigenvalues are \(1\) and \(3\). Both are positive, so \(Q\) is in fact positive definite (hence also positive semidefinite) and its rank is 2.
  3. Matrix has eigenvalues \{1,-1,-2\} so signature is (1,2,0) — one positive, two negative, no zero. (Compute eigenvalues of the 2×2 block: \(\begin{pmatrix}0&1\\1&0\end{pmatrix}\) has eigenvalues \(1,-1\)).
  4. Leading principal minors: 2 > 0, \(\det\begin{pmatrix}2&-1\\-1&2\end{pmatrix}=3>0\), full det = 4 >0 → positive definite by Sylvester.