# 2.7 MCQs-Diagonalization of Matrices

## 2.7 Diagonalization of Matrices

### Detailed Theory: Diagonalization of Matrices

#### **1. Introduction to Diagonalization**

**1.1 What is Diagonalization?**

Diagonalization is the process of transforming a square matrix into a diagonal matrix using similarity transformation.

A square matrix $$A$$ is **diagonalizable** if there exists an invertible matrix $$P$$ and a diagonal matrix $$D$$ such that:

$$
P^{-1}AP = D
$$

or equivalently:

$$
A = PDP^{-1}
$$

**1.2 Why Diagonalize?**

Diagonalization simplifies matrix operations:

1. **Easy powers:** $$A^n = PD^nP^{-1}$$
2. **Easy exponentials:** $$e^A = Pe^DP^{-1}$$
3. **System analysis:** Simplifies solving systems of differential equations
4. **Matrix functions:** Easy computation of functions of matrices
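Point 1 can be demonstrated directly with `numpy.linalg.eig` (one of the tools listed in Section 11.3); the matrix below is the 2×2 example diagonalized later in this section, and the comparison against repeated multiplication is just a sanity check.

```python
import numpy as np

# Diagonalizable matrix with distinct eigenvalues 2 and 3
A = np.array([[4.0, -1.0],
              [2.0, 1.0]])

# eig returns the eigenvalues and a matrix P whose columns are eigenvectors
eigvals, P = np.linalg.eig(A)

# A^5 via the eigendecomposition: P D^5 P^{-1}
A5 = P @ np.diag(eigvals**5) @ np.linalg.inv(P)

# agrees with direct repeated multiplication
assert np.allclose(A5, np.linalg.matrix_power(A, 5))
```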

***

#### **2. Conditions for Diagonalization**

**2.1 Theorem: Diagonalizability Criterion**

An $$n \times n$$ matrix $$A$$ is diagonalizable if and only if it has $$n$$ linearly independent eigenvectors.

**Equivalently:** $$A$$ is diagonalizable if and only if the sum of dimensions of its eigenspaces equals $$n$$.

**2.2 Important Cases**

**a) Sufficient Condition**

If $$A$$ has $$n$$ distinct eigenvalues, then $$A$$ is diagonalizable.

**Proof sketch:** Eigenvectors corresponding to distinct eigenvalues are linearly independent, so $$n$$ distinct eigenvalues yield $$n$$ independent eigenvectors.

**Example:**

$$
A = \begin{bmatrix}
2 & 1 \\
0 & 3
\end{bmatrix}
$$

has eigenvalues 2 and 3 (distinct), so it is diagonalizable.

**b) Necessary Condition**

If $$A$$ is diagonalizable, then the diagonal entries of $$D$$ are precisely the eigenvalues of $$A$$, repeated according to multiplicity. Knowing the eigenvalues alone, however, is not sufficient to conclude diagonalizability.

**c) Special Cases:**

1. **Real symmetric matrices** are always diagonalizable
2. **Normal matrices** ($$AA^* = A^*A$$) are unitarily diagonalizable
3. Over the complex numbers every characteristic polynomial factors completely into linear factors, but this alone does not guarantee diagonalizability: defective matrices (Section 5) remain non-diagonalizable even when complex entries are allowed

***

#### **3. Steps for Diagonalization**

**3.1 The Diagonalization Process**

To diagonalize an $$n \times n$$ matrix $$A$$:

**Step 1:** Find the eigenvalues $$\lambda_1, \lambda_2, \ldots, \lambda_n$$ of $$A$$

**Step 2:** For each eigenvalue $$\lambda_i$$, find a basis for its eigenspace $$E_{\lambda_i}$$

**Step 3:** If the total number of independent eigenvectors equals $$n$$, then:

* $$P$$ = matrix with eigenvectors as columns
* $$D$$ = diagonal matrix with eigenvalues on diagonal
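The three steps can be sketched as a small NumPy helper; the function name `diagonalize` and the rank tolerance are our own choices, and testing defectiveness via the rank of the eigenvector matrix is a numerical heuristic, not a rigorous criterion.

```python
import numpy as np

def diagonalize(A, tol=1e-10):
    """Return (P, D) with A = P D P^{-1}, or None if A appears defective."""
    eigvals, P = np.linalg.eig(A)       # Steps 1-2: eigenvalues, eigenvectors
    # Step 3: n independent eigenvectors <=> eigenvector matrix has full rank
    if np.linalg.matrix_rank(P, tol=tol) < A.shape[0]:
        return None
    return P, np.diag(eigvals)

result = diagonalize(np.array([[4.0, -1.0],
                               [2.0, 1.0]]))
```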

**3.2 Detailed Example**

Diagonalize $$A = \begin{bmatrix} 4 & -1 \\ 2 & 1 \end{bmatrix}$$

**Step 1: Find eigenvalues**

Characteristic equation: $$\det(A - \lambda I) = 0$$

$$
\begin{vmatrix}
4-\lambda & -1 \\
2 & 1-\lambda
\end{vmatrix} = (4-\lambda)(1-\lambda) + 2 = 0
$$

$$
\lambda^2 - 5\lambda + 4 + 2 = \lambda^2 - 5\lambda + 6 = 0
$$

$$
(\lambda-2)(\lambda-3) = 0
$$

Eigenvalues: $$\lambda_1 = 2$$, $$\lambda_2 = 3$$

**Step 2: Find eigenvectors**

For $$\lambda_1 = 2$$: Solve $$(A - 2I)X = O$$

$$
\begin{bmatrix}
2 & -1 \\
2 & -1
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0
\end{bmatrix}
$$

Equation: $$2x - y = 0 \Rightarrow y = 2x$$

Eigenvector: $$v_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$$ or any multiple

For $$\lambda_2 = 3$$: Solve $$(A - 3I)X = O$$

$$
\begin{bmatrix}
1 & -1 \\
2 & -2
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0
\end{bmatrix}
$$

Equation: $$x - y = 0 \Rightarrow y = x$$

Eigenvector: $$v_2 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$$ or any multiple

**Step 3: Form P and D**

$$
P = \begin{bmatrix}
1 & 1 \\
2 & 1
\end{bmatrix}
$$

(columns are eigenvectors)

$$
D = \begin{bmatrix}
2 & 0 \\
0 & 3
\end{bmatrix}
$$

(diagonal entries are eigenvalues)

**Verification:** Check $$P^{-1}AP = D$$

First find $$P^{-1}$$:

$$
P^{-1} = \frac{1}{1\cdot1 - 1\cdot2} \begin{bmatrix}
1 & -1 \\
-2 & 1
\end{bmatrix} = \begin{bmatrix}
-1 & 1 \\
2 & -1
\end{bmatrix}
$$

Check:

$$
P^{-1}AP = \begin{bmatrix}
-1 & 1 \\
2 & -1
\end{bmatrix}
\begin{bmatrix}
4 & -1 \\
2 & 1
\end{bmatrix}
\begin{bmatrix}
1 & 1 \\
2 & 1
\end{bmatrix}
$$

First compute $$AP$$:

$$
AP = \begin{bmatrix}
4 & -1 \\
2 & 1
\end{bmatrix}
\begin{bmatrix}
1 & 1 \\
2 & 1
\end{bmatrix} = \begin{bmatrix}
2 & 3 \\
4 & 3
\end{bmatrix}
$$

Now $$P^{-1}(AP)$$:

$$
\begin{bmatrix}
-1 & 1 \\
2 & -1
\end{bmatrix}
\begin{bmatrix}
2 & 3 \\
4 & 3
\end{bmatrix} = \begin{bmatrix}
2 & 0 \\
0 & 3
\end{bmatrix} = D
$$

***

#### **4. Diagonalization of Symmetric Matrices**

**4.1 Spectral Theorem for Real Symmetric Matrices**

If $$A$$ is a real symmetric matrix ($$A = A^T$$), then:

1. All eigenvalues of $$A$$ are real
2. $$A$$ is orthogonally diagonalizable
3. There exists an orthogonal matrix $$Q$$ such that:

$$
Q^TAQ = D \quad \text{or} \quad A = QDQ^T
$$

where $$Q^{-1} = Q^T$$
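For real symmetric input, `numpy.linalg.eigh` computes exactly this factorization (eigenvalues in ascending order, orthonormal eigenvectors as columns); a minimal check on the symmetric matrix used in the example below:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])            # real symmetric: A == A.T

eigvals, Q = np.linalg.eigh(A)        # eigh exploits symmetry

# Q is orthogonal and Q^T A Q is diagonal
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(Q.T @ A @ Q, np.diag(eigvals))
```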

**4.2 Orthogonal Diagonalization Process**

For symmetric matrix $$A$$:

**Step 1:** Find eigenvalues (all real)

**Step 2:** For each eigenvalue, find eigenvectors

**Step 3:** Apply Gram-Schmidt process if needed to get orthonormal eigenvectors

**Step 4:** Form $$Q$$ with orthonormal eigenvectors as columns

**Example:** Diagonalize $$A = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix}$$ (symmetric)

**Step 1:** Eigenvalues:

$$
\det(A - \lambda I) = \begin{vmatrix}
1-\lambda & 2 \\
2 & 1-\lambda
\end{vmatrix} = (1-\lambda)^2 - 4 = \lambda^2 - 2\lambda - 3 = 0
$$

$$
(\lambda-3)(\lambda+1) = 0
$$

Eigenvalues: $$\lambda_1 = 3$$, $$\lambda_2 = -1$$

**Step 2:** Eigenvectors:

For $$\lambda_1 = 3$$:

$$
\begin{bmatrix}
-2 & 2 \\
2 & -2
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0
\end{bmatrix} \Rightarrow -2x + 2y = 0 \Rightarrow y = x
$$

Eigenvector: $$v_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$$

For $$\lambda_2 = -1$$:

$$
\begin{bmatrix}
2 & 2 \\
2 & 2
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0
\end{bmatrix} \Rightarrow 2x + 2y = 0 \Rightarrow y = -x
$$

Eigenvector: $$v_2 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}$$

**Step 3:** Normalize eigenvectors:

$$
|v_1| = \sqrt{1^2 + 1^2} = \sqrt{2}
$$

$$
q_1 = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{bmatrix}
$$

$$
|v_2| = \sqrt{1^2 + (-1)^2} = \sqrt{2}
$$

$$
q_2 = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ -1 \end{bmatrix} = \begin{bmatrix} \frac{1}{\sqrt{2}} \\ -\frac{1}{\sqrt{2}} \end{bmatrix}
$$

**Step 4:** Form $$Q$$ and $$D$$:

$$
Q = \begin{bmatrix}
\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\
\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}}
\end{bmatrix}
$$

$$
D = \begin{bmatrix}
3 & 0 \\
0 & -1
\end{bmatrix}
$$

**Verification:**

$$
Q^TQ = I \quad \text{and} \quad Q^TAQ = D
$$

***

#### **5. Non-Diagonalizable Matrices (Defective Matrices)**

**5.1 Definition**

A matrix is **defective** if it does not have enough eigenvectors to form a basis of $$\mathbb{R}^n$$ (or $$\mathbb{C}^n$$).

**Example:** $$A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$$

Eigenvalues: $$\lambda = 1$$ (double root)

Characteristic polynomial: $$(\lambda-1)^2 = 0$$

Eigenvectors: Solve $$(A-I)X = O$$

$$
\begin{bmatrix}
0 & 1 \\
0 & 0
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0
\end{bmatrix} \Rightarrow y = 0
$$

Only one independent eigenvector: $$\begin{bmatrix} 1 \\ 0 \end{bmatrix}$$

So $$A$$ is not diagonalizable.
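Numerically, defectiveness shows up as a gap between multiplicities: the geometric multiplicity is the nullity of $$A - \lambda I$$, computable from a rank. A sketch for the example above:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0                 # eigenvalue with algebraic multiplicity 2

n = A.shape[0]
# geometric multiplicity = dim null(A - lam*I) = n - rank(A - lam*I)
geo_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))

assert geo_mult == 1      # < 2, so A is defective (not diagonalizable)
```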

**5.2 Jordan Canonical Form**

For non-diagonalizable matrices, we can use Jordan form, which is "almost" diagonal.

**Jordan Block:** For an eigenvalue $$\lambda$$ whose geometric multiplicity is less than its algebraic multiplicity $$m$$, the Jordan form contains a block of the form:

$$
J = \begin{bmatrix}
\lambda & 1 & 0 & \cdots & 0 \\
0 & \lambda & 1 & \cdots & 0 \\
\vdots & \vdots & \ddots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda & 1 \\
0 & 0 & \cdots & 0 & \lambda
\end{bmatrix}
$$

***

#### **6. Applications of Diagonalization**

**6.1 Computing Powers of Matrices**

If $$A = PDP^{-1}$$, then:

$$
A^n = PD^nP^{-1}
$$

This is efficient because $$D^n$$ is trivial to compute: simply raise each diagonal entry to the power $$n$$.

**Example:** Compute $$A^{10}$$ for $$A = \begin{bmatrix} 4 & -1 \\ 2 & 1 \end{bmatrix}$$

From previous example: $$A = PDP^{-1}$$ with

$$
P = \begin{bmatrix}
1 & 1 \\
2 & 1
\end{bmatrix}, \quad
D = \begin{bmatrix}
2 & 0 \\
0 & 3
\end{bmatrix}
$$

$$
A^{10} = PD^{10}P^{-1} = \begin{bmatrix}
1 & 1 \\
2 & 1
\end{bmatrix}
\begin{bmatrix}
2^{10} & 0 \\
0 & 3^{10}
\end{bmatrix}
\begin{bmatrix}
-1 & 1 \\
2 & -1
\end{bmatrix}
$$

$$
= \begin{bmatrix}
1 & 1 \\
2 & 1
\end{bmatrix}
\begin{bmatrix}
1024 & 0 \\
0 & 59049
\end{bmatrix}
\begin{bmatrix}
-1 & 1 \\
2 & -1
\end{bmatrix}
$$

First multiply:

$$
\begin{bmatrix}
1 & 1 \\
2 & 1
\end{bmatrix}
\begin{bmatrix}
1024 & 0 \\
0 & 59049
\end{bmatrix} = \begin{bmatrix}
1024 & 59049 \\
2048 & 59049
\end{bmatrix}
$$

Then:

$$
\begin{bmatrix}
1024 & 59049 \\
2048 & 59049
\end{bmatrix}
\begin{bmatrix}
-1 & 1 \\
2 & -1
\end{bmatrix} = \begin{bmatrix}
117074 & -58025 \\
116050 & -57001
\end{bmatrix}
$$

So

$$
A^{10} = \begin{bmatrix}
117074 & -58025 \\
116050 & -57001
\end{bmatrix}
$$
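This result can be double-checked with exact integer arithmetic (`matrix_power` on an integer array involves no rounding):

```python
import numpy as np

A = np.array([[4, -1],
              [2, 1]])                       # integer entries, exact in int64

A10 = np.linalg.matrix_power(A, 10)          # repeated squaring

assert (A10 == np.array([[117074, -58025],
                         [116050, -57001]])).all()
```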

**6.2 Solving Systems of Differential Equations**

System: $$\frac{d\vec{x}}{dt} = A\vec{x}$$

Solution: $$\vec{x}(t) = e^{At}\vec{x}(0)$$

If $$A = PDP^{-1}$$, then:

$$
e^{At} = Pe^{Dt}P^{-1}
$$

where $$e^{Dt} = \begin{bmatrix} e^{\lambda_1 t} & 0 & \cdots & 0 \\ 0 & e^{\lambda_2 t} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & e^{\lambda_n t} \end{bmatrix}$$

**6.3 Quadratic Forms**

A quadratic form: $$Q(\vec{x}) = \vec{x}^TA\vec{x}$$

If $$A$$ is symmetric, diagonalize: $$A = QDQ^T$$

Let $$\vec{y} = Q^T\vec{x}$$, then:

$$
Q(\vec{x}) = \vec{x}^T(QDQ^T)\vec{x} = (Q^T\vec{x})^TD(Q^T\vec{x}) = \vec{y}^TD\vec{y} = \sum_{i=1}^n \lambda_i y_i^2
$$

This is called **principal axes transformation**.
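A numerical check of this identity, using the symmetric matrix from Section 4.2 and an arbitrary test vector:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])              # symmetric, eigenvalues -1 and 3
eigvals, Q = np.linalg.eigh(A)          # A = Q D Q^T

x = np.array([3.0, -1.0])               # arbitrary test vector
y = Q.T @ x                             # principal-axes coordinates

quad_form = x @ A @ x                   # x^T A x
principal = np.sum(eigvals * y**2)      # sum of lambda_i * y_i^2

assert np.isclose(quad_form, principal)
```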

***

#### **7. Properties and Theorems**

**7.1 Similarity Invariants**

If $$A$$ and $$B$$ are similar ($$B = P^{-1}AP$$), then they share:

1. **Same eigenvalues**
2. **Same determinant**
3. **Same trace**
4. **Same characteristic polynomial**
5. **Same minimal polynomial** (for advanced study)
6. **Same rank**
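These invariants are easy to confirm numerically for a random similarity transform (the matrices and seed here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
P = rng.normal(size=(3, 3))            # generically invertible
B = np.linalg.inv(P) @ A @ P           # B is similar to A

assert np.isclose(np.trace(A), np.trace(B))                # same trace
assert np.isclose(np.linalg.det(A), np.linalg.det(B))      # same determinant
assert np.allclose(np.poly(A), np.poly(B))                 # same char. polynomial
```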

**7.2 Geometric vs Algebraic Multiplicity**

* **Algebraic multiplicity:** Number of times $$\lambda$$ appears as root of characteristic polynomial
* **Geometric multiplicity:** Dimension of eigenspace $$E_\lambda$$

**Theorem:** For a diagonalizable matrix:

$$
\text{Geometric multiplicity} = \text{Algebraic multiplicity} \quad \text{for each eigenvalue}
$$

**7.3 Cayley-Hamilton Theorem**

Every square matrix satisfies its own characteristic equation.

If $$p(\lambda) = \det(A - \lambda I)$$ is characteristic polynomial, then:

$$
p(A) = O \quad (\text{zero matrix})
$$

**Example:** For $$A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$$:

Characteristic polynomial: $$\lambda^2 - 5\lambda - 2 = 0$$

Cayley-Hamilton says: $$A^2 - 5A - 2I = O$$

Verify:

$$
A^2 = \begin{bmatrix}
7 & 10 \\
15 & 22
\end{bmatrix}, \quad 5A = \begin{bmatrix}
5 & 10 \\
15 & 20
\end{bmatrix}, \quad 2I = \begin{bmatrix}
2 & 0 \\
0 & 2
\end{bmatrix}
$$

$$
A^2 - 5A - 2I = \begin{bmatrix}
0 & 0 \\
0 & 0
\end{bmatrix}
$$
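The same verification in code, with exact integer arithmetic:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
I = np.eye(2, dtype=int)

# Cayley-Hamilton: A satisfies lambda^2 - 5*lambda - 2 = 0
residual = A @ A - 5 * A - 2 * I

assert (residual == 0).all()
```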

***

#### **8. Special Cases and Examples**

**8.1 Diagonalization of 3×3 Matrix**

Diagonalize $$A = \begin{bmatrix} 2 & 0 & 0 \\ 1 & 2 & 0 \\ 0 & 1 & 3 \end{bmatrix}$$

**Step 1:** Eigenvalues (triangular matrix):

Eigenvalues = diagonal entries: $$\lambda_1 = 2$$, $$\lambda_2 = 2$$, $$\lambda_3 = 3$$

**Step 2:** Eigenvectors:

For $$\lambda = 2$$ (multiplicity 2): Solve $$(A-2I)X = O$$

$$
\begin{bmatrix}
0 & 0 & 0 \\
1 & 0 & 0 \\
0 & 1 & 1
\end{bmatrix}
\begin{bmatrix}
x \\
y \\
z
\end{bmatrix} = \begin{bmatrix}
0 \\
0 \\
0
\end{bmatrix}
$$

Equations: $$x = 0$$, $$y + z = 0 \Rightarrow z = -y$$

Eigenvector: $$\begin{bmatrix} 0 \\ 1 \\ -1 \end{bmatrix}$$ (only one independent eigenvector, despite algebraic multiplicity 2)

For $$\lambda = 3$$: Solve $$(A-3I)X = O$$

$$
\begin{bmatrix}
-1 & 0 & 0 \\
1 & -1 & 0 \\
0 & 1 & 0
\end{bmatrix}
\begin{bmatrix}
x \\
y \\
z
\end{bmatrix} = \begin{bmatrix}
0 \\
0 \\
0
\end{bmatrix}
$$

Equations: $$-x = 0 \Rightarrow x = 0$$, $$x - y = 0 \Rightarrow y = 0$$; $$z$$ is free

Eigenvector: $$\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$$

**Conclusion:** Only 2 independent eigenvectors, so $$A$$ is NOT diagonalizable.

**8.2 Orthogonal Diagonalization Example**

Orthogonally diagonalize $$A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$$

**Step 1:** Eigenvalues:

$$
\det(A - \lambda I) = \begin{vmatrix}
2-\lambda & 1 \\
1 & 2-\lambda
\end{vmatrix} = (2-\lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = 0
$$

$$
(\lambda-1)(\lambda-3) = 0
$$

Eigenvalues: $$\lambda_1 = 1$$, $$\lambda_2 = 3$$

**Step 2:** Eigenvectors:

For $$\lambda = 1$$:

$$
\begin{bmatrix}
1 & 1 \\
1 & 1
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0
\end{bmatrix} \Rightarrow x + y = 0 \Rightarrow y = -x
$$

Eigenvector: $$v_1 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}$$

For $$\lambda = 3$$:

$$
\begin{bmatrix}
-1 & 1 \\
1 & -1
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0
\end{bmatrix} \Rightarrow -x + y = 0 \Rightarrow y = x
$$

Eigenvector: $$v_2 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$$

**Step 3:** Normalize:

$$
|v_1| = \sqrt{2}, \quad q_1 = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ -1 \end{bmatrix}
$$

$$
|v_2| = \sqrt{2}, \quad q_2 = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ 1 \end{bmatrix}
$$

**Step 4:** Form matrices:

$$
Q = \begin{bmatrix}
\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\
-\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}}
\end{bmatrix}, \quad
D = \begin{bmatrix}
1 & 0 \\
0 & 3
\end{bmatrix}
$$

***

#### **9. Criteria for Diagonalization**

**9.1 Necessary and Sufficient Conditions**

An $$n \times n$$ matrix $$A$$ is diagonalizable if and only if:

1. The characteristic polynomial factors completely into linear factors
2. For each eigenvalue $$\lambda$$, algebraic multiplicity = geometric multiplicity

**9.2 Test for Diagonalizability**

1. Find all eigenvalues
2. For each eigenvalue, find dimension of eigenspace (solve $$(A-\lambda I)X = O$$)
3. $$A$$ is diagonalizable exactly when the dimensions of all eigenspaces sum to $$n$$
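This test can be sketched as a function (the name `is_diagonalizable` and the tolerance are ours; eigenvalues are grouped by absolute distance, which is adequate for well-separated spectra but not numerically robust in general):

```python
import numpy as np

def is_diagonalizable(A, tol=1e-8):
    """Check: do the eigenspace dimensions sum to n?"""
    n = A.shape[0]
    eigvals = np.linalg.eigvals(A)               # step 1
    distinct = []                                # group nearly equal eigenvalues
    for lam in eigvals:
        if all(abs(lam - mu) >= tol for mu in distinct):
            distinct.append(lam)
    # step 2: geometric multiplicity = n - rank(A - lam*I)
    dims = [n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)
            for lam in distinct]
    return sum(dims) == n                        # step 3

assert is_diagonalizable(np.array([[4.0, -1.0], [2.0, 1.0]]))     # distinct eigenvalues
assert not is_diagonalizable(np.array([[1.0, 1.0], [0.0, 1.0]]))  # defective
```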

**9.3 Quick Tests**

* **Yes:** $$n$$ distinct eigenvalues
* **Yes:** Symmetric/Hermitian matrix
* **No:** Number of independent eigenvectors < $$n$$
* **Maybe:** Check algebraic = geometric multiplicity for repeated eigenvalues

***

#### **10. Important Theorems**

**10.1 Spectral Theorem**

For the complex case: a normal matrix ($$AA^* = A^*A$$) is unitarily diagonalizable.

For the real case: a symmetric matrix is orthogonally diagonalizable.

**10.2 Schur's Theorem**

Every square matrix is unitarily similar to an upper triangular matrix (the Schur decomposition).

**10.3 Jordan's Theorem**

Every square matrix is similar to a Jordan canonical form (almost diagonal).

***

#### **11. Computational Aspects**

**11.1 Numerical Stability**

Diagonalization algorithms can be numerically unstable for:

* Nearly defective matrices
* Matrices with closely spaced eigenvalues
* Ill-conditioned matrices

**11.2 Algorithms**

1. **QR algorithm:** For finding eigenvalues and eigenvectors
2. **Power method:** For dominant eigenvalue
3. **Jacobi method:** For symmetric matrices

**11.3 Software Tools**

* MATLAB/Octave: `eig(A)`
* Python: `numpy.linalg.eig(A)`
* Mathematica: `Eigenvalues[A]`, `Eigenvectors[A]`

***

#### **12. Practice Problems**

**Problem 1:**

Determine if $$A = \begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}$$ is diagonalizable.

**Solution:** Eigenvalues: $$\lambda = 1$$ (double root)

Solve $$(A-I)X = O$$:

$$
\begin{bmatrix}
0 & 2 \\
0 & 0
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0 \end{bmatrix} \Rightarrow 2y = 0 \Rightarrow y = 0
$$

Only one independent eigenvector: $$\begin{bmatrix} 1 \\ 0 \end{bmatrix}$$

So NOT diagonalizable.

**Problem 2:**

Diagonalize $$A = \begin{bmatrix} 5 & -2 \\ 4 & -1 \end{bmatrix}$$

**Solution:** Characteristic equation:

$$
\begin{vmatrix}
5-\lambda & -2 \\
4 & -1-\lambda
\end{vmatrix} = (5-\lambda)(-1-\lambda) + 8 = \lambda^2 - 4\lambda + 3 = 0
$$

Eigenvalues: $$\lambda = 1, 3$$

For $$\lambda = 1$$:

$$
\begin{bmatrix}
4 & -2 \\
4 & -2
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0
\end{bmatrix} \Rightarrow 4x - 2y = 0 \Rightarrow y = 2x
$$

Eigenvector: $$v_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$$

For $$\lambda = 3$$:

$$
\begin{bmatrix}
2 & -2 \\
4 & -4
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0
\end{bmatrix} \Rightarrow 2x - 2y = 0 \Rightarrow y = x
$$

Eigenvector: $$v_2 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$$

$$
P = \begin{bmatrix}
1 & 1 \\
2 & 1
\end{bmatrix}, \quad
D = \begin{bmatrix}
1 & 0 \\
0 & 3
\end{bmatrix}
$$

***

#### **13. Exam Tips**

**13.1 Common Mistakes**

1. **Assuming diagonalizable** when eigenvalues are repeated
2. **Forgetting to check** linear independence of eigenvectors
3. **Wrong order** in $$P$$ and $$D$$ (eigenvectors in $$P$$ must match eigenvalues in $$D$$)
4. **Not verifying** $$P^{-1}AP = D$$

**13.2 Problem-Solving Strategy**

1. **Step 1:** Find characteristic polynomial and eigenvalues
2. **Step 2:** For each eigenvalue, find eigenvectors
3. **Step 3:** Check if enough independent eigenvectors
4. **Step 4:** Form $$P$$ and $$D$$
5. **Step 5:** Verify if asked

**13.3 Quick Checks**

* **Distinct eigenvalues** ⇒ Diagonalizable
* **Symmetric matrix** ⇒ Orthogonally diagonalizable
* **Number of eigenvectors** < $$n$$ ⇒ Not diagonalizable
* **Algebraic multiplicity** > geometric multiplicity for any eigenvalue ⇒ Not diagonalizable

This theory covers the key aspects of matrix diagonalization with detailed explanations and worked examples, providing thorough preparation for the entrance examination.
