# 2.6 Matrices and Determinants

## Detailed Theory: Matrices and Determinants

### **1. Introduction to Matrices**

#### **1.1 What is a Matrix?**

A matrix is a rectangular array of numbers, symbols, or expressions arranged in rows and columns.

**Example:**

$$
\begin{bmatrix}
1 & 2 & 3 \\
4 & 5 & 6
\end{bmatrix}
$$

This is a $$2 \times 3$$ matrix (2 rows, 3 columns).

#### **1.2 Notation**

* Matrices are usually denoted by capital letters: $$A, B, C, \ldots$$
* The element in the $$i$$-th row and $$j$$-th column is denoted by $$a\_{ij}$$
* A matrix with $$m$$ rows and $$n$$ columns is called an $$m \times n$$ matrix

**General form:**

$$
A = \begin{bmatrix}
a\_{11} & a\_{12} & \cdots & a\_{1n} \\
a\_{21} & a\_{22} & \cdots & a\_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a\_{m1} & a\_{m2} & \cdots & a\_{mn}
\end{bmatrix}
$$

#### **1.3 Types of Matrices**

**a) Row Matrix**

A matrix with only one row.

**Example:**

$$
\begin{bmatrix}
1 & 2 & 3
\end{bmatrix}
$$

($$1 \times 3$$)

**b) Column Matrix**

A matrix with only one column.

**Example:**

$$
\begin{bmatrix}
4 \\
5 \\
6
\end{bmatrix}
$$

($$3 \times 1$$)

**c) Zero Matrix (Null Matrix)**

A matrix with all elements zero. Denoted by $$O$$.

**Example:**

$$
\begin{bmatrix}
0 & 0 \\
0 & 0
\end{bmatrix}
$$

($$2 \times 2$$ zero matrix)

**d) Square Matrix**

A matrix with the same number of rows and columns ($$m = n$$).

**Example:**

$$
\begin{bmatrix}
1 & 2 \\
3 & 4
\end{bmatrix}
$$

($$2 \times 2$$)

**e) Diagonal Matrix**

A square matrix where all non-diagonal elements are zero.

**Example:**

$$
\begin{bmatrix}
2 & 0 & 0 \\
0 & 3 & 0 \\
0 & 0 & 5
\end{bmatrix}
$$

**f) Scalar Matrix**

A diagonal matrix where all diagonal elements are equal.

**Example:**

$$
\begin{bmatrix}
k & 0 & 0 \\
0 & k & 0 \\
0 & 0 & k
\end{bmatrix}
$$

**g) Identity Matrix**

A scalar matrix with diagonal elements = 1. Denoted by $$I$$ or $$I\_n$$.

**Examples:**

$$
I\_2 = \begin{bmatrix}
1 & 0 \\
0 & 1
\end{bmatrix}
$$

$$
I\_3 = \begin{bmatrix}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 0 & 1
\end{bmatrix}
$$

**h) Upper Triangular Matrix**

A square matrix where all elements below the main diagonal are zero.

**Example:**

$$
\begin{bmatrix}
1 & 2 & 3 \\
0 & 4 & 5 \\
0 & 0 & 6
\end{bmatrix}
$$

**i) Lower Triangular Matrix**

A square matrix where all elements above the main diagonal are zero.

**Example:**

$$
\begin{bmatrix}
1 & 0 & 0 \\
2 & 3 & 0 \\
4 & 5 & 6
\end{bmatrix}
$$

**j) Symmetric Matrix**

A square matrix that equals its transpose: $$A^T = A$$

**Example:**

$$
\begin{bmatrix}
1 & 2 & 3 \\
2 & 4 & 5 \\
3 & 5 & 6
\end{bmatrix}
$$

**k) Skew-Symmetric Matrix**

A square matrix that equals the negative of its transpose: $$A^T = -A$$. Its diagonal elements must all be zero.

**Example:**

$$
\begin{bmatrix}
0 & 2 & -3 \\
-2 & 0 & 4 \\
3 & -4 & 0
\end{bmatrix}
$$

***

### **2. Matrix Operations**

#### **2.1 Equality of Matrices**

Two matrices $$A$$ and $$B$$ are equal if:

1. They have the same dimensions
2. Corresponding elements are equal

$$
A = B \iff a\_{ij} = b\_{ij} \quad \text{for all } i, j
$$

#### **2.2 Addition of Matrices**

Matrices of the same dimensions are added element-wise.

If $$A = \[a\_{ij}]$$ and $$B = \[b\_{ij}]$$ are both $$m \times n$$, then:

$$
A + B = \[a\_{ij} + b\_{ij}]
$$

**Properties:**

1. **Commutative:** $$A + B = B + A$$
2. **Associative:** $$(A + B) + C = A + (B + C)$$
3. **Additive Identity:** $$A + O = A$$
4. **Additive Inverse:** $$A + (-A) = O$$

#### **2.3 Subtraction of Matrices**

$$
A - B = A + (-B) = \[a\_{ij} - b\_{ij}]
$$

#### **2.4 Scalar Multiplication**

If $$k$$ is a scalar and $$A = \[a\_{ij}]$$, then:

$$
kA = \[k \cdot a\_{ij}]
$$

**Properties:**

1. $$k(A + B) = kA + kB$$
2. $$(k + l)A = kA + lA$$
3. $$k(lA) = (kl)A$$
4. $$1 \cdot A = A$$

#### **2.5 Matrix Multiplication**

**a) Condition for Multiplication**

The product $$AB$$ of an $$m \times n$$ matrix $$A$$ and a $$p \times q$$ matrix $$B$$ is defined only if:

$$
n = p
$$

The product $$AB$$ will have dimensions $$m \times q$$.

**b) Multiplication Process**

If $$A = \[a\_{ij}]$$ is $$m \times n$$ and $$B = \[b\_{jk}]$$ is $$n \times p$$, then:

$$
C = AB \quad \text{where} \quad c\_{ik} = \sum\_{j=1}^{n} a\_{ij} b\_{jk}
$$

Element $$c\_{ik}$$ is the dot product of the $$i$$-th row of $$A$$ and the $$k$$-th column of $$B$$.

**Example:**

$$
A = \begin{bmatrix}
1 & 2 \\
3 & 4
\end{bmatrix}, \quad
B = \begin{bmatrix}
5 & 6 \\
7 & 8
\end{bmatrix}
$$

$$
AB = \begin{bmatrix}
(1\times5 + 2\times7) & (1\times6 + 2\times8) \\
(3\times5 + 4\times7) & (3\times6 + 4\times8)
\end{bmatrix}
$$

$$
AB = \begin{bmatrix}
19 & 22 \\
43 & 50
\end{bmatrix}
$$
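The row-by-column rule above can be sketched in plain Python, with matrices represented as lists of rows (`mat_mul` is an illustrative helper name, not a library function):

```python
def mat_mul(A, B):
    """Multiply an m x n matrix A by an n x p matrix B (lists of rows)."""
    n = len(B)
    assert all(len(row) == n for row in A), "inner dimensions must match"
    p = len(B[0])
    # c_ik = sum over j of a_ij * b_jk  (i-th row of A dotted with k-th column of B)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(p)]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_mul(A, B))  # [[19, 22], [43, 50]]
```

Swapping the arguments reproduces the non-commutativity noted below: `mat_mul(B, A)` gives a different matrix.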

**c) Properties of Matrix Multiplication**

1. **Not commutative:** $$AB \neq BA$$ in general
2. **Associative:** $$A(BC) = (AB)C$$
3. **Distributive:** $$A(B + C) = AB + AC$$
4. **Multiplicative Identity:** $$AI = IA = A$$
5. **Multiplication with zero matrix:** $$AO = OA = O$$

#### **2.6 Transpose of a Matrix**

The transpose of matrix $$A$$, denoted $$A^T$$ or $$A'$$, is obtained by interchanging rows and columns.

If $$A = \[a\_{ij}]$$ is $$m \times n$$, then $$A^T = \[a\_{ji}]$$ is $$n \times m$$.

**Properties:**

1. $$(A^T)^T = A$$
2. $$(A + B)^T = A^T + B^T$$
3. $$(kA)^T = kA^T$$
4. $$(AB)^T = B^T A^T$$

#### **2.7 Trace of a Matrix**

For a square matrix $$A$$, the trace is the sum of its diagonal elements:

$$
\text{tr}(A) = \sum\_{i=1}^{n} a\_{ii}
$$

**Properties:**

1. $$\text{tr}(A + B) = \text{tr}(A) + \text{tr}(B)$$
2. $$\text{tr}(kA) = k \cdot \text{tr}(A)$$
3. $$\text{tr}(AB) = \text{tr}(BA)$$
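Property 3 is the least obvious of the three; a quick numerical sketch checks it on a concrete pair (helper names `trace` and `mat_mul` are illustrative):

```python
def trace(M):
    # Sum of the diagonal entries of a square matrix (list of rows)
    return sum(M[i][i] for i in range(len(M)))

def mat_mul(A, B):
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 5], [6, 7]]
# tr(AB) and tr(BA) agree even though AB != BA
print(trace(mat_mul(A, B)), trace(mat_mul(B, A)))  # 55 55
```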

***

### **3. Determinants**

#### **3.1 Definition**

The determinant is a scalar value associated with every square matrix.

**Notation:** $$\det(A)$$ or $$|A|$$

#### **3.2 Determinant of 2×2 Matrix**

For $$A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$$:

$$
\det(A) = ad - bc
$$

#### **3.3 Determinant of 3×3 Matrix**

For $$A = \begin{bmatrix} a\_{11} & a\_{12} & a\_{13} \\ a\_{21} & a\_{22} & a\_{23} \\ a\_{31} & a\_{32} & a\_{33} \end{bmatrix}$$:

$$
\det(A) = a\_{11}(a\_{22}a\_{33} - a\_{23}a\_{32}) - a\_{12}(a\_{21}a\_{33} - a\_{23}a\_{31}) + a\_{13}(a\_{21}a\_{32} - a\_{22}a\_{31})
$$

This can be remembered as **Sarrus' Rule**:

Write first two columns again to the right:

$$
\begin{vmatrix}
a\_{11} & a\_{12} & a\_{13} \\
a\_{21} & a\_{22} & a\_{23} \\
a\_{31} & a\_{32} & a\_{33}
\end{vmatrix}
$$

The determinant is the sum of the products along the three left-to-right diagonals minus the sum of the products along the three right-to-left diagonals.

#### **3.4 Minors and Cofactors**

**a) Minor**

The minor $$M\_{ij}$$ of element $$a\_{ij}$$ is the determinant of the submatrix obtained by deleting the $$i$$-th row and $$j$$-th column.

**b) Cofactor**

The cofactor $$C\_{ij}$$ of element $$a\_{ij}$$ is:

$$
C\_{ij} = (-1)^{i+j} M\_{ij}
$$

**Example:** For $$A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}$$:

Minor $$M\_{11}$$ = determinant of $$\begin{bmatrix} 5 & 6 \\ 8 & 9 \end{bmatrix} = 5\times9 - 6\times8 = 45 - 48 = -3$$

Cofactor $$C\_{11} = (-1)^{1+1} M\_{11} = (-1)^2 \times (-3) = -3$$

#### **3.5 Expansion by Cofactors**

The determinant can be computed by cofactor expansion along any row or column:

**Along** $$i$$**-th row:**

$$
\det(A) = \sum\_{j=1}^{n} a\_{ij} C\_{ij}
$$

**Along** $$j$$**-th column:**

$$
\det(A) = \sum\_{i=1}^{n} a\_{ij} C\_{ij}
$$
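The first-row expansion can be implemented recursively; this is a sketch suitable for small matrices only, since the recursion costs $$O(n!)$$ (the names `det` and `minor` are illustrative):

```python
def minor(M, i, j):
    # Submatrix with row i and column j deleted
    return [row[:j] + row[j+1:] for r, row in enumerate(M) if r != i]

def det(M):
    # Cofactor expansion along the first row: sum of (-1)^j * a_0j * M_0j
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det(minor(M, 0, j))
               for j in range(len(M)))

print(det([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))  # 0 (rows are linearly dependent)
```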

#### **3.6 Properties of Determinants**

**a) Basic Properties**

1. $$\det(I) = 1$$
2. $$\det(O) = 0$$
3. $$\det(A^T) = \det(A)$$
4. If two rows (or columns) are identical, $$\det(A) = 0$$
5. If a row (or column) has all zeros, $$\det(A) = 0$$

**b) Row/Column Operations**

Let $$A$$ be an $$n \times n$$ matrix.

1. **Row interchange:** Swapping two rows changes sign of determinant
2. **Scalar multiplication:** Multiplying a row by $$k$$ multiplies determinant by $$k$$
3. **Row addition:** Adding a multiple of one row to another doesn't change determinant

**c) Multiplication Property**

$$
\det(AB) = \det(A) \cdot \det(B)
$$

**d) Inverse Property**

If $$A$$ is invertible,

$$
\det(A^{-1}) = \frac{1}{\det(A)}
$$

#### **3.7 Special Determinants**

**a) Diagonal Matrix**

Determinant = product of diagonal elements.

$$
\det\begin{bmatrix}
a & 0 & 0 \\
0 & b & 0 \\
0 & 0 & c
\end{bmatrix} = abc
$$

**b) Triangular Matrix**

Determinant = product of diagonal elements.

**c) Vandermonde Determinant**

$$
\begin{vmatrix}
1 & 1 & 1 \\
x & y & z \\
x^2 & y^2 & z^2
\end{vmatrix} = (y-x)(z-x)(z-y)
$$

***

### **4. Inverse of a Matrix**

#### **4.1 Definition**

A square matrix $$A$$ is invertible if there exists a matrix $$B$$ such that:

$$
AB = BA = I
$$

$$B$$ is called the inverse of $$A$$, denoted $$A^{-1}$$.

**Note:** Only square matrices can be invertible, but not all square matrices are invertible.

#### **4.2 Condition for Invertibility**

A square matrix $$A$$ is invertible if and only if:

$$
\det(A) \neq 0
$$

If $$\det(A) = 0$$, $$A$$ is called **singular** or **non-invertible**.

#### **4.3 Finding Inverse**

**a) Formula for 2×2 Matrix**

For $$A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$$:

If $$\det(A) = ad - bc \neq 0$$, then:

$$
A^{-1} = \frac{1}{ad-bc} \begin{bmatrix}
d & -b \\
-c & a
\end{bmatrix}
$$

**Example:** $$A = \begin{bmatrix} 2 & 3 \\ 1 & 4 \end{bmatrix}$$

$$
\det(A) = 2\times4 - 3\times1 = 8 - 3 = 5 \neq 0
$$

$$
A^{-1} = \frac{1}{5} \begin{bmatrix}
4 & -3 \\
-1 & 2
\end{bmatrix} = \begin{bmatrix}
\frac{4}{5} & -\frac{3}{5} \\
-\frac{1}{5} & \frac{2}{5}
\end{bmatrix}
$$
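The 2×2 formula can be checked on the example above with a short sketch; exact fractions avoid floating-point noise (`inv2` is an illustrative name):

```python
from fractions import Fraction

def inv2(A):
    # 2x2 inverse: swap a and d, negate b and c, divide by the determinant
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return [[Fraction(d, det), Fraction(-b, det)],
            [Fraction(-c, det), Fraction(a, det)]]

print(inv2([[2, 3], [1, 4]]))
# [[Fraction(4, 5), Fraction(-3, 5)], [Fraction(-1, 5), Fraction(2, 5)]]
```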

**b) Using Adjoint (for n×n)**

The adjoint of $$A$$, denoted $$\text{adj}(A)$$, is the transpose of the cofactor matrix.

$$
A^{-1} = \frac{1}{\det(A)} \text{adj}(A)
$$

**Steps:**

1. Find cofactor matrix $$C$$ where $$C\_{ij} = (-1)^{i+j} M\_{ij}$$
2. Transpose to get adjoint: $$\text{adj}(A) = C^T$$
3. Divide by $$\det(A)$$

**c) Using Elementary Row Operations (Gauss-Jordan)**

Augment $$A$$ with $$I$$: $$[A \mid I]$$

Apply elementary row operations to transform the left side $$A$$ into $$I$$.

The right side then becomes $$A^{-1}$$: $$[I \mid A^{-1}]$$
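The Gauss-Jordan procedure can be sketched directly; this minimal implementation assumes the matrix is invertible (a failed pivot search raises an exception) and uses exact rational arithmetic:

```python
from fractions import Fraction

def gauss_jordan_inverse(A):
    """Invert a square matrix by row-reducing [A | I] to [I | A^-1]."""
    n = len(A)
    # Build the augmented matrix [A | I]
    M = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        # Find a row with a non-zero entry in this column (fails if singular)
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]
        # Scale the pivot row so the pivot entry becomes 1
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        # Eliminate this column from every other row
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    # The right half of the augmented matrix is now the inverse
    return [row[n:] for row in M]

print(gauss_jordan_inverse([[2, 3], [1, 4]]))
```

On the worked example $$A = \begin{bmatrix} 2 & 3 \\ 1 & 4 \end{bmatrix}$$ this reproduces the inverse found by the adjoint formula.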

#### **4.4 Properties of Inverse**

1. $$(A^{-1})^{-1} = A$$
2. $$(AB)^{-1} = B^{-1} A^{-1}$$
3. $$(A^T)^{-1} = (A^{-1})^T$$
4. $$\det(A^{-1}) = \frac{1}{\det(A)}$$
5. $$(kA)^{-1} = \frac{1}{k} A^{-1}$$ for $$k \neq 0$$

***

### **5. Systems of Linear Equations**

#### **5.1 Matrix Representation**

A system of $$m$$ linear equations in $$n$$ variables:

$$
a\_{11}x\_1 + a\_{12}x\_2 + \cdots + a\_{1n}x\_n = b\_1
$$

$$
a\_{21}x\_1 + a\_{22}x\_2 + \cdots + a\_{2n}x\_n = b\_2
$$

$$\vdots$$

$$
a\_{m1}x\_1 + a\_{m2}x\_2 + \cdots + a\_{mn}x\_n = b\_m
$$

Can be written as: $$AX = B$$

where

$$
A = \begin{bmatrix}
a\_{11} & \cdots & a\_{1n} \\
\vdots & \ddots & \vdots \\
a\_{m1} & \cdots & a\_{mn}
\end{bmatrix},
\quad
X = \begin{bmatrix}
x\_1 \\
\vdots \\
x\_n
\end{bmatrix},
\quad
B = \begin{bmatrix}
b\_1 \\
\vdots \\
b\_m
\end{bmatrix}
$$

#### **5.2 Solution Methods**

**a) Using Inverse (for n×n systems)**

If $$A$$ is square and invertible, solution is:

$$
X = A^{-1}B
$$

**b) Using Cramer's Rule**

For system $$AX = B$$ where $$A$$ is $$n \times n$$ and $$\det(A) \neq 0$$:

$$
x\_i = \frac{\det(A\_i)}{\det(A)}
$$

where $$A\_i$$ is the matrix obtained by replacing the $$i$$-th column of $$A$$ with $$B$$.

**Example:** Solve:

$$
2x + 3y = 8
$$

$$
x + 4y = 6
$$

In matrix form:

$$
\begin{bmatrix}
2 & 3 \\
1 & 4
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
8 \\
6
\end{bmatrix}
$$

$$
\det(A) = 2\times4 - 3\times1 = 8 - 3 = 5
$$

$$
x = \frac{\begin{vmatrix}
8 & 3 \\
6 & 4
\end{vmatrix}}{\det(A)} = \frac{32 - 18}{5} = \frac{14}{5}
$$

$$
y = \frac{\begin{vmatrix}
2 & 8 \\
1 & 6
\end{vmatrix}}{\det(A)} = \frac{12 - 8}{5} = \frac{4}{5}
$$
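Cramer's rule for the 2×2 example above can be sketched as follows (`det2` and `cramer2` are illustrative names; exact fractions keep the answers as $$\frac{14}{5}$$ and $$\frac{4}{5}$$):

```python
from fractions import Fraction

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def cramer2(A, b):
    # Replace each column of A by b in turn and take determinant ratios
    d = det2(A)
    Ax = [[b[0], A[0][1]], [b[1], A[1][1]]]  # b replaces column 1
    Ay = [[A[0][0], b[0]], [A[1][0], b[1]]]  # b replaces column 2
    return Fraction(det2(Ax), d), Fraction(det2(Ay), d)

x, y = cramer2([[2, 3], [1, 4]], [8, 6])
print(x, y)  # 14/5 4/5
```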

#### **5.3 Consistency of Systems**

For $$AX = B$$:

1. **Consistent:** Has at least one solution
2. **Inconsistent:** Has no solution

**Rank Method:** Let $$[A \mid B]$$ be the augmented matrix.

System is:

* **Consistent** if $$\text{rank}(A) = \text{rank}([A \mid B])$$
* **Inconsistent** if $$\text{rank}(A) \neq \text{rank}([A \mid B])$$

If consistent:

* **Unique solution** if $$\text{rank}(A) = n$$ (the number of variables)
* **Infinitely many solutions** if $$\text{rank}(A) < n$$

#### **5.4 Homogeneous Systems**

System $$AX = O$$ (all $$b\_i = 0$$)

**Properties:**

1. Always consistent (the trivial solution $$X = O$$ exists)
2. For square $$A$$: has non-trivial solutions if and only if $$\det(A) = 0$$
3. If a non-trivial solution exists, there are infinitely many solutions

***

### **6. Eigenvalues and Eigenvectors**

#### **6.1 Definition**

For a square matrix $$A$$, a non-zero vector $$X$$ is an eigenvector if:

$$
AX = \lambda X \quad \text{for some scalar } \lambda
$$

$$\lambda$$ is called the eigenvalue corresponding to eigenvector $$X$$.

#### **6.2 Finding Eigenvalues**

The equation $$AX = \lambda X$$ can be written as:

$$
(A - \lambda I)X = O
$$

For non-zero solution $$X$$, we need:

$$
\det(A - \lambda I) = 0
$$

This is called the **characteristic equation**.

#### **6.3 Steps to Find Eigenvalues and Eigenvectors**

1. Form $$A - \lambda I$$
2. Set $$\det(A - \lambda I) = 0$$, solve for $$\lambda$$ (eigenvalues)
3. For each $$\lambda$$, solve $$(A - \lambda I)X = O$$ to find eigenvectors

**Example:** Find eigenvalues and eigenvectors of $$A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$$

**Step 1:**

$$
A - \lambda I = \begin{bmatrix}
2-\lambda & 1 \\
1 & 2-\lambda
\end{bmatrix}
$$

**Step 2:**

$$
\det(A - \lambda I) = (2-\lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = 0
$$

$$
(\lambda-1)(\lambda-3) = 0
$$

Eigenvalues: $$\lambda\_1 = 1$$, $$\lambda\_2 = 3$$

**Step 3:** For $$\lambda\_1 = 1$$:

Solve

$$
\begin{bmatrix}
1 & 1 \\
1 & 1
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0
\end{bmatrix}
$$

Equation: $$x + y = 0 \Rightarrow y = -x$$

Eigenvector: $$\begin{bmatrix} 1 \\ -1 \end{bmatrix}$$ or any non-zero multiple

For $$\lambda\_2 = 3$$:

Solve

$$
\begin{bmatrix}
-1 & 1 \\
1 & -1
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
0 \\
0
\end{bmatrix}
$$

Equation: $$-x + y = 0 \Rightarrow y = x$$

Eigenvector: $$\begin{bmatrix} 1 \\ 1 \end{bmatrix}$$ or any non-zero multiple
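The two eigenpairs found above can be verified directly from the definition, by checking $$AX = \lambda X$$ componentwise (`mat_vec` is an illustrative helper):

```python
def mat_vec(A, v):
    # Multiply matrix A (list of rows) by column vector v
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 1], [1, 2]]
for lam, v in [(1, [1, -1]), (3, [1, 1])]:
    # A v must equal lam * v for each eigenpair
    assert mat_vec(A, v) == [lam * x for x in v]
print("both eigenpairs verified")
```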

#### **6.4 Properties**

1. Sum of eigenvalues = trace of $$A$$
2. Product of eigenvalues = determinant of $$A$$
3. Eigenvalues of diagonal/triangular matrix = diagonal elements
4. If $$\lambda$$ is an eigenvalue of $$A$$, then $$p(\lambda)$$ is an eigenvalue of $$p(A)$$ for any polynomial $$p$$

***

### **7. Rank of a Matrix**

#### **7.1 Definition**

The rank of a matrix is the maximum number of linearly independent rows (or columns).

**Notation:** $$\text{rank}(A)$$

#### **7.2 Finding Rank**

**a) Using Row Echelon Form**

Transform matrix to row echelon form using elementary row operations.

Rank = number of non-zero rows.

**b) Using Determinants**

The rank is the order of the largest square submatrix with a non-zero determinant.
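The row-echelon method can be sketched as forward elimination that counts pivot rows (`rank` is an illustrative name; exact fractions avoid rounding issues):

```python
from fractions import Fraction

def rank(A):
    """Rank = number of non-zero rows after forward elimination."""
    M = [[Fraction(x) for x in row] for row in A]
    rows, cols = len(M), len(M[0])
    r = 0  # index of the next pivot row
    for c in range(cols):
        # Look for a pivot in column c, at or below row r
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue  # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]
        # Clear the entries below the pivot
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

print(rank([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))  # 2
```

The example output matches the determinant computed earlier: that 3×3 matrix has determinant 0, so its rank is below 3.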

#### **7.3 Properties**

1. $$\text{rank}(A) = \text{rank}(A^T)$$
2. $$\text{rank}(A) \leq \min(m, n)$$ for $$m \times n$$ matrix
3. If $$\text{rank}(A) = n$$ for $$n \times n$$ matrix, then $$A$$ is invertible
4. $$\text{rank}(AB) \leq \min(\text{rank}(A), \text{rank}(B))$$

***

### **8. Special Matrices and Properties**

#### **8.1 Orthogonal Matrix**

A square matrix $$A$$ is orthogonal if:

$$
A^T A = AA^T = I
$$

Equivalently: $$A^T = A^{-1}$$

**Properties:**

1. Columns (and rows) are orthonormal vectors
2. $$\det(A) = \pm 1$$
3. Preserves length: $$|AX| = |X|$$
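Orthogonality is easy to check from the definition; a minimal sketch using the integer 90° rotation matrix (helper names `transpose` and `mat_mul` are illustrative):

```python
def transpose(M):
    # Interchange rows and columns
    return [list(col) for col in zip(*M)]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# A 90-degree rotation matrix: orthogonal, with determinant +1
R = [[0, -1], [1, 0]]
I = [[1, 0], [0, 1]]
assert mat_mul(transpose(R), R) == I and mat_mul(R, transpose(R)) == I
print("R is orthogonal")
```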

#### **8.2 Idempotent Matrix**

A square matrix $$A$$ is idempotent if:

$$
A^2 = A
$$

**Example:**

$$
\begin{bmatrix}
1 & 0 \\
0 & 0
\end{bmatrix}
$$

#### **8.3 Nilpotent Matrix**

A square matrix $$A$$ is nilpotent if:

$$
A^k = O \quad \text{for some positive integer } k
$$

**Example:**

$$
\begin{bmatrix}
0 & 1 \\
0 & 0
\end{bmatrix}
$$

($$A^2 = O$$)

#### **8.4 Involutory Matrix**

A square matrix $$A$$ is involutory if:

$$
A^2 = I
$$

Equivalently: $$A^{-1} = A$$

**Example:**

$$
\begin{bmatrix}
1 & 0 \\
0 & -1
\end{bmatrix}
$$
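The three defining identities ($$A^2 = A$$, $$A^k = O$$, $$A^2 = I$$) can be checked on the example matrices above with one short sketch:

```python
def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

P = [[1, 0], [0, 0]]   # idempotent: P^2 = P
N = [[0, 1], [0, 0]]   # nilpotent:  N^2 = O
J = [[1, 0], [0, -1]]  # involutory: J^2 = I

assert mat_mul(P, P) == P
assert mat_mul(N, N) == [[0, 0], [0, 0]]
assert mat_mul(J, J) == [[1, 0], [0, 1]]
print("all three defining identities hold")
```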

***

### **9. Matrix Equations**

#### **9.1 Solving** $$AX = B$$

If $$A$$ is invertible: $$X = A^{-1}B$$

#### **9.2 Solving** $$XA = B$$

If $$A$$ is invertible: $$X = BA^{-1}$$

#### **9.3 Sylvester Equation**

$$AX + XB = C$$

Its solution can be expressed using the Kronecker product.

#### **9.4 Lyapunov Equation**

$$A^TX + XA = -Q$$

Important in control theory.

***

### **10. Applications**

#### **10.1 Computer Graphics**

Matrices used for transformations:

* Translation
* Rotation
* Scaling
* Shearing

#### **10.2 Cryptography**

Matrices used in encryption algorithms.

#### **10.3 Economics**

Input-output analysis using Leontief models.

#### **10.4 Physics**

Quantum mechanics, rotation matrices, stress tensors.

#### **10.5 Statistics**

Covariance matrices, multivariate analysis.

***

### **11. Solved Examples**

#### **Example 1:** Matrix Multiplication

If $$A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$$ and $$B = \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}$$, find $$AB$$ and $$BA$$.

**Solution:**

$$
AB = \begin{bmatrix}
1\times5 + 2\times7 & 1\times6 + 2\times8 \\
3\times5 + 4\times7 & 3\times6 + 4\times8
\end{bmatrix}
$$

$$
AB = \begin{bmatrix}
19 & 22 \\
43 & 50
\end{bmatrix}
$$

$$
BA = \begin{bmatrix}
5\times1 + 6\times3 & 5\times2 + 6\times4 \\
7\times1 + 8\times3 & 7\times2 + 8\times4
\end{bmatrix}
$$

$$
BA = \begin{bmatrix}
23 & 34 \\
31 & 46
\end{bmatrix}
$$

Note: $$AB \neq BA$$

#### **Example 2:** Determinant Calculation

Find

$$
\det\begin{bmatrix}
1 & 2 & 3 \\
4 & 5 & 6 \\
7 & 8 & 9
\end{bmatrix}
$$

**Solution:**

Using Sarrus' rule:

Sum of left-to-right diagonals:

$$
(1\times5\times9) + (2\times6\times7) + (3\times4\times8) = 45 + 84 + 96 = 225
$$

Sum of right-to-left diagonals:

$$
(3\times5\times7) + (1\times6\times8) + (2\times4\times9) = 105 + 48 + 72 = 225
$$

Determinant = $$225 - 225 = 0$$

#### **Example 3:** Inverse Calculation

Find the inverse of $$A = \begin{bmatrix} 2 & 5 \\ 1 & 3 \end{bmatrix}$$

**Solution:**

$$
\det(A) = 2\times3 - 5\times1 = 6 - 5 = 1
$$

$$
A^{-1} = \frac{1}{1} \begin{bmatrix}
3 & -5 \\
-1 & 2
\end{bmatrix} = \begin{bmatrix}
3 & -5 \\
-1 & 2
\end{bmatrix}
$$

#### **Example 4:** System of Equations

Solve using matrices:

$$
2x + 3y = 11
$$

$$
x + 2y = 6
$$

**Solution:**

Matrix form:

$$
\begin{bmatrix}
2 & 3 \\
1 & 2
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
11 \\
6
\end{bmatrix}
$$

$$
\det(A) = 4 - 3 = 1 \neq 0
$$

$$
A^{-1} = \begin{bmatrix}
2 & -3 \\
-1 & 2
\end{bmatrix}
$$

$$
\begin{bmatrix}
x \\
y
\end{bmatrix} = \begin{bmatrix}
2 & -3 \\
-1 & 2
\end{bmatrix}
\begin{bmatrix}
11 \\
6
\end{bmatrix} = \begin{bmatrix}
22-18 \\
-11+12
\end{bmatrix} = \begin{bmatrix}
4 \\
1
\end{bmatrix}
$$

So $$x=4$$, $$y=1$$

***

### **12. Important Formulas Summary**

#### **12.1 Determinants**

* **2×2:**

$$
\begin{vmatrix}
a & b \\
c & d
\end{vmatrix} = ad - bc
$$

* **3×3:** Use Sarrus' rule or cofactor expansion

#### **12.2 Inverse**

* **2×2:**

$$
\begin{bmatrix}
a & b \\
c & d
\end{bmatrix}^{-1} = \frac{1}{ad-bc}\begin{bmatrix}
d & -b \\
-c & a
\end{bmatrix}
$$

* **General:** $$A^{-1} = \frac{1}{\det(A)}\text{adj}(A)$$

#### **12.3 Eigenvalues**

Solve: $$\det(A - \lambda I) = 0$$

#### **12.4 Trace**

$$
\text{tr}(A) = \sum a\_{ii}
$$

***

### **13. Exam Tips and Common Mistakes**

#### **13.1 Common Mistakes**

1. **Matrix multiplication:** Not checking dimensions compatibility
2. **Inverse:** Forgetting to check $$\det(A) \neq 0$$ first
3. **Determinant:** Incorrect sign in cofactor expansion
4. **Eigenvectors:** Forgetting eigenvectors are defined up to scalar multiple
5. **Transpose:** $$(AB)^T = B^TA^T$$ not $$A^TB^T$$

#### **13.2 Problem-Solving Strategy**

1. **Identify matrix type and dimensions**
2. **Choose appropriate method** (inverse, determinant, row operations)
3. **Show all steps clearly**
4. **Check answer** when possible (e.g., verify $$AA^{-1} = I$$)

#### **13.3 Quick Checks**

1. **Square matrix needed** for inverse, determinant, eigenvalues
2. $$\det(A) \neq 0$$ for invertibility
3. **Dimensions must match** for matrix operations
4. **Eigenvectors** are never zero vectors

This comprehensive theory covers all aspects of matrices and determinants with detailed explanations and examples, providing complete preparation for the entrance examination.
