# 2.6 MCQs-Matrices and Determinants

## MCQs

***

### Matrices and Determinants

#### Basic Concepts and Definitions

1\. A matrix is defined as:

1. A rectangular array of numbers arranged in rows and columns
2. A single column of numbers
3. A square table of numbers
4. A collection of mathematical functions

<details>

<summary>Show me the answer</summary>

**Answer:** 1. A rectangular array of numbers arranged in rows and columns

**Explanation:** A matrix is a rectangular arrangement of numbers (called elements or entries) organized in rows and columns. The size of a matrix is described by its order: $$m \times n$$, where $$m$$ is the number of rows and $$n$$ is the number of columns.

Example: A $$2 \times 3$$ matrix: $$\begin{pmatrix} 1 & 2 & 3 \ 4 & 5 & 6 \end{pmatrix}$$ (2 rows, 3 columns).

</details>

2\. The element $$a\_{23}$$ in matrix $$A$$ refers to:

1. Element in row 2, column 3
2. Element in row 3, column 2
3. The sum of elements in row 2 and column 3
4. The product of elements in row 2 and column 3

<details>

<summary>Show me the answer</summary>

**Answer:** 1. Element in row 2, column 3

**Explanation:** In matrix notation, $$a\_{ij}$$ represents the element in the $$i$$-th row and $$j$$-th column. Therefore, $$a\_{23}$$ is the element in the second row and third column.

Example: If $$A = \begin{pmatrix} 1 & 2 & 3 \ 4 & 5 & 6 \end{pmatrix}$$, then $$a\_{23} = 6$$.

</details>

3\. A square matrix is one where:

1. All elements are equal
2. Number of rows equals number of columns
3. Number of rows is greater than number of columns
4. Number of columns is greater than number of rows

<details>

<summary>Show me the answer</summary>

**Answer:** 2. Number of rows equals number of columns

**Explanation:** A square matrix has the same number of rows and columns ($$m = n$$). The order of a square matrix is typically written as $$n \times n$$ or simply as order $$n$$.

Example: A $$3 \times 3$$ square matrix: $$\begin{pmatrix} 1 & 2 & 3 \ 4 & 5 & 6 \ 7 & 8 & 9 \end{pmatrix}$$

</details>

4\. The identity matrix of order 3 is:

1. $$\begin{pmatrix} 0 & 0 & 0 \ 0 & 0 & 0 \ 0 & 0 & 0 \end{pmatrix}$$
2. $$\begin{pmatrix} 1 & 1 & 1 \ 1 & 1 & 1 \ 1 & 1 & 1 \end{pmatrix}$$
3. $$\begin{pmatrix} 1 & 0 & 0 \ 0 & 1 & 0 \ 0 & 0 & 1 \end{pmatrix}$$
4. $$\begin{pmatrix} 0 & 1 & 0 \ 1 & 0 & 1 \ 0 & 1 & 0 \end{pmatrix}$$

<details>

<summary>Show me the answer</summary>

**Answer:** 3. $$\begin{pmatrix} 1 & 0 & 0 \ 0 & 1 & 0 \ 0 & 0 & 1 \end{pmatrix}$$

**Explanation:** The identity matrix, denoted by $$I\_n$$, is a square matrix with 1's on the main diagonal and 0's elsewhere. For any square matrix $$A$$ of the same order, $$AI = IA = A$$.

The general form of an $$n \times n$$ identity matrix is: $$I\_n = \begin{pmatrix} 1 & 0 & \cdots & 0 \ 0 & 1 & \cdots & 0 \ \vdots & \vdots & \ddots & \vdots \ 0 & 0 & \cdots & 1 \end{pmatrix}$$

</details>

#### Types of Matrices

5\. A diagonal matrix is:

1. A matrix with non-zero elements only on the main diagonal
2. A matrix with all elements equal
3. A matrix with zeros on the main diagonal
4. A matrix with all elements non-zero

<details>

<summary>Show me the answer</summary>

**Answer:** 1. A matrix with non-zero elements only on the main diagonal

**Explanation:** A diagonal matrix is a square matrix where all elements outside the main diagonal are zero. The elements on the main diagonal can be zero or non-zero.

Example of a $$3 \times 3$$ diagonal matrix: $$D = \begin{pmatrix} 2 & 0 & 0 \ 0 & 5 & 0 \ 0 & 0 & 0 \end{pmatrix}$$ (note that a diagonal entry may itself be zero).

</details>

6\. Which matrix is both symmetric and skew-symmetric?

1. Identity matrix
2. Zero matrix
3. Diagonal matrix
4. No such matrix exists

<details>

<summary>Show me the answer</summary>

**Answer:** 2. Zero matrix

**Explanation:**

* A symmetric matrix satisfies $$A^T = A$$.
* A skew-symmetric matrix satisfies $$A^T = -A$$.
* For a matrix to be both symmetric and skew-symmetric, we must have $$A = A^T$$ and $$A = -A^T$$, which implies $$A = -A$$, so $$2A = 0$$, thus $$A = 0$$.
* Therefore, only the zero matrix satisfies both conditions.

</details>

7\. A matrix $$A$$ is symmetric if:

1. $$A = -A^T$$
2. $$A = A^T$$
3. $$A = A^{-1}$$
4. $$A = -A$$

<details>

<summary>Show me the answer</summary>

**Answer:** 2. $$A = A^T$$

**Explanation:** A symmetric matrix is equal to its transpose. This means that for all $$i$$ and $$j$$, $$a\_{ij} = a\_{ji}$$. Symmetric matrices are always square.

Example of a symmetric matrix: $$S = \begin{pmatrix} 1 & 2 & 3 \ 2 & 4 & 5 \ 3 & 5 & 6 \end{pmatrix}$$

Note that $$s\_{12} = s\_{21} = 2$$, $$s\_{13} = s\_{31} = 3$$, and $$s\_{23} = s\_{32} = 5$$.

</details>

#### Matrix Operations

8\. For two matrices $$A$$ and $$B$$ to be added, they must:

1. Have the same number of elements
2. Have the same order (same number of rows and columns)
3. Both be square matrices
4. Have the same determinant

<details>

<summary>Show me the answer</summary>

**Answer:** 2. Have the same order (same number of rows and columns)

**Explanation:** Matrix addition is defined only for matrices of the same order. If $$A$$ is $$m \times n$$ and $$B$$ is $$p \times q$$, then $$A + B$$ exists only if $$m = p$$ and $$n = q$$.

Example: If $$A = \begin{pmatrix} 1 & 2 \ 3 & 4 \end{pmatrix}$$ and $$B = \begin{pmatrix} 5 & 6 \ 7 & 8 \end{pmatrix}$$, then: $$A + B = \begin{pmatrix} 6 & 8 \ 10 & 12 \end{pmatrix}$$

</details>
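
As a minimal sketch in plain Python (the helper name `mat_add` is my own, not from the text), element-wise addition with the order check looks like:

```python
def mat_add(A, B):
    # Addition is defined only when both matrices have the same order.
    if len(A) != len(B) or len(A[0]) != len(B[0]):
        raise ValueError("matrices must have the same order")
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_add(A, B))  # [[6, 8], [10, 12]]
```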

9\. If $$A$$ is a $$2 \times 3$$ matrix and $$B$$ is a $$3 \times 4$$ matrix, then the product $$AB$$:

1. Is a $$2 \times 4$$ matrix
2. Is a $$3 \times 3$$ matrix
3. Is a $$2 \times 3$$ matrix
4. Does not exist

<details>

<summary>Show me the answer</summary>

**Answer:** 1. Is a $$2 \times 4$$ matrix

**Explanation:** For matrix multiplication $$AB$$ to be defined, the number of columns in $$A$$ must equal the number of rows in $$B$$. If $$A$$ is $$m \times n$$ and $$B$$ is $$n \times p$$, then $$AB$$ is $$m \times p$$.

Here, $$A$$ is $$2 \times 3$$ and $$B$$ is $$3 \times 4$$, so:

* Number of columns in $$A$$ = 3
* Number of rows in $$B$$ = 3 ✓ (condition satisfied)
* Resulting matrix $$AB$$ will be $$2 \times 4$$.

</details>
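
The row-by-column rule can be sketched in plain Python (the helper name `mat_mul` is an illustration, not from the text); note the inner-dimension check and the $$2 \times 4$$ result:

```python
def mat_mul(A, B):
    # Defined only when the columns of A equal the rows of B;
    # an m x n times n x p product is m x p.
    if len(A[0]) != len(B):
        raise ValueError("inner dimensions must match")
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2, 3], [4, 5, 6]]                       # 2 x 3
B = [[1, 0, 0, 1], [0, 1, 0, 1], [0, 0, 1, 1]]   # 3 x 4
C = mat_mul(A, B)
print(len(C), len(C[0]))  # 2 4
```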

10\. The transpose of matrix $$A = \begin{pmatrix} 1 & 2 & 3 \ 4 & 5 & 6 \end{pmatrix}$$ is:

1. $$\begin{pmatrix} 1 & 4 \ 2 & 5 \ 3 & 6 \end{pmatrix}$$
2. $$\begin{pmatrix} 1 & 2 \ 3 & 4 \ 5 & 6 \end{pmatrix}$$
3. $$\begin{pmatrix} 1 & 2 & 3 \ 4 & 5 & 6 \end{pmatrix}$$
4. $$\begin{pmatrix} 6 & 5 & 4 \ 3 & 2 & 1 \end{pmatrix}$$

<details>

<summary>Show me the answer</summary>

**Answer:** 1. $$\begin{pmatrix} 1 & 4 \ 2 & 5 \ 3 & 6 \end{pmatrix}$$

**Explanation:** The transpose of a matrix $$A$$, denoted $$A^T$$, is obtained by interchanging rows and columns. If $$A$$ is $$m \times n$$, then $$A^T$$ is $$n \times m$$.

For the given $$2 \times 3$$ matrix, interchanging rows and columns gives the $$3 \times 2$$ transpose: $$A^T = \begin{pmatrix} 1 & 4 \ 2 & 5 \ 3 & 6 \end{pmatrix}$$

</details>
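
In Python, the transpose is a one-liner over `zip` (a sketch; `transpose` is my own helper name):

```python
def transpose(A):
    # Entry (i, j) of A becomes entry (j, i) of the transpose.
    return [list(row) for row in zip(*A)]

A = [[1, 2, 3], [4, 5, 6]]
print(transpose(A))  # [[1, 4], [2, 5], [3, 6]]
```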

11\. For any square matrix $$A$$, the trace is defined as:

1. The product of diagonal elements
2. The sum of all elements
3. The sum of diagonal elements
4. The determinant of the matrix

<details>

<summary>Show me the answer</summary>

**Answer:** 3. The sum of diagonal elements

**Explanation:** The trace of a square matrix $$A$$, denoted tr($$A$$), is the sum of its diagonal elements. For an $$n \times n$$ matrix $$A = [a\_{ij}]$$: $$\text{tr}(A) = \sum\_{i=1}^{n} a\_{ii} = a\_{11} + a\_{22} + \cdots + a\_{nn}$$

Example: For $$A = \begin{pmatrix} 1 & 2 \ 3 & 4 \end{pmatrix}$$, tr($$A$$) = 1 + 4 = 5.

</details>
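
The definition translates directly into code (a sketch, assuming a square list-of-lists matrix; `trace` is my own helper name):

```python
def trace(A):
    # Sum of the main-diagonal entries of a square matrix.
    return sum(A[i][i] for i in range(len(A)))

print(trace([[1, 2], [3, 4]]))  # 1 + 4 = 5
```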

#### Determinants

12\. The determinant of a $$2 \times 2$$ matrix $$A = \begin{pmatrix} a & b \ c & d \end{pmatrix}$$ is:

1. $$ad + bc$$
2. $$ad - bc$$
3. $$ab - cd$$
4. $$ac - bd$$

<details>

<summary>Show me the answer</summary>

**Answer:** 2. $$ad - bc$$

**Explanation:** For a $$2 \times 2$$ matrix, the determinant is calculated as: $$\det(A) = \begin{vmatrix} a & b \ c & d \end{vmatrix} = ad - bc$$

Example: $$\begin{vmatrix} 3 & 5 \ 1 & 2 \end{vmatrix} = 3 \times 2 - 5 \times 1 = 1$$

</details>
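
The $$ad - bc$$ rule in code (a sketch; `det2` is a hypothetical helper name):

```python
def det2(M):
    # ad - bc for a 2 x 2 matrix [[a, b], [c, d]]
    (a, b), (c, d) = M
    return a * d - b * c

print(det2([[3, 5], [1, 2]]))  # 3*2 - 5*1 = 1
```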

13\. The determinant of a $$3 \times 3$$ matrix $$A = \begin{pmatrix} a & b & c \ d & e & f \ g & h & i \end{pmatrix}$$ using the first row is:

1. $$a(ei - fh) - b(di - fg) + c(dh - eg)$$
2. $$a(ei - fh) + b(di - fg) + c(dh - eg)$$
3. $$a(ei - fh) - b(di - fg) - c(dh - eg)$$
4. $$a(ei + fh) - b(di + fg) + c(dh + eg)$$

<details>

<summary>Show me the answer</summary>

**Answer:** 1. $$a(ei - fh) - b(di - fg) + c(dh - eg)$$

**Explanation:** The determinant of a $$3 \times 3$$ matrix can be expanded along any row or column. Expanding along the first row: $$\det(A) = a \begin{vmatrix} e & f \ h & i \end{vmatrix} - b \begin{vmatrix} d & f \ g & i \end{vmatrix} + c \begin{vmatrix} d & e \ g & h \end{vmatrix} = a(ei - fh) - b(di - fg) + c(dh - eg)$$

</details>
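
The first-row cofactor expansion can be checked with a short sketch (`det3` is my own helper name):

```python
def det3(M):
    # Cofactor expansion along the first row, with alternating signs.
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

print(det3([[1, 2, 3], [4, 5, 6], [7, 8, 10]]))  # 2 + 4 - 9 = -3
```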

14\. If all elements of a row (or column) of a square matrix are zeros, then its determinant is:

1. 1
2. 0
3. The product of diagonal elements
4. Cannot be determined

<details>

<summary>Show me the answer</summary>

**Answer:** 2. 0

**Explanation:** If any row or column of a matrix consists entirely of zeros, then the determinant of that matrix is zero. This property can be understood by expanding the determinant along that row or column.

Example: $$\begin{vmatrix} 1 & 2 & 3 \ 0 & 0 & 0 \ 7 & 8 & 9 \end{vmatrix} = 0$$, because expanding along the second row gives: $$0 \times (\text{some cofactor}) + 0 \times (\text{some cofactor}) + 0 \times (\text{some cofactor}) = 0$$.

</details>

15\. If two rows (or columns) of a square matrix are identical, then its determinant is:

1. 1
2. 0
3. Twice the value of the determinant
4. The square of the determinant

<details>

<summary>Show me the answer</summary>

**Answer:** 2. 0

**Explanation:** If two rows or two columns of a square matrix are identical, then its determinant is zero. This property comes from the fact that swapping two identical rows doesn't change the matrix, but swapping any two rows changes the sign of the determinant. Thus, $$\det(A) = -\det(A)$$, which implies $$\det(A) = 0$$.

Example: $$\begin{vmatrix} 1 & 2 \ 1 & 2 \end{vmatrix} = 1 \times 2 - 2 \times 1 = 0$$

</details>

#### Properties of Determinants

16\. If each element of a row (or column) of a determinant is multiplied by a constant $$k$$, then the value of the determinant:

1. Becomes $$k$$ times the original determinant
2. Becomes $$1/k$$ times the original determinant
3. Remains unchanged
4. Becomes $$k^2$$ times the original determinant

<details>

<summary>Show me the answer</summary>

**Answer:** 1. Becomes $$k$$ times the original determinant

**Explanation:** If each element of a row (or column) is multiplied by $$k$$, the determinant gets multiplied by $$k$$. This is because the determinant is a linear function of each row/column separately.

Example: If $$D = \begin{vmatrix} a & b \ c & d \end{vmatrix} = ad - bc$$, then: $$\begin{vmatrix} ka & kb \ c & d \end{vmatrix} = kad - kbc = k(ad - bc) = kD$$

</details>

17\. If any two rows (or columns) of a determinant are interchanged, then the value of the determinant:

1. Remains the same
2. Changes sign
3. Becomes zero
4. Becomes doubled

<details>

<summary>Show me the answer</summary>

**Answer:** 2. Changes sign

**Explanation:** Interchanging any two rows (or columns) of a determinant changes its sign. This property is fundamental to the alternating nature of determinants.

Example: Let $$D = \begin{vmatrix} a & b \ c & d \end{vmatrix} = ad - bc$$. If we interchange rows: $$\begin{vmatrix} c & d \ a & b \end{vmatrix} = cb - ad = -(ad - bc) = -D$$

</details>

18\. The determinant of a matrix and its transpose are:

1. Negatives of each other
2. Reciprocals of each other
3. Equal
4. Unrelated

<details>

<summary>Show me the answer</summary>

**Answer:** 3. Equal

**Explanation:** For any square matrix $$A$$, the determinant of $$A$$ equals the determinant of its transpose: $$\det(A) = \det(A^T)$$.

Example: For $$A = \begin{pmatrix} 1 & 2 \ 3 & 4 \end{pmatrix}$$: $$\det(A) = 1 \times 4 - 2 \times 3 = -2$$ and $$\det(A^T) = \begin{vmatrix} 1 & 3 \ 2 & 4 \end{vmatrix} = 1 \times 4 - 3 \times 2 = -2$$. Both determinants are equal to -2.

</details>

19\. If $$A$$ and $$B$$ are square matrices of the same order, then $$\det(AB)$$ equals:

1. $$\det(A) + \det(B)$$
2. $$\det(A) - \det(B)$$
3. $$\det(A) \times \det(B)$$
4. $$\det(A) / \det(B)$$

<details>

<summary>Show me the answer</summary>

**Answer:** 3. $$\det(A) \times \det(B)$$

**Explanation:** This is the multiplicative property of determinants. For square matrices $$A$$ and $$B$$ of the same order: $$\det(AB) = \det(A) \times \det(B)$$.

Example: Let $$A = \begin{pmatrix} 1 & 1 \ 0 & 1 \end{pmatrix}$$, so $$\det(A) = 1$$, and $$B = \begin{pmatrix} 2 & 0 \ 0 & 3 \end{pmatrix}$$, so $$\det(B) = 6$$. Then $$AB = \begin{pmatrix} 2 & 3 \ 0 & 3 \end{pmatrix}$$ with $$\det(AB) = 2 \times 3 - 3 \times 0 = 6$$.

Indeed, $$\det(A) \times \det(B) = 1 \times 6 = 6 = \det(AB)$$.

</details>
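
The multiplicative property is easy to spot-check numerically (a sketch with hypothetical helpers `det2` and `mul2`):

```python
def det2(M):
    # ad - bc for a 2 x 2 matrix [[a, b], [c, d]]
    (a, b), (c, d) = M
    return a * d - b * c

def mul2(A, B):
    # 2 x 2 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]   # det(A) = 1
B = [[2, 0], [0, 3]]   # det(B) = 6
print(det2(mul2(A, B)) == det2(A) * det2(B))  # True
```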

#### Matrix Inverses

20\. A square matrix $$A$$ is invertible (non-singular) if and only if:

1. $$\det(A) = 0$$
2. $$\det(A) \neq 0$$
3. $$A$$ is symmetric
4. $$A$$ is diagonal

<details>

<summary>Show me the answer</summary>

**Answer:** 2. $$\det(A) \neq 0$$

**Explanation:** A square matrix $$A$$ has an inverse (denoted $$A^{-1}$$) if and only if its determinant is non-zero. Such matrices are called non-singular or invertible. If $$\det(A) = 0$$, the matrix is singular and does not have an inverse.

The inverse satisfies: $$AA^{-1} = A^{-1}A = I$$, where $$I$$ is the identity matrix.

</details>

21\. The inverse of a $$2 \times 2$$ matrix $$A = \begin{pmatrix} a & b \ c & d \end{pmatrix}$$ is:

1. $$\frac{1}{ad-bc} \begin{pmatrix} d & -b \ -c & a \end{pmatrix}$$
2. $$\frac{1}{ad-bc} \begin{pmatrix} -d & b \ c & -a \end{pmatrix}$$
3. $$\frac{1}{ad+bc} \begin{pmatrix} d & -b \ -c & a \end{pmatrix}$$
4. $$\frac{1}{ad-bc} \begin{pmatrix} d & b \ c & a \end{pmatrix}$$

<details>

<summary>Show me the answer</summary>

**Answer:** 1. $$\frac{1}{ad-bc} \begin{pmatrix} d & -b \ -c & a \end{pmatrix}$$

**Explanation:** For a $$2 \times 2$$ matrix, the inverse formula is: $$A^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \ -c & a \end{pmatrix}$$

This formula works only when $$\det(A) = ad - bc \neq 0$$.

Verification: $$AA^{-1} = \begin{pmatrix} a & b \ c & d \end{pmatrix} \cdot \frac{1}{ad-bc} \begin{pmatrix} d & -b \ -c & a \end{pmatrix} = \frac{1}{ad-bc} \begin{pmatrix} ad-bc & 0 \ 0 & ad-bc \end{pmatrix} = \begin{pmatrix} 1 & 0 \ 0 & 1 \end{pmatrix}$$.

</details>
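
The swap-and-negate formula, sketched with exact rational arithmetic (`inv2` is a hypothetical helper name):

```python
from fractions import Fraction

def inv2(M):
    # Swap a and d, negate b and c, then divide by the determinant;
    # valid only when ad - bc != 0.
    (a, b), (c, d) = M
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular matrix has no inverse")
    k = Fraction(1, det)
    return [[k * d, -k * b], [-k * c, k * a]]

A = [[4, 7], [2, 6]]   # det = 4*6 - 7*2 = 10
print(inv2(A))         # [[Fraction(3, 5), Fraction(-7, 10)], [Fraction(-1, 5), Fraction(2, 5)]]
```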

22\. If $$A$$ is an invertible matrix, then $$(A^{-1})^{-1}$$ equals:

1. $$A$$
2. $$A^T$$
3. $$I$$
4. $$-A$$

<details>

<summary>Show me the answer</summary>

**Answer:** 1. $$A$$

**Explanation:** The inverse of an inverse gives back the original matrix. This is similar to how $$(x^{-1})^{-1} = x$$ for non-zero numbers.

Formally, if $$A$$ is invertible, then $$A^{-1}$$ exists and $$AA^{-1} = A^{-1}A = I$$. By definition, $$(A^{-1})^{-1}$$ is the matrix that when multiplied by $$A^{-1}$$ gives $$I$$. Since $$A(A^{-1}) = I$$, we have $$(A^{-1})^{-1} = A$$.

</details>

23\. For invertible matrices $$A$$ and $$B$$ of the same order, $$(AB)^{-1}$$ equals:

1. $$A^{-1}B^{-1}$$
2. $$B^{-1}A^{-1}$$
3. $$AB$$
4. $$BA$$

<details>

<summary>Show me the answer</summary>

**Answer:** 2. $$B^{-1}A^{-1}$$

**Explanation:** The inverse of a product is the product of the inverses in reverse order. To verify: $$(AB)(B^{-1}A^{-1}) = A(BB^{-1})A^{-1} = AIA^{-1} = AA^{-1} = I$$

Similarly: $$(B^{-1}A^{-1})(AB) = B^{-1}(A^{-1}A)B = B^{-1}IB = B^{-1}B = I$$

Thus, $$B^{-1}A^{-1}$$ is indeed the inverse of $$AB$$.

</details>

#### Solving Linear Equations

24\. The system of linear equations $$AX = B$$ has a unique solution if:

1. $$\det(A) = 0$$
2. $$\det(A) \neq 0$$
3. $$A$$ is a square matrix
4. $$B$$ is the zero vector

<details>

<summary>Show me the answer</summary>

**Answer:** 2. $$\det(A) \neq 0$$

**Explanation:** For a system of linear equations $$AX = B$$, where $$A$$ is an $$n \times n$$ coefficient matrix:

* If $$\det(A) \neq 0$$, then $$A$$ is invertible, and the system has a unique solution: $$X = A^{-1}B$$.
* If $$\det(A) = 0$$, then either there is no solution or infinitely many solutions.

This result is known as Cramer's Rule when applied to each variable individually.

</details>

25\. Using Cramer's Rule, the solution for $$x$$ in the system: $$a\_1x + b\_1y = c\_1$$ $$a\_2x + b\_2y = c\_2$$ is:

1. $$\frac{\begin{vmatrix} c\_1 & b\_1 \ c\_2 & b\_2 \end{vmatrix}}{\begin{vmatrix} a\_1 & b\_1 \ a\_2 & b\_2 \end{vmatrix}}$$
2. $$\frac{\begin{vmatrix} a\_1 & c\_1 \ a\_2 & c\_2 \end{vmatrix}}{\begin{vmatrix} a\_1 & b\_1 \ a\_2 & b\_2 \end{vmatrix}}$$
3. $$\frac{\begin{vmatrix} a\_1 & b\_1 \ a\_2 & b\_2 \end{vmatrix}}{\begin{vmatrix} c\_1 & b\_1 \ c\_2 & b\_2 \end{vmatrix}}$$
4. $$\frac{\begin{vmatrix} b\_1 & c\_1 \ b\_2 & c\_2 \end{vmatrix}}{\begin{vmatrix} a\_1 & b\_1 \ a\_2 & b\_2 \end{vmatrix}}$$

<details>

<summary>Show me the answer</summary>

**Answer:** 1. $$\frac{\begin{vmatrix} c\_1 & b\_1 \ c\_2 & b\_2 \end{vmatrix}}{\begin{vmatrix} a\_1 & b\_1 \ a\_2 & b\_2 \end{vmatrix}}$$

**Explanation:** Cramer's Rule states that for the system: $$a\_1x + b\_1y = c\_1$$ $$a\_2x + b\_2y = c\_2$$

Let $$D = \begin{vmatrix} a\_1 & b\_1 \ a\_2 & b\_2 \end{vmatrix}$$ (determinant of the coefficient matrix).

Let $$D\_x = \begin{vmatrix} c\_1 & b\_1 \ c\_2 & b\_2 \end{vmatrix}$$ (replace the x-coefficients with constants).

Let $$D\_y = \begin{vmatrix} a\_1 & c\_1 \ a\_2 & c\_2 \end{vmatrix}$$ (replace the y-coefficients with constants).

If $$D \neq 0$$, then: $$x = \frac{D\_x}{D}, \quad y = \frac{D\_y}{D}$$

</details>
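
Cramer's Rule for the $$2 \times 2$$ case, as a sketch (the function name and argument order are my own):

```python
def cramer_2x2(a1, b1, c1, a2, b2, c2):
    # Solves a1*x + b1*y = c1 and a2*x + b2*y = c2 via determinant ratios.
    D = a1 * b2 - b1 * a2
    if D == 0:
        raise ValueError("no unique solution (D = 0)")
    Dx = c1 * b2 - b1 * c2   # constants replace the x-coefficients
    Dy = a1 * c2 - c1 * a2   # constants replace the y-coefficients
    return Dx / D, Dy / D

# 2x + 3y = 8 and x - y = -1  ->  x = 1, y = 2
print(cramer_2x2(2, 3, 8, 1, -1, -1))  # (1.0, 2.0)
```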

#### Special Matrices and Properties

26\. An orthogonal matrix satisfies:

1. $$A^T = A$$
2. $$A^T = -A$$
3. $$A^T = A^{-1}$$
4. $$A^2 = A$$

<details>

<summary>Show me the answer</summary>

**Answer:** 3. $$A^T = A^{-1}$$

**Explanation:** An orthogonal matrix is a square matrix whose transpose equals its inverse: $$A^T = A^{-1}$$. Equivalently, $$AA^T = A^TA = I$$.

Properties of orthogonal matrices:

1. The columns (and rows) are orthonormal vectors (unit vectors that are mutually perpendicular).
2. $$\det(A) = \pm 1$$.
3. Preserves lengths and angles: for any vector $$x$$, $$|Ax| = |x|$$.

Example: Rotation matrices are orthogonal.

</details>

27\. For an orthogonal matrix $$A$$, the determinant $$\det(A)$$ equals:

1. 0
2. 1
3. -1
4. 1 or -1

<details>

<summary>Show me the answer</summary>

**Answer:** 4. 1 or -1

**Explanation:** For an orthogonal matrix $$A$$, we have $$AA^T = I$$. Taking determinants on both sides: $$\det(AA^T) = \det(I)$$, so $$\det(A)\det(A^T) = 1$$. Since $$\det(A^T) = \det(A)$$, we get $$[\det(A)]^2 = 1$$, and thus $$\det(A) = \pm 1$$.

Orthogonal matrices with determinant +1 are called proper orthogonal matrices (rotations). Those with determinant -1 are called improper orthogonal matrices (reflections).

</details>

28\. A nilpotent matrix is one where:

1. $$A^2 = A$$
2. $$A^2 = I$$
3. $$A^k = 0$$ for some positive integer $$k$$
4. $$A^T = A$$

<details>

<summary>Show me the answer</summary>

**Answer:** 3. $$A^k = 0$$ for some positive integer $$k$$

**Explanation:** A nilpotent matrix is a square matrix $$A$$ such that $$A^k = 0$$ for some positive integer $$k$$. The smallest such $$k$$ is called the index of nilpotency.

Properties:

1. All eigenvalues of a nilpotent matrix are 0.
2. The determinant and trace are both 0.
3. The only nilpotent matrix that is diagonalizable is the zero matrix.

Example: $$A = \begin{pmatrix} 0 & 1 \ 0 & 0 \end{pmatrix}$$ is nilpotent because $$A^2 = \begin{pmatrix} 0 & 0 \ 0 & 0 \end{pmatrix} = 0$$.

</details>
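
The index-2 example can be verified directly (a sketch; `mul2` is a hypothetical helper):

```python
def mul2(A, B):
    # 2 x 2 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 1], [0, 0]]
print(mul2(A, A))  # [[0, 0], [0, 0]] -- A squared is the zero matrix
```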

29\. The adjoint (adjugate) of a matrix $$A$$ is related to its inverse by:

1. $$A^{-1} = \frac{1}{\det(A)} \cdot \text{adj}(A)$$
2. $$A^{-1} = \det(A) \cdot \text{adj}(A)$$
3. $$A^{-1} = \text{adj}(A)$$
4. $$A^{-1} = \frac{\text{adj}(A)}{\det(A)}$$

<details>

<summary>Show me the answer</summary>

**Answer:** 1. $$A^{-1} = \frac{1}{\det(A)} \cdot \text{adj}(A)$$

**Explanation:** The adjoint (or adjugate) of a square matrix $$A$$, denoted adj($$A$$), is the transpose of the cofactor matrix of $$A$$.

The relationship between the inverse and adjoint is: $$A^{-1} = \frac{1}{\det(A)} \cdot \text{adj}(A)$$

This formula holds when $$\det(A) \neq 0$$.

Additionally, we have: $$A \cdot \text{adj}(A) = \text{adj}(A) \cdot A = \det(A) \cdot I$$.

</details>

30\. The rank of a matrix is:

1. The number of non-zero rows in its row echelon form
2. The number of columns in the matrix
3. The determinant of the matrix
4. The trace of the matrix

<details>

<summary>Show me the answer</summary>

**Answer:** 1. The number of non-zero rows in its row echelon form

**Explanation:** The rank of a matrix is the maximum number of linearly independent rows (or columns) in the matrix. Equivalently, it is the number of non-zero rows in its row echelon form or reduced row echelon form.

Properties:

1. Rank($$A$$) ≤ min(number of rows, number of columns)
2. Rank($$A$$) = Rank($$A^T$$)
3. For an $$n \times n$$ matrix $$A$$, if Rank($$A$$) = $$n$$, then $$A$$ is invertible (non-singular).

The rank gives important information about the solutions of linear systems and the invertibility of matrices.

</details>
