The determinant of a matrix represents the *scaling factor* applied by the linear transformation the matrix describes. Geometrically, it measures how the transformation stretches or compresses areas (2D), volumes (3D), or higher-dimensional volumes.
For example, consider a transformation of the unit square in two dimensions:
$ \begin{bmatrix}1 & 0 \\ 0 & 1 \end{bmatrix} \to \begin{bmatrix}a & b \\ c & d \end{bmatrix} $
The determinant of a $2 \times 2$ matrix is computed as:
$ A= \begin{bmatrix} a & b\\ c & d \end{bmatrix} \implies \det (A)=ad-bc $
This value corresponds to the area of the parallelogram formed by the transformed unit square.
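As a quick numeric check, the $ad-bc$ formula can be compared against `numpy.linalg.det` (the matrix entries below are arbitrary example values):

```python
import numpy as np

def det2(a, b, c, d):
    """Determinant of [[a, b], [c, d]] via the ad - bc formula."""
    return a * d - b * c

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
manual = det2(3.0, 1.0, 1.0, 2.0)  # 3*2 - 1*1 = 5
print(manual)                      # area of the transformed unit square
print(np.linalg.det(A))            # same value, computed by numpy
```

Both values agree: the unit square is mapped to a parallelogram of area $5$.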
![[matrix-determinant.png|center|300]]
## Derivation
The determinant can be derived from the [[Inverse Matrix#^81f95f|matrix inverse property]] $A \cdot A^{-1}=\mathbf I$, where $\mathbf I$ is the identity matrix. To get from $A$ back to $\mathbf I$, the inverse swaps the diagonal entries, negates the off-diagonal entries, and scales the result by $\frac{1}{\det(A)}$.
$
A= \begin{bmatrix} a&b \\ c&d \end{bmatrix}, \qquad
A^{-1}=\frac{1}{\det (A)} \begin{bmatrix} d&-b \\ -c&a \end{bmatrix}$
Compute $A \cdot A^{-1}$:
$
A \cdot A^{-1} =\begin{bmatrix} a&b \\ c&d \end{bmatrix} \cdot
\frac{1}{\det (A)} \begin{bmatrix} d&-b \\ -c&a \end{bmatrix}
$
Resulting in:
$ A \cdot A^{-1} =
\frac{1}{\det (A)} \cdot
\begin{bmatrix} ad-bc & 0 \\ 0 & ad-bc \end{bmatrix} $
Since this expression must equal the identity matrix, the diagonal elements need to be normalized to $1$.
$ \frac{1}{\det (A)} \cdot (ad-bc)=1 \quad \implies \quad \det(A) = ad-bc$
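The derivation can be verified numerically: building $A^{-1}$ from the swapped-and-negated entries scaled by $\frac{1}{\det(A)}$ should recover the identity (the entries of $A$ below are arbitrary example values):

```python
import numpy as np

a, b, c, d = 2.0, 1.0, 1.0, 3.0
A = np.array([[a, b],
              [c, d]])

det = a * d - b * c                          # determinant from the formula
A_inv = (1.0 / det) * np.array([[d, -b],     # swap diagonal, negate off-diagonal,
                                [-c, a]])    # scale by 1/det(A)

print(A @ A_inv)  # should be (numerically) the 2x2 identity matrix
```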
>[!note]
>Finding the determinant of a matrix larger than $2 \times 2$ is more involved (e.g. via cofactor expansion), but is handled routinely by software.
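For larger matrices, `numpy.linalg.det` does the work; the $3 \times 3$ matrix below is an arbitrary example whose determinant can also be checked by hand via cofactor expansion:

```python
import numpy as np

B = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 1.0]])

# Cofactor expansion along the first row:
# 2*(3*1 - 0*1) - 0*(1*1 - 0*0) + 1*(1*1 - 3*0) = 6 + 0 + 1 = 7
print(np.linalg.det(B))
```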
## Linear Independence
The determinant provides insights into the linear independence of the matrix's column vectors:
- $\det(A)=0$: Vectors are *linearly dependent*, meaning one vector is a linear combination of others. Such a matrix collapses a dimension during transformation and does not have an inverse.
- $\det(A) \neq 0$: Vectors are [[Linearly Independent Vectors|linearly independent]], and the transformation can be reversed using the inverse matrix.
Example: Consider a degenerate matrix
$
a_1= \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \quad
a_2= \begin{bmatrix} 2 \\ 2 \end{bmatrix}, \quad
A= \begin{bmatrix} 1 & 2 \\ 1& 2 \end{bmatrix} $
Compute the determinant:
$ \det(A)=(1 \cdot 2)-(2 \cdot 1)=0 $
Since the determinant is $0$, the column vectors $a_1$ and $a_2$ are linearly dependent, and $A$ is not invertible.
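The same conclusion can be reproduced with numpy: the determinant of the degenerate matrix is zero, and attempting to invert it raises an error:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [1.0, 2.0]])

print(np.linalg.det(A))  # 0: the columns are linearly dependent

try:
    np.linalg.inv(A)     # inverting a singular matrix fails
except np.linalg.LinAlgError:
    print("A is singular and has no inverse")
```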