# Cramer's Rule

## Theorem

Let $n \in \N$.

Let $b_1, b_2, \dots, b_n$ be real numbers.

Let $\mathbf b = \tuple {b_1, b_2, \dots, b_n}^T$.

Let $x_1, x_2, \dots, x_n$ be real numbers.

Let $\mathbf x = \tuple {x_1, x_2, \dots, x_n}^T$.

Let $A$ be an invertible $n \times n$ matrix with coefficients in $\R$.

For each $i \in \set {1, \dots, n}$, let $A_i$ be the matrix obtained from $A$ by replacing its $i$th column with $\mathbf b$.

Let:

$A \mathbf x = \mathbf b$

Then:

$x_i = \dfrac {\map \det {A_i} } {\map \det A}$

for each $i \in \set {1, \dots, n}$.
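Before the proof, a concrete illustration may help. The following Python sketch implements the statement directly: `det` computes determinants by Laplace expansion along the first row, and `cramer` forms each $A_i$ and applies the formula. The example system and the helper names are illustrative assumptions, not part of the theorem.

```python
from fractions import Fraction

def det(M):
    """Determinant by Laplace expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def cramer(A, b):
    """Solve A x = b by Cramer's rule; A is assumed invertible."""
    d = det(A)
    x = []
    for i in range(len(A)):
        # A_i: replace the i-th column of A with b
        A_i = [row[:i] + [b[r]] + row[i + 1:] for r, row in enumerate(A)]
        x.append(Fraction(det(A_i), d))
    return x

# Hypothetical example system: 2x + y = 5, x + 3y = 10
A = [[2, 1], [1, 3]]
b = [5, 10]
print(cramer(A, b))  # → [Fraction(1, 1), Fraction(3, 1)]
```

Here $\det{A} = 5$, $\det{A_1} = 5$ and $\det{A_2} = 15$, giving $x_1 = 1$ and $x_2 = 3$, which indeed satisfies both equations.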

## Proof

Let $C$ be the cofactor matrix of $A$. Then by definition of adjugate matrix, $C^T$ is the adjugate matrix of $A$.

Therefore:

$A \cdot C^T = \det{A} \cdot I_n$

by Matrix Product with Adjugate Matrix.

Because $A$ is invertible, $A^{-1}$ exists, and $\det{A}$ has a multiplicative inverse $1/\det{A}$ by Matrix is Invertible iff Determinant has Multiplicative Inverse.

Therefore:

$A^{-1} = \frac{1}{\det{A}}\cdot C^T.$
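This expression for the inverse can be spot-checked numerically. A minimal Python sketch, assuming a hypothetical invertible $3 \times 3$ matrix and exact rational arithmetic via `fractions.Fraction`: it builds the cofactor matrix $C$, forms $\frac{1}{\det{A}} C^T$, and confirms that the product with $A$ is $I_n$.

```python
from fractions import Fraction

def det(M):
    """Determinant by Laplace expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def cofactor_matrix(M):
    """C[r][s] = (-1)^(r+s) * det(M with row r and column s deleted)."""
    n = len(M)
    return [[(-1) ** (r + s)
             * det([row[:s] + row[s + 1:]
                    for k, row in enumerate(M) if k != r])
             for s in range(n)]
            for r in range(n)]

def matmul(X, Y):
    """Ordinary product of two square matrices."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Hypothetical invertible 3x3 example
A = [[2, 0, 1], [1, 3, 2], [0, 1, 1]]
d = det(A)                                   # d = 3, nonzero
C = cofactor_matrix(A)
C_T = [list(col) for col in zip(*C)]         # the adjugate of A
A_inv = [[Fraction(e, d) for e in row] for row in C_T]
print(matmul(A, A_inv))                      # the 3x3 identity matrix
```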

Since $A \mathbf x = \mathbf b$ by hypothesis, multiplying both sides on the left by $A^{-1}$ gives

$\mathbf x = A^{-1} \mathbf b.$

Therefore,

$\mathbf x = \left(\frac{1}{\det{A}}\cdot C^T\right) \mathbf b.$

For each $r,s \in \{1, \dots, n\}$, let $A_{rs}$ denote the element of $C$ whose index is $(r,s)$. Then by the definition of transpose,

$C^T = \begin{bmatrix} A_{11} & A_{21} & \dots & A_{n1} \\ A_{12} & A_{22} & \dots & A_{n2} \\ \vdots & \vdots & \ddots & \vdots \\ A_{1n} & A_{2n} & \dots & A_{nn} \end{bmatrix}.$

By the definition of cofactor matrix, $A_{rs}$ is the cofactor of the element of $A$ whose index is $(r,s)$.

Let $i \in \{1, \dots, n\}$. Since the $(i, j)$-th element of $C^T$ is $A_{ji}$, the definition of matrix product gives

$x_i = \frac{1}{\det{A}}\left(\sum_{j = 1}^n A_{ji} b_j\right).$

Recall that $A_i$ is the matrix obtained from $A$ by replacing the $i$-th column with $\mathbf{b}$. Then if

$A = \begin{bmatrix} a_{1,1} & a_{1,2} & \dots & a_{1,n} \\ a_{2,1} & a_{2,2} & \dots & a_{2,n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n,1} & a_{n,2} & \dots & a_{n,n} \end{bmatrix}$

then

$A_i = \begin{bmatrix} a_{1,1} & a_{1,2} & \dots & a_{1, i-1} & b_1 & a_{1, i+1} & \dots & a_{1,n} \\ a_{2,1} & a_{2,2} & \dots & a_{2, i-1} & b_2 & a_{2, i+1} & \dots & a_{2,n} \\ \vdots & \vdots & \ddots & \vdots & \vdots & \vdots & \ddots & \vdots \\ a_{n,1} & a_{n,2} & \dots & a_{n, i-1} & b_n & a_{n, i+1} & \dots & a_{n,n} \end{bmatrix}.$

For all $j \in \{1, \dots, n\}$ the matrix obtained by deleting the $j$-th row and the $i$-th column of $A$ is equal to the matrix obtained by deleting the $j$-th row and $i$-th column of $A_i$. Therefore, by the definition of a cofactor, the cofactor of $(A_i)_{j,i}$ is equal to $A_{ji}$.

By the Expansion Theorem for Determinants, expanding $\det{A_i}$ along its $i$-th column (whose entries are $b_1, b_2, \dots, b_n$), we now have

$\det{A_i} = \sum_{j=1}^n A_{ji} b_j.$
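The identity just derived, that $\det{A_i} = \sum_j A_{ji} b_j$ because the column-$i$ cofactors of $A_i$ and $A$ coincide, can be spot-checked numerically. A minimal Python sketch, where the $3 \times 3$ matrix, the vector $\mathbf b$, and the chosen column are illustrative assumptions (indices are 0-based in the code):

```python
def det(M):
    """Determinant by Laplace expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def cofactor(M, r, s):
    """(-1)^(r+s) times the det of M with row r and column s deleted."""
    minor = [row[:s] + row[s + 1:] for k, row in enumerate(M) if k != r]
    return (-1) ** (r + s) * det(minor)

# Hypothetical example; the column index i is 0-based here
A = [[2, 0, 1], [1, 3, 2], [0, 1, 1]]
b = [4, 5, 6]
i = 1
# A_i: A with column i replaced by b
A_i = [row[:i] + [b[r]] + row[i + 1:] for r, row in enumerate(A)]
lhs = det(A_i)                                   # det(A_i) computed directly
rhs = sum(cofactor(A, j, i) * b[j] for j in range(len(A)))
print(lhs, rhs)  # → -12 -12
```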

Therefore,

$x_i = \frac{\det{A_i}}{\det{A}}$

as desired.

## Source of Name

This entry was named for Gabriel Cramer.