Inverse of Invertible 2 x 2 Real Square Matrix

Theorem
Let $\mathbf A$ be an invertible $2 \times 2$ real square matrix defined as:


 * $\mathbf A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$

Then its inverse matrix $\mathbf A^{-1}$ is:


 * $\mathbf A^{-1} = \dfrac 1 {\map \det {\mathbf A} } \begin {pmatrix} d & -b \\ -c & a \end {pmatrix}$

where $\map \det {\mathbf A} = a d - b c$, which is nonzero as $\mathbf A$ is invertible.
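As a quick numerical check (not part of the proof), the following Python sketch applies the closed-form inverse to a sample invertible matrix (entries chosen so that $a d - b c \ne 0$) and multiplies back, expecting the identity:

```python
# Sample entries of an invertible 2 x 2 matrix: det = a*d - b*c = 4 != 0
a, b, c, d = 3.0, 5.0, 1.0, 3.0
det = a * d - b * c

# Closed-form inverse: A^{-1} = (1/det) [[d, -b], [-c, a]]
inv = [[d / det, -b / det],
       [-c / det, a / det]]

# Multiply A by the claimed inverse; the product should be the identity.
A = [[a, b], [c, d]]
prod = [[sum(A[i][k] * inv[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
print(prod)  # ≈ [[1.0, 0.0], [0.0, 1.0]]
```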

Proof
We construct $\begin {pmatrix} \mathbf A & \mathbf I \end {pmatrix}$:


 * $\begin {pmatrix} \mathbf A & \mathbf I \end {pmatrix} = \paren {\begin {array} {cc|cc} a & b & 1 & 0 \\ c & d & 0 & 1 \end {array} }$

In the following, $\sequence {e_n}_{n \mathop \ge 1}$ denotes the sequence of elementary row operations that are to be applied to $\begin {pmatrix} \mathbf A & \mathbf I \end {pmatrix}$.

The matrix that results from having applied $e_1$ to $e_k$ in order is denoted $\begin {pmatrix} \mathbf A_k & \mathbf B_k \end {pmatrix}$.

For the case $a \ne 0$, the following sequence of elementary row operations achieves the reduction:

 * $e_1$: $r_1 \to \dfrac 1 a r_1$
 * $e_2$: $r_2 \to r_2 - c r_1$
 * $e_3$: $r_2 \to \dfrac a {a d - b c} r_2$
 * $e_4$: $r_1 \to r_1 - \dfrac b a r_2$

If $a = 0$, then $c \ne 0$ as $\mathbf A$ is invertible, and a preliminary interchange of $r_1$ and $r_2$ reduces this to the case above.

It is seen that $\begin {pmatrix} \mathbf A_4 & \mathbf B_4 \end {pmatrix}$ is the required reduced echelon form:
 * $\mathbf A_4 = \mathbf I$

Hence by the Matrix Inverse Algorithm:

 * $\mathbf A^{-1} = \mathbf B_4 = \dfrac 1 {\map \det {\mathbf A} } \begin {pmatrix} d & -b \\ -c & a \end {pmatrix}$

Hence the result.
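The row reduction in the proof can be traced numerically. The Python sketch below carries out the four elementary row operations (case $a \ne 0$) on the augmented matrix $\begin {pmatrix} \mathbf A & \mathbf I \end {pmatrix}$ for a sample matrix; the sample entries are illustrative, not from the theorem:

```python
# Sample entries with a != 0 and det = a*d - b*c = 4 != 0
a, b, c, d = 3.0, 5.0, 1.0, 3.0
M = [[a, b, 1.0, 0.0],
     [c, d, 0.0, 1.0]]   # the augmented matrix [A | I]

# e1: r1 -> (1/a) r1
M[0] = [x / M[0][0] for x in M[0]]
# e2: r2 -> r2 - c r1
f = M[1][0]
M[1] = [y - f * x for x, y in zip(M[0], M[1])]
# e3: r2 -> (a / (a*d - b*c)) r2, i.e. divide by the pivot (a*d - b*c)/a
M[1] = [y / M[1][1] for y in M[1]]
# e4: r1 -> r1 - (b/a) r2
f = M[0][1]
M[0] = [x - f * y for y, x in zip(M[1], M[0])]

# Left block is now I; right block is A^{-1} = (1/det) [[d, -b], [-c, a]]
print([row[2:] for row in M])  # ≈ [[0.75, -1.25], [-0.25, 0.75]]
```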