# Determinant of Matrix Product

## Theorem

Let $\mathbf A = \left[{a}\right]_n$ and $\mathbf B = \left[{b}\right]_n$ be square matrices of order $n$.

Let $\det \left({\mathbf A}\right)$ and $\det \left({\mathbf B}\right)$ be the determinants of $\mathbf A$ and $\mathbf B$ respectively.

Let $\mathbf A \mathbf B$ be the (conventional) matrix product of $\mathbf A$ and $\mathbf B$.

Then:

$\det \left({\mathbf A \mathbf B}\right) = \det \left({\mathbf A}\right) \det \left({\mathbf B}\right)$

That is, the determinant of the product is equal to the product of the determinants.
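As a quick numerical sanity check (not part of the proof), the identity can be verified with NumPy on a pair of arbitrary example matrices; the sizes and seed here are hypothetical choices:

```python
import numpy as np

# Check det(AB) = det(A) det(B) on random 4x4 real matrices.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

lhs = np.linalg.det(A @ B)
rhs = np.linalg.det(A) * np.linalg.det(B)
assert np.isclose(lhs, rhs)
```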

## Proof 1

This proof assumes that $\mathbf A$ and $\mathbf B$ are $n \times n$-matrices over a commutative ring with unity $\left({R, +, \circ}\right)$.

Let $\mathbf C = \left[{c}\right]_n = \mathbf A \mathbf B$.

From Square Matrix Row Equivalent to Triangular Matrix, it follows that $\mathbf A$ can be converted into an upper triangular matrix $\mathbf A'$ by a finite sequence of elementary row operations $\hat o_1, \ldots, \hat o_{m'}$.

Let $\mathbf C'$ denote the matrix that results from using $\hat o_1, \ldots, \hat o_{m'}$ on $\mathbf C$.

From Elementary Row Operations Commute with Matrix Multiplication, it follows that $\mathbf C' = \mathbf A' \mathbf B$.

Effect of Sequence of Elementary Row Operations on Determinant shows that there exists $\alpha \in R$ such that:

$\alpha \det \left({\mathbf A'}\right) = \det \left({\mathbf A}\right)$
$\alpha \det \left({\mathbf C'}\right) = \det \left({\mathbf C}\right)$
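The scalar $\alpha$ arises because each elementary row operation multiplies the determinant by a fixed nonzero factor. A minimal numerical sketch over $\R$, with hypothetical example matrices, illustrates the three operation types:

```python
import numpy as np

# Each elementary row operation scales the determinant by a fixed factor,
# so a sequence of them scales it by the product of those factors (alpha).
M = np.array([[2.0, 1.0], [3.0, 4.0]])
d = np.linalg.det(M)  # = 2*4 - 1*3 = 5

# Swapping two rows multiplies the determinant by -1.
swapped = M[[1, 0], :]
assert np.isclose(np.linalg.det(swapped), -d)

# Scaling a row by c multiplies the determinant by c.
scaled = M.copy()
scaled[0] *= 5.0
assert np.isclose(np.linalg.det(scaled), 5.0 * d)

# Adding a multiple of one row to another leaves the determinant unchanged.
added = M.copy()
added[0] += 7.0 * added[1]
assert np.isclose(np.linalg.det(added), d)
```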

Let $\mathbf B^\intercal$ be the transpose of $\mathbf B$.

From Transpose of Matrix Product, it follows that:

$\left({\mathbf C'}\right)^\intercal = \left({\mathbf A' \mathbf B}\right)^\intercal = \mathbf B^\intercal \left({\mathbf A'}\right)^\intercal$

From Square Matrix Row Equivalent to Triangular Matrix, it follows that $\mathbf B^\intercal$ can be converted into a lower triangular matrix $\left({\mathbf B^\intercal}\right)'$ by a finite sequence of elementary row operations $\hat p_1, \ldots, \hat p_{m''}$.

Let $\mathbf C''$ denote the matrix that results from using $\hat p_1, \ldots, \hat p_{m''}$ on $\left({\mathbf C'}\right)^\intercal$.

From Elementary Row Operations Commute with Matrix Multiplication, it follows that:

$\mathbf C'' = \left({\mathbf B^\intercal}\right)' \left({\mathbf A'}\right)^\intercal$

Effect of Sequence of Elementary Row Operations on Determinant shows that there exists $\beta \in R$ such that:

$\beta \det \left({\left({\mathbf B^\intercal}\right)'}\right) = \det \left({\mathbf B^\intercal}\right)$
$\beta \det \left({\mathbf C''}\right) = \det \left({ \left({\mathbf C'}\right)^\intercal }\right)$

From Transpose of Upper Triangular Matrix is Lower Triangular, it follows that $\left({\mathbf A'}\right)^\intercal$ is a lower triangular matrix.

Then Product of Triangular Matrices shows that $\left({\mathbf B^\intercal}\right)' \left({\mathbf A'}\right)^\intercal$ is a lower triangular matrix whose diagonal elements are the products of the diagonal elements of $\left({\mathbf B^\intercal}\right)'$ and $\left({\mathbf A'}\right)^\intercal$.

From Determinant of Triangular Matrix, we have that $\det \left({\left({\mathbf A'}\right)^\intercal}\right)$, $\det \left({\left({\mathbf B^\intercal}\right)' }\right)$, and $\det \left({\left({\mathbf B^\intercal}\right)' \left({\mathbf A'}\right)^\intercal }\right)$ are equal to the product of their diagonal elements.
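This property of triangular matrices is easy to confirm numerically; the entries below are a hypothetical example:

```python
import numpy as np

# The determinant of a triangular matrix is the product of its
# diagonal entries: here 2 * 3 * 7 = 42.
U = np.array([[2.0, 5.0, 1.0],
              [0.0, 3.0, 4.0],
              [0.0, 0.0, 7.0]])
assert np.isclose(np.linalg.det(U), 2.0 * 3.0 * 7.0)
```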

Combining these results shows that:

$\det \left({\left({\mathbf B^\intercal}\right)' \left({\mathbf A'}\right)^\intercal}\right) = \det \left({\left({\mathbf B^\intercal}\right)'}\right) \det \left({\left({\mathbf A'}\right)^\intercal }\right)$

Then:

$$\begin{aligned}
\det \left({\mathbf C}\right) &= \alpha \det \left({\mathbf C'}\right) \\
&= \alpha \det \left({\left({\mathbf C'}\right)^\intercal}\right) && \text{by Determinant of Transpose} \\
&= \alpha \beta \det \left({\mathbf C''}\right) \\
&= \alpha \beta \det \left({\left({\mathbf B^\intercal}\right)' \left({\mathbf A'}\right)^\intercal}\right) \\
&= \alpha \beta \det \left({\left({\mathbf B^\intercal}\right)'}\right) \det \left({\left({\mathbf A'}\right)^\intercal}\right) \\
&= \alpha \det \left({\left({\mathbf A'}\right)^\intercal}\right) \, \beta \det \left({\left({\mathbf B^\intercal}\right)'}\right) && \text{by commutativity of the ring product in } R \\
&= \alpha \det \left({\mathbf A'}\right) \det \left({\mathbf B^\intercal}\right) && \text{partly by Determinant of Transpose} \\
&= \det \left({\mathbf A}\right) \det \left({\mathbf B}\right) && \text{partly by Determinant of Transpose}
\end{aligned}$$

$\blacksquare$

## Proof 2

Consider two cases:

$(1): \quad \mathbf A$ is not invertible.
$(2): \quad \mathbf A$ is invertible.

Proof of case $(1)$:

Assume $\mathbf A$ is not invertible.

Then $\det \left({\mathbf A}\right) = 0$.

Also, if $\mathbf A$ is not invertible, then neither is $\mathbf A \mathbf B$, and so:

$\det\left({\mathbf A \mathbf B}\right) = 0$

Thus:

$\det \left({\mathbf A \mathbf B}\right) = 0 = 0 \cdot \det \left({\mathbf B}\right) = \det \left({\mathbf A}\right) \cdot \det \left({\mathbf B}\right)$
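A small numerical illustration of case $(1)$, with a hypothetical rank-deficient matrix, shows both sides vanishing:

```python
import numpy as np

# A has proportional rows, so it is singular: det(A) = 0,
# and det(AB) = 0 as well, matching det(A) * det(B) = 0.
A = np.array([[1.0, 2.0], [2.0, 4.0]])  # rank 1
B = np.array([[3.0, 1.0], [0.0, 5.0]])
assert np.isclose(np.linalg.det(A), 0.0)
assert np.isclose(np.linalg.det(A @ B), 0.0)
```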

Proof of case $(2)$:

Assume $\mathbf A$ is invertible.

Then $\mathbf A$ can be expressed as a product of elementary matrices:

$\mathbf A = \mathbf E^{k} \mathbf E^{k-1} \cdots \mathbf E^{1}$

So:

$\det \left({\mathbf A\mathbf B}\right) = \det \left({\mathbf E^{k}\mathbf E^{k-1} \cdots \mathbf E^{1} \mathbf B}\right)$

But for any elementary matrix $\mathbf E$ and any square matrix $\mathbf D$ of order $n$:

$\det \left({\mathbf E \mathbf D}\right) = \det \left({\mathbf E}\right) \cdot \det\left({\mathbf D}\right)$

Therefore:

$\det \left({\mathbf A \mathbf B}\right) = \det \left({\mathbf E^{k}}\right) \det\left({\mathbf E^{k-1}}\right) \cdots \det\left({\mathbf E^{1}}\right) \det\left({\mathbf B}\right)$
$\det\left({\mathbf A \mathbf B}\right) = \det\left({\mathbf E^{k} \mathbf E^{k-1} \cdots \mathbf E^{1}}\right) \det \left({\mathbf B}\right)$
$\det \left({\mathbf A \mathbf B}\right) = \det\left({\mathbf A}\right) \cdot \det\left({\mathbf B}\right)$

as required.
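The key lemma of case $(2)$, that multiplying by a single elementary matrix multiplies the determinant accordingly, can be sketched numerically; $\mathbf E$ and $\mathbf D$ below are hypothetical examples:

```python
import numpy as np

# E is the elementary matrix that adds 3 * row 0 to row 1
# of a 3x3 matrix; it is unit lower triangular, so det(E) = 1.
E = np.eye(3)
E[1, 0] = 3.0

D = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 4.0],
              [5.0, 0.0, 1.0]])

# det(E D) = det(E) * det(D), as used in the chain above.
assert np.isclose(np.linalg.det(E @ D),
                  np.linalg.det(E) * np.linalg.det(D))
```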

$\blacksquare$