Determinant of Matrix Product

Theorem
Let $$\mathbf{A} = \left[{a}\right]_{n}$$ and $$\mathbf{B} = \left[{b}\right]_{n}$$ be square matrices of order $n$.

Let $$\det \left({\mathbf{A}}\right)$$ and $$\det \left({\mathbf{B}}\right)$$ denote the determinants of $$\mathbf{A}$$ and $$\mathbf{B}$$ respectively.

Let $$\mathbf{A} \mathbf{B}$$ be the (conventional) matrix product of $$\mathbf{A}$$ and $$\mathbf{B}$$.

Then $$\det \left({\mathbf{A} \mathbf{B}}\right) = \det \left({\mathbf{A}}\right) \det \left({\mathbf{B}}\right)$$.

That is, the determinant of the product is equal to the product of the determinants.
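As a quick numerical sanity check of the statement (not part of any proof below), both sides can be computed exactly for small integer matrices via the Leibniz formula; the helper names `perm_sign`, `det` and `matmul` are illustrative:

```python
from itertools import permutations
from math import prod

def perm_sign(p):
    """Sign of a permutation p (a tuple of 0-based indices): +1 or -1."""
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def det(m):
    """Determinant via the Leibniz formula (fine for small n)."""
    n = len(m)
    return sum(perm_sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(a, b):
    """Conventional product of two square matrices of the same order."""
    n = len(a)
    return [[sum(a[i][l] * b[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(det(matmul(A, B)), det(A) * det(B))  # 4 4
```

Here $$\det \left({\mathbf{A}}\right) = -2$$ and $$\det \left({\mathbf{B}}\right) = -2$$, so both sides equal $4$.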

Proof
Let $$\mathbf{C} = \left[{c}\right]_{n} = \mathbf{A} \mathbf{B}$$.

Thus $$\forall i, j \in \left[{1 \,.\,.\, n}\right]: c_{i j} = \sum_{l=1}^n a_{i l} b_{l j}$$

Then:

$$\det \left({\mathbf{A} \mathbf{B}}\right) = \sum_\lambda \operatorname{sgn} \left({\lambda}\right) \prod_{i=1}^n c_{i \lambda \left({i}\right)} = \sum_\lambda \operatorname{sgn} \left({\lambda}\right) \prod_{i=1}^n \left({\sum_{l=1}^n a_{i l} b_{l \lambda \left({i}\right)}}\right)$$

where $\lambda$ ranges over all permutations of $\left\{ {1, 2, \ldots, n}\right\}$.

This gets messy very quickly, so we can try another approach.
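For $n = 2$, however, the expansion can still be carried through by hand, and confirms the identity in that case:

$$\det \left({\mathbf{C}}\right) = c_{11} c_{22} - c_{12} c_{21} = \left({a_{11} b_{11} + a_{12} b_{21}}\right) \left({a_{21} b_{12} + a_{22} b_{22}}\right) - \left({a_{11} b_{12} + a_{12} b_{22}}\right) \left({a_{21} b_{11} + a_{22} b_{21}}\right)$$

$$= a_{11} a_{22} \left({b_{11} b_{22} - b_{12} b_{21}}\right) + a_{12} a_{21} \left({b_{12} b_{21} - b_{11} b_{22}}\right) = \left({a_{11} a_{22} - a_{12} a_{21}}\right) \left({b_{11} b_{22} - b_{12} b_{21}}\right) = \det \left({\mathbf{A}}\right) \det \left({\mathbf{B}}\right)$$

For larger $n$ the number of cross terms grows rapidly and no such clean factorisation presents itself directly.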

From Square Matrix Row Equivalent to Triangular Matrix, a square matrix can be transformed into a triangular matrix by elementary row operations which, from Effect of Elementary Row Operations on Determinant, either leave the value of its determinant unchanged or merely change its sign.

So suppose:
 * $$\mathbf{A}$$ is row equivalent to the upper triangular matrix $$\mathbf{T}_A$$;
 * $$\mathbf{B}$$ is row equivalent to the upper triangular matrix $$\mathbf{T}_B$$

such that:
 * $$\det \left({\mathbf{A}}\right) = \pm \det \left({\mathbf{T}_A}\right)$$;
 * $$\det \left({\mathbf{B}}\right) = \pm \det \left({\mathbf{T}_B}\right)$$.

From Determinant of a Triangular Matrix, $$\det \left({\mathbf{T}_A}\right)$$ and $$\det \left({\mathbf{T}_B}\right)$$ are each equal to the product of the diagonal elements of the respective matrix.

From Product of Triangular Matrices, it also follows that $$\mathbf{T}_A \mathbf{T}_B$$ is an upper triangular matrix whose diagonal elements are the products of the diagonal elements of $$\mathbf{T}_A$$ and $$\mathbf{T}_B$$.

Thus it follows that $$\det \left({\mathbf{T}_A \mathbf{T}_B}\right) = \det \left({\mathbf{T}_A}\right) \det \left({\mathbf{T}_B}\right)$$.

Hence $$\det \left({\mathbf{A} \mathbf{B}}\right) = \pm \det \left({\mathbf{A}}\right) \det \left({\mathbf{B}}\right)$$, where the sign depends on the row operations used in the reductions.
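The triangularisation used above can be sketched numerically. The following is a minimal sketch, assuming exact rational arithmetic and using only row swaps (which negate the determinant) and row additions (which preserve it); the name `to_upper_triangular` is illustrative:

```python
from fractions import Fraction
from math import prod

def to_upper_triangular(m):
    """Row-reduce m to upper triangular form.

    Returns (sign, T) such that det(m) = sign * (product of T's diagonal):
    each row swap flips the sign; row additions leave the determinant alone.
    """
    t = [[Fraction(x) for x in row] for row in m]
    n = len(t)
    sign = 1
    for col in range(n):
        # Pick a nonzero pivot at or below the diagonal.
        pivot = next((r for r in range(col, n) if t[r][col] != 0), None)
        if pivot is None:
            continue  # column is already zero below the diagonal
        if pivot != col:
            t[col], t[pivot] = t[pivot], t[col]
            sign = -sign
        # Clear the entries below the pivot by row additions.
        for r in range(col + 1, n):
            f = t[r][col] / t[col][col]
            t[r] = [x - f * y for x, y in zip(t[r], t[col])]
    return sign, t

A = [[0, 2, 1], [1, 1, 0], [2, 0, 3]]
s, T = to_upper_triangular(A)
print(s * prod(T[i][i] for i in range(3)))  # -8, the determinant of A
```

One row swap occurs here, so the sign factor is $-1$ and the diagonal product of the triangular matrix is $8$.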

Alternative Proof
Consider two cases:

1) $$\mathbf{A}$$ is not invertible.

2) $$\mathbf{A}$$ is invertible.

Proof of case 1:

Assume $$\mathbf{A}$$ is not invertible.

Then $$ \det \left({\mathbf{A}}\right) = 0$$.

Also, if $$\mathbf{A}$$ is not invertible then neither is $$\mathbf{A} \mathbf{B}$$: if $$\mathbf{A} \mathbf{B}$$ had an inverse $$\mathbf{C}$$, then $$\mathbf{A} \left({\mathbf{B} \mathbf{C}}\right) = \left({\mathbf{A} \mathbf{B}}\right) \mathbf{C} = \mathbf{I}$$, so $$\mathbf{B} \mathbf{C}$$ would be a right inverse of $$\mathbf{A}$$, making $$\mathbf{A}$$ invertible. Hence:

$$\det \left({\mathbf{A} \mathbf{B}}\right) = 0$$

Thus:

$$\det \left({\mathbf{A} \mathbf{B}}\right) = 0 = 0 \cdot \det \left({\mathbf{B}}\right) = \det \left({\mathbf{A}}\right) \det \left({\mathbf{B}}\right)$$
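Case 1 can be illustrated numerically; in this sketch the second row of $$\mathbf{A}$$ is twice the first, so $$\mathbf{A}$$ is not invertible (helper names are illustrative):

```python
from itertools import permutations
from math import prod

def perm_sign(p):
    """Sign of a permutation p (a tuple of 0-based indices)."""
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def det(m):
    """Determinant via the Leibniz formula."""
    n = len(m)
    return sum(perm_sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][l] * b[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2], [2, 4]]  # rank 1: second row is 2 * first row, so det(A) = 0
B = [[3, 1], [4, 1]]
print(det(A), det(matmul(A, B)))  # 0 0
```

Both determinants vanish, as the case analysis predicts.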

Proof of case 2:

Assume $$\mathbf{A}$$ is invertible; then $$\mathbf{A}$$ can be expressed as a product of elementary matrices:

$$\mathbf{A} = \mathbf{E}_k \mathbf{E}_{k-1} \cdots \mathbf{E}_1$$

So:

$$\det \left({\mathbf{A} \mathbf{B}}\right) = \det \left({\mathbf{E}_k \mathbf{E}_{k-1} \cdots \mathbf{E}_1 \mathbf{B}}\right)$$

But for any elementary matrix $$\mathbf{E}$$ and any square matrix $$\mathbf{D}$$ of order $n$:
 * $$\det \left({\mathbf{E} \mathbf{D}}\right) = \det \left({\mathbf{E}}\right) \det \left({\mathbf{D}}\right)$$

by Effect of Elementary Row Operations on Determinant.

Therefore:

$$\det \left({\mathbf{A} \mathbf{B}}\right) = \det \left({\mathbf{E}_k}\right) \det \left({\mathbf{E}_{k-1}}\right) \cdots \det \left({\mathbf{E}_1}\right) \det \left({\mathbf{B}}\right)$$

Applying the same fact in reverse:

$$\det \left({\mathbf{E}_k}\right) \det \left({\mathbf{E}_{k-1}}\right) \cdots \det \left({\mathbf{E}_1}\right) = \det \left({\mathbf{E}_k \mathbf{E}_{k-1} \cdots \mathbf{E}_1}\right) = \det \left({\mathbf{A}}\right)$$

Hence:

$$\det \left({\mathbf{A} \mathbf{B}}\right) = \det \left({\mathbf{A}}\right) \det \left({\mathbf{B}}\right)$$

as required.
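The fact driving this case, that $$\det \left({\mathbf{E} \mathbf{D}}\right) = \det \left({\mathbf{E}}\right) \det \left({\mathbf{D}}\right)$$ for elementary $$\mathbf{E}$$, can be checked directly for each of the three kinds of elementary matrix; the particular $3 \times 3$ matrices below are illustrative only:

```python
from itertools import permutations
from math import prod

def perm_sign(p):
    """Sign of a permutation p (a tuple of 0-based indices)."""
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def det(m):
    """Determinant via the Leibniz formula."""
    n = len(m)
    return sum(perm_sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][l] * b[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

# One elementary matrix of each kind, order 3:
E_swap  = [[0, 1, 0], [1, 0, 0], [0, 0, 1]]   # swap rows 1 and 2: det = -1
E_scale = [[1, 0, 0], [0, 5, 0], [0, 0, 1]]   # multiply row 2 by 5: det = 5
E_add   = [[1, 0, 0], [2, 1, 0], [0, 0, 1]]   # add 2 * row 1 to row 2: det = 1

D = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]

for E in (E_swap, E_scale, E_add):
    print(det(matmul(E, D)) == det(E) * det(D))  # True for each
```

Left-multiplying by each $$\mathbf{E}$$ performs the corresponding row operation on $$\mathbf{D}$$, and the determinant multiplies accordingly.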