Determinant of Matrix Product


Theorem

Let $\mathbf A = \sqbrk a_n$ and $\mathbf B = \sqbrk b_n$ be square matrices of order $n$.

Let $\map \det {\mathbf A}$ be the determinant of $\mathbf A$.

Let $\mathbf A \mathbf B$ be the (conventional) matrix product of $\mathbf A$ and $\mathbf B$.


Then:

$\map \det {\mathbf A \mathbf B} = \map \det {\mathbf A} \map \det {\mathbf B}$


That is, the determinant of the product is equal to the product of the determinants.
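
As a quick numerical illustration (a check, not part of any of the proofs below), take:

$\mathbf A = \begin {pmatrix} 1 & 2 \\ 3 & 4 \end {pmatrix}, \qquad \mathbf B = \begin {pmatrix} 5 & 6 \\ 7 & 8 \end {pmatrix}$

Then $\map \det {\mathbf A} = -2$ and $\map \det {\mathbf B} = -2$, while:

$\mathbf A \mathbf B = \begin {pmatrix} 19 & 22 \\ 43 & 50 \end {pmatrix}, \qquad \map \det {\mathbf A \mathbf B} = 19 \cdot 50 - 22 \cdot 43 = 4 = \paren {-2} \paren {-2}$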


General Case

Let $\mathbf A_1, \mathbf A_2, \ldots, \mathbf A_m$ be square matrices of order $n$, where $m > 1$.


Then:

$\map \det {\mathbf A_1 \mathbf A_2 \cdots \mathbf A_m} = \map \det {\mathbf A_1} \map \det {\mathbf A_2} \cdots \map \det {\mathbf A_m}$
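
This general case follows from the main theorem by induction on $m$; for instance, for three matrices:

$\map \det {\mathbf A_1 \mathbf A_2 \mathbf A_3} = \map \det {\mathbf A_1} \map \det {\mathbf A_2 \mathbf A_3} = \map \det {\mathbf A_1} \map \det {\mathbf A_2} \map \det {\mathbf A_3}$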


Proof 1

This proof assumes that $\mathbf A$ and $\mathbf B$ are $n \times n$ matrices over a commutative ring with unity $\left({R, +, \circ}\right)$.


Let $\mathbf C = \sqbrk c_n = \mathbf A \mathbf B$.

From Square Matrix is Row Equivalent to Triangular Matrix, it follows that $\mathbf A$ can be converted into an upper triangular matrix $\mathbf A'$ by a finite sequence of elementary row operations $\hat o_1, \ldots, \hat o_{m'}$.

Let $\mathbf C'$ denote the matrix that results from using $\hat o_1, \ldots, \hat o_{m'}$ on $\mathbf C$.

From Elementary Row Operations Commute with Matrix Multiplication, it follows that $\mathbf C' = \mathbf A' \mathbf B$.

Effect of Sequence of Elementary Row Operations on Determinant shows that there exists $\alpha \in R$ such that:

$\alpha \det \left({\mathbf A'}\right) = \det \left({\mathbf A}\right)$
$\alpha \det \left({\mathbf C'}\right) = \det \left({\mathbf C}\right)$
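
To make the factor $\alpha$ concrete (an illustration inferred from the convention above, in which $\alpha$ multiplies the determinant of the *transformed* matrix): a row exchange contributes $-1$, multiplying a row by a unit $\lambda$ contributes $\lambda^{-1}$, and adding a multiple of one row to another contributes $1$, so $\alpha$ is the product of these contributions. For example, if $\hat o_1$ exchanges two rows and $\hat o_2$ multiplies a row by a unit $\lambda$, then $\alpha = -\lambda^{-1}$.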


Let $\mathbf B^\intercal$ be the transpose of $\mathbf B$.

From Transpose of Matrix Product, it follows that:

$\left({\mathbf C'}\right)^\intercal = \left({\mathbf A' \mathbf B}\right)^\intercal = \mathbf B^\intercal \left({\mathbf A'}\right)^\intercal$

From Square Matrix is Row Equivalent to Triangular Matrix, it follows that $\mathbf B^\intercal$ can be converted into a lower triangular matrix $\left({\mathbf B^\intercal}\right)'$ by a finite sequence of elementary row operations $\hat p_1, \ldots, \hat p_{m''}$.

Let $\mathbf C''$ denote the matrix that results from using $\hat p_1, \ldots, \hat p_{m''}$ on $\left({\mathbf C'}\right)^\intercal$.

From Elementary Row Operations Commute with Matrix Multiplication, it follows that:

$\mathbf C'' = \left({\mathbf B^\intercal}\right)' \left({\mathbf A'}\right)^\intercal$

Effect of Sequence of Elementary Row Operations on Determinant shows that there exists $\beta \in R$ such that:

$\beta \det \left({\left({\mathbf B^\intercal}\right)'}\right) = \det \left({\mathbf B^\intercal}\right)$
$\beta \det \left({\mathbf C''}\right) = \det \left({ \left({\mathbf C'}\right)^\intercal }\right)$


From Transpose of Upper Triangular Matrix is Lower Triangular, it follows that $\left({\mathbf A'}\right)^\intercal$ is a lower triangular matrix.

Then Product of Triangular Matrices shows that $\left({\mathbf B^\intercal}\right)' \left({\mathbf A'}\right)^\intercal$ is a lower triangular matrix whose diagonal elements are the products of the diagonal elements of $\left({\mathbf B^\intercal}\right)'$ and $\left({\mathbf A'}\right)^\intercal$.

From Determinant of Triangular Matrix, we have that $\det \left({\left({\mathbf A'}\right)^\intercal}\right)$, $\det \left({\left({\mathbf B^\intercal}\right)' }\right)$, and $\det \left({\left({\mathbf B^\intercal}\right)' \left({\mathbf A'}\right)^\intercal }\right)$ are equal to the product of their diagonal elements.

Combining these results shows that:

$\det \left({\left({\mathbf B^\intercal}\right)' \left({\mathbf A'}\right)^\intercal}\right) = \det \left({\left({\mathbf B^\intercal}\right)'}\right) \det \left({\left({\mathbf A'}\right)^\intercal }\right)$
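
Spelled out: if the diagonal elements of $\left({\mathbf B^\intercal}\right)'$ are $q_1, \ldots, q_n$ and those of $\left({\mathbf A'}\right)^\intercal$ are $r_1, \ldots, r_n$, then:

$\displaystyle \det \left({\left({\mathbf B^\intercal}\right)' \left({\mathbf A'}\right)^\intercal}\right) = \prod_{i \mathop = 1}^n q_i r_i = \paren {\prod_{i \mathop = 1}^n q_i} \paren {\prod_{i \mathop = 1}^n r_i} = \det \left({\left({\mathbf B^\intercal}\right)'}\right) \det \left({\left({\mathbf A'}\right)^\intercal}\right)$

where the middle equality uses the commutativity of $R$.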


Then:

$\begin{aligned}
\det \left({\mathbf C}\right) &= \alpha \det \left({\mathbf C'}\right) \\
&= \alpha \det \left({\left({\mathbf C'}\right)^\intercal}\right) && \text{Determinant of Transpose} \\
&= \alpha \beta \det \left({\mathbf C''}\right) \\
&= \alpha \beta \det \left({\left({\mathbf B^\intercal}\right)' \left({\mathbf A'}\right)^\intercal}\right) \\
&= \alpha \beta \det \left({\left({\mathbf B^\intercal}\right)'}\right) \det \left({\left({\mathbf A'}\right)^\intercal}\right) \\
&= \alpha \det \left({\left({\mathbf A'}\right)^\intercal}\right) \beta \det \left({\left({\mathbf B^\intercal}\right)'}\right) && \text{Commutativity of Ring Product in $R$} \\
&= \alpha \det \left({\mathbf A'}\right) \det \left({\mathbf B^\intercal}\right) && \text{Determinant of Transpose} \\
&= \det \left({\mathbf A}\right) \det \left({\mathbf B}\right) && \text{Determinant of Transpose}
\end{aligned}$

$\blacksquare$


Proof 2

Consider two cases:

$(1): \quad \mathbf A$ is not invertible.
$(2): \quad \mathbf A$ is invertible.


Proof of case $1$

Assume $\mathbf A$ is not invertible.

Then:

$\map \det {\mathbf A} = 0$

Also, if $\mathbf A$ is not invertible, then neither is $\mathbf A \mathbf B$.

Indeed, if $\mathbf A \mathbf B$ had an inverse $\mathbf C$, then $\mathbf A \mathbf B \mathbf C = \mathbf I$, whereby $\mathbf B \mathbf C$ would be a right inverse of $\mathbf A$.

It would then follow by Left or Right Inverse of Matrix is Inverse that $\mathbf B \mathbf C$ is the inverse of $\mathbf A$, contradicting the assumption that $\mathbf A$ is not invertible.

It follows that:

$\map \det {\mathbf A \mathbf B} = 0$


Thus:

$\map \det {\mathbf A \mathbf B} = 0 = 0 \cdot \map \det {\mathbf B} = \map \det {\mathbf A} \cdot \map \det {\mathbf B}$
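
As a concrete instance of this case (a check, not part of the proof), take $\mathbf A = \begin {pmatrix} 1 & 2 \\ 2 & 4 \end {pmatrix}$, so that $\map \det {\mathbf A} = 0$, and $\mathbf B = \begin {pmatrix} 5 & 6 \\ 7 & 8 \end {pmatrix}$. Then:

$\mathbf A \mathbf B = \begin {pmatrix} 19 & 22 \\ 38 & 44 \end {pmatrix}, \qquad \map \det {\mathbf A \mathbf B} = 19 \cdot 44 - 22 \cdot 38 = 0$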

$\Box$


Proof of case $2$

Assume $\mathbf A$ is invertible.

Then $\mathbf A$ is a product of elementary row matrices $\mathbf E_i$.

Let $\mathbf A = \mathbf E_k \mathbf E_{k - 1} \cdots \mathbf E_1$.
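
For example (one illustrative decomposition; such factorizations are not unique):

$\begin {pmatrix} 2 & 1 \\ 0 & 1 \end {pmatrix} = \begin {pmatrix} 1 & 1 \\ 0 & 1 \end {pmatrix} \begin {pmatrix} 2 & 0 \\ 0 & 1 \end {pmatrix}$

where the first factor is the elementary row matrix adding row $2$ to row $1$, and the second is the one multiplying row $1$ by $2$.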

So:

$\map \det {\mathbf A \mathbf B} = \map \det {\mathbf E_k \mathbf E_{k - 1} \cdots \mathbf E_1 \mathbf B}$


It remains to be shown that, writing $\mathbf E = \mathbf E_k \mathbf E_{k - 1} \cdots \mathbf E_1$, for any square matrix $\mathbf D$ of order $n$:

$\map \det {\mathbf E \mathbf D} = \map \det {\mathbf E} \cdot \map \det {\mathbf D}$


Let $e_i$ be the elementary row operation such that $\map {e_i} {\mathbf I} = \mathbf E_i$ for all $i \in \closedint 1 k$. Then, using Elementary Row Operations as Matrix Multiplications and Effect of Sequence of Elementary Row Operations on Determinant, we obtain:

$\map \det {\mathbf E \mathbf D} = \map \det {\mathbf E_k \mathbf E_{k - 1} \cdots \mathbf E_1 \mathbf D} = \map \det {\map {e_k e_{k - 1} \cdots e_1} {\mathbf D} } = \alpha \map \det {\mathbf D}$


Similarly, using Elementary Row Operations as Matrix Multiplications, Effect of Sequence of Elementary Row Operations on Determinant and Unit Matrix is Unity of Ring of Square Matrices:

$\map \det {\mathbf E} = \map \det {\mathbf E_k \mathbf E_{k - 1} \cdots \mathbf E_1 \mathbf I} = \map \det {\map {e_k e_{k - 1} \cdots e_1} {\mathbf I} } = \alpha \map \det {\mathbf I}$


From Determinant of Unit Matrix:

$\map \det {\mathbf E} = \alpha$


And so $\map \det {\mathbf E \mathbf D} = \map \det {\mathbf E} \cdot \map \det {\mathbf D}$

$\Box$


Therefore, applying this result with $\mathbf D = \mathbf B$ and recalling that $\mathbf E = \mathbf E_k \mathbf E_{k - 1} \cdots \mathbf E_1 = \mathbf A$:

$\map \det {\mathbf A \mathbf B} = \map \det {\mathbf A} \map \det {\mathbf B}$

as required.

$\blacksquare$


Proof 3

The Cauchy-Binet Formula gives:

$\displaystyle \det \left({\mathbf A \mathbf B}\right) = \sum_{1 \mathop \le j_1 \mathop < j_2 \mathop < \cdots \mathop < j_m \mathop \le n} \det \left({\mathbf A_{j_1 j_2 \ldots j_m}}\right) \det \left({\mathbf B_{j_1 j_2 \ldots j_m}}\right)$

where:

$\mathbf A$ is an $m \times n$ matrix
$\mathbf B$ is an $n \times m$ matrix.
For $1 \le j_1, j_2, \ldots, j_m \le n$:
$\mathbf A_{j_1 j_2 \ldots j_m}$ denotes the $m \times m$ matrix consisting of columns $j_1, j_2, \ldots, j_m$ of $\mathbf A$.
$\mathbf B_{j_1 j_2 \ldots j_m}$ denotes the $m \times m$ matrix consisting of rows $j_1, j_2, \ldots, j_m$ of $\mathbf B$.
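
As an illustration of the formula with $m < n$ (not needed for this proof), take $m = 1$, $n = 2$, $\mathbf A = \begin {pmatrix} a_1 & a_2 \end {pmatrix}$ and $\mathbf B = \begin {pmatrix} b_1 \\ b_2 \end {pmatrix}$. Then:

$\det \left({\mathbf A \mathbf B}\right) = a_1 b_1 + a_2 b_2 = \det \left({a_1}\right) \det \left({b_1}\right) + \det \left({a_2}\right) \det \left({b_2}\right)$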

When $m = n$, the only choice of indices $j_1, j_2, \ldots, j_m$ satisfying $1 \le j_1 < j_2 < \cdots < j_m \le n$ is $j_i = i$ for each $i$.

Hence the result.

$\blacksquare$


Proof 4

Recall that $\det$ can be interpreted as an alternating multilinear map with respect to the columns.

This property is sufficient to prove the theorem as follows.

Let $\mathbf A, \mathbf B$ be two $n \times n$ matrices (with coefficients in a field $\mathbb K$ such as $\mathbb R$ or $\mathbb C$).

Let us denote the vectors of the canonical basis of $\mathbb K^n$ by $\mathbf e_1, \ldots, \mathbf e_n$ (where $\mathbf e_i$ is the column with $1$ in the $i$th row and zeros elsewhere).

Now we can write the matrix $\mathbf B$ as a column block matrix:

$\mathbf B = \begin {pmatrix} \displaystyle \sum_{s_1 \mathop = 1}^n B_{s_1, 1} \mathbf e_{s_1} & \cdots & \displaystyle \sum_{s_n \mathop = 1}^n B_{s_n, n} \mathbf e_{s_n} \end {pmatrix}$

We can likewise rewrite the product $\mathbf A \mathbf B$ as a column block matrix:

$\mathbf A \mathbf B = \begin {pmatrix} \displaystyle \sum_{s_1 \mathop = 1}^n B_{s_1, 1} \mathbf A \mathbf e_{s_1} & \cdots & \displaystyle \sum_{s_n \mathop = 1}^n B_{s_n, n} \mathbf A \mathbf e_{s_n} \end {pmatrix}$

Using linearity with respect to each column, we get:

$\map \det {\mathbf A \mathbf B} = \displaystyle \sum_{1 \mathop \leqslant s_1, \ldots, s_n \mathop \leqslant n} \paren {\prod_{i \mathop = 1}^n B_{s_i, i} } \det \begin {pmatrix} \mathbf A \mathbf e_{s_1} & \cdots & \mathbf A \mathbf e_{s_n} \end {pmatrix}$

Now notice that $\det \begin {pmatrix} \mathbf A \mathbf e_{s_1} & \cdots & \mathbf A \mathbf e_{s_n} \end {pmatrix}$ is zero as soon as two of its columns are equal, since $\det$ is an alternating map: if $s_k = s_\ell$ for some $k \ne \ell$, then $\mathbf A \mathbf e_{s_k} = \mathbf A \mathbf e_{s_\ell}$, and the determinant vanishes.

Therefore the only nonzero summands are those for which $s_1, \ldots, s_n$ are all distinct.

In other words, the "selector" $\paren {s_1, \ldots, s_n}$ represents some permutation of the numbers $1, \ldots, n$.
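
For $n = 2$, for instance, the expansion above has $2^2 = 4$ summands; the two with $s_1 = s_2$ vanish, leaving:

$\map \det {\mathbf A \mathbf B} = B_{1, 1} B_{2, 2} \det \begin {pmatrix} \mathbf A \mathbf e_1 & \mathbf A \mathbf e_2 \end {pmatrix} + B_{2, 1} B_{1, 2} \det \begin {pmatrix} \mathbf A \mathbf e_2 & \mathbf A \mathbf e_1 \end {pmatrix}$

one summand for each of the two permutations of $1, 2$.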


As a result, the determinant of the product can now be expressed as a sum of precisely $n!$ terms using permutations:

$\map \det {\mathbf A \mathbf B} = \displaystyle \sum_{\sigma \in S_n} \paren {\prod_{i \mathop = 1}^n B_{\map \sigma i, i} } \det \begin {pmatrix} \mathbf A \mathbf e_{\map \sigma 1} & \cdots & \mathbf A \mathbf e_{\map \sigma n} \end {pmatrix}$

where $S_n$ denotes the set of the permutations of numbers $1, \ldots, n$.

However, the determinant on the right hand side of the above equality is the determinant of $\mathbf A$ with its columns permuted by $\sigma$.

Whenever we transpose two columns, the determinant is modified by a factor $-1$.

Indeed, let us apply some transposition $\tau_{i j}$ to a column block matrix $\begin {pmatrix} \mathbf C_1 & \cdots & \mathbf C_n \end {pmatrix}$.

Consider the matrix in which both the $i$th and the $j$th columns are replaced by $\mathbf C_i + \mathbf C_j$: its determinant is zero, as it has two equal columns, and expanding it by multilinearity gives:

$\begin{aligned}
0 &= \det \begin {pmatrix} \mathbf C_1 & \cdots & \mathbf C_i + \mathbf C_j & \cdots & \mathbf C_j + \mathbf C_i & \cdots & \mathbf C_n \end {pmatrix} \\
&= \det \begin {pmatrix} \mathbf C_1 & \cdots & \mathbf C_i & \cdots & \mathbf C_j & \cdots & \mathbf C_n \end {pmatrix} + \det \begin {pmatrix} \mathbf C_1 & \cdots & \mathbf C_i & \cdots & \mathbf C_i & \cdots & \mathbf C_n \end {pmatrix} \\
&\quad + \det \begin {pmatrix} \mathbf C_1 & \cdots & \mathbf C_j & \cdots & \mathbf C_j & \cdots & \mathbf C_n \end {pmatrix} + \det \begin {pmatrix} \mathbf C_1 & \cdots & \mathbf C_j & \cdots & \mathbf C_i & \cdots & \mathbf C_n \end {pmatrix} \\
&= \det \begin {pmatrix} \mathbf C_1 & \cdots & \mathbf C_i & \cdots & \mathbf C_j & \cdots & \mathbf C_n \end {pmatrix} + \det \begin {pmatrix} \mathbf C_1 & \cdots & \mathbf C_j & \cdots & \mathbf C_i & \cdots & \mathbf C_n \end {pmatrix}
\end{aligned}$


Hence transposing two columns reverses the sign of the determinant:

$\det \begin {pmatrix} \mathbf C_1 & \cdots & \mathbf C_n \end {pmatrix} = -\det \begin {pmatrix} \mathbf C_{\map {\tau_{i j} } 1} & \cdots & \mathbf C_{\map {\tau_{i j} } n} \end {pmatrix}$

Since every permutation $\sigma \in S_n$ can be written as a product of transpositions, that is:

$\sigma = \tau_m \cdots \tau_1$

for some transpositions $\tau_1, \ldots, \tau_m$, it follows that:

$\begin{aligned}
\det \begin {pmatrix} \mathbf C_1 & \cdots & \mathbf C_n \end {pmatrix} &= -\det \begin {pmatrix} \mathbf C_{\map {\tau_1} 1} & \cdots & \mathbf C_{\map {\tau_1} n} \end {pmatrix} \\
&= \det \begin {pmatrix} \mathbf C_{\map {\tau_2 \tau_1} 1} & \cdots & \mathbf C_{\map {\tau_2 \tau_1} n} \end {pmatrix} \\
&\phantom{=} \ \vdots \\
&= \paren {-1}^m \det \begin {pmatrix} \mathbf C_{\map \sigma 1} & \cdots & \mathbf C_{\map \sigma n} \end {pmatrix}
\end{aligned}$

The number $\paren {-1}^m$ is the signature of the permutation $\sigma$ (see the article on the signature of a permutation), denoted $\map \sgn \sigma$.


Applying the appropriate transpositions of columns to $\mathbf A$, we thus obtain, for any permutation $\sigma$, the equality:

$\det \begin {pmatrix} \mathbf A \mathbf e_{\map \sigma 1} & \cdots & \mathbf A \mathbf e_{\map \sigma n} \end {pmatrix} = \map \sgn \sigma \det \begin {pmatrix} \mathbf A \mathbf e_1 & \cdots & \mathbf A \mathbf e_n \end {pmatrix} = \map \sgn \sigma \map \det {\mathbf A}$

Since $\map \det {\mathbf A}$ is a constant quantity, we can take this factor out of the sum, and write:

$\displaystyle \map \det {\mathbf A \mathbf B} = \map \det {\mathbf A} \sum_{\sigma \mathop \in S_n} \map \sgn \sigma \prod_{i \mathop = 1}^n B_{\map \sigma i, i}$
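
(For $n = 2$, this sum reads $B_{1, 1} B_{2, 2} - B_{2, 1} B_{1, 2}$, the familiar expression for the determinant of a $2 \times 2$ matrix.)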

But the above sum is exactly the definition of $\map \det {\mathbf B}$ using the Leibniz formula, and so:

$\map \det {\mathbf A \mathbf B} = \map \det {\mathbf A} \map \det {\mathbf B}$

Hence the result.

$\blacksquare$

