Determinant of Matrix Product/Proof 4

Theorem

Let $\mathbf A = \sqbrk a_n$ and $\mathbf B = \sqbrk b_n$ be square matrices of order $n$.

Let $\map \det {\mathbf A}$ be the determinant of $\mathbf A$.

Let $\mathbf A \mathbf B$ be the (conventional) matrix product of $\mathbf A$ and $\mathbf B$.


Then:

$\map \det {\mathbf A \mathbf B} = \map \det {\mathbf A} \map \det {\mathbf B}$


That is, the determinant of the product is equal to the product of the determinants.
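As a quick numerical illustration (not part of the statement), take $n = 2$:

$$ \mathbf A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \quad \mathbf B = \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix}, \quad \mathbf A \mathbf B = \begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix} $$

and indeed $\map \det {\mathbf A \mathbf B} = 19 \times 50 - 22 \times 43 = 4 = (-2) \times (-2) = \map \det {\mathbf A} \map \det {\mathbf B}$.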


Proof

Recall that $\det$ can be interpreted as an alternating multilinear map with respect to the columns. This property alone is sufficient to prove the theorem, as follows. Let $A,B$ be two $n\times n$ matrices with coefficients in a field $\mathbb K$ such as $\mathbb R$ or $\mathbb C$.
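Explicitly, the two properties used throughout are linearity in each column, $$\det\begin{pmatrix}C_1&\cdots&\lambda u+v&\cdots&C_n\end{pmatrix} = \lambda\det\begin{pmatrix}C_1&\cdots&u&\cdots&C_n\end{pmatrix} + \det\begin{pmatrix}C_1&\cdots&v&\cdots&C_n\end{pmatrix} $$ for any scalar $\lambda\in\mathbb K$ and any columns $u,v$, and the alternating property: $\det\begin{pmatrix}C_1&\cdots&C_n\end{pmatrix} = 0$ whenever $C_i = C_j$ for some $i\neq j$.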

Let us denote the vectors of the canonical basis of $\mathbb K^n$ by $\boldsymbol{e}_1,\ldots,\boldsymbol{e}_n$ (where $\boldsymbol{e}_i$ is the column with $1$ in the $i$th row and zeros elsewhere).
Now we can write the matrix $B$ as a column-block matrix: $$ B = \begin{pmatrix}\sum_{s_1=1}^n B_{s_1,1}\boldsymbol{e}_{s_1}&\cdots& \sum_{s_n=1}^n B_{s_n,n}\boldsymbol{e}_{s_n}\end{pmatrix} $$

Since the $j$th column of $AB$ is $A$ applied to the $j$th column of $B$, we can rewrite the product $AB$ as a column-block matrix: $$ AB = \begin{pmatrix}\sum_{s_1=1}^n B_{s_1,1}A\boldsymbol{e}_{s_1}&\cdots& \sum_{s_n=1}^n B_{s_n,n}A\boldsymbol{e}_{s_n}\end{pmatrix} $$

Using linearity with respect to each column, we get: $$ \det(AB) = \sum_{1\leqslant s_1,\ldots,s_n\leqslant n} \left(\prod_{i=1}^n B_{s_i,i}\right)\det\begin{pmatrix}A\boldsymbol{e}_{s_1}&\cdots& A\boldsymbol{e}_{s_n}\end{pmatrix} $$

Now notice that $\det\begin{pmatrix}A\boldsymbol{e}_{s_1}&\cdots& A\boldsymbol{e}_{s_n}\end{pmatrix}$ is zero as soon as two of the indices coincide: since $\det$ is an alternating map, if $s_k = s_\ell$ for some $k\neq \ell$, then $A\boldsymbol{e}_{s_k} = A\boldsymbol{e}_{s_\ell}$, so two columns are equal and $\det\begin{pmatrix}A\boldsymbol{e}_{s_1}&\cdots& A\boldsymbol{e}_{s_n}\end{pmatrix} = 0$. Therefore the only nonzero summands are those for which $s_1,\ldots, s_n$ are all distinct. In other words, the "selector" $s$ represents some permutation of the numbers $1,\ldots, n$.
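To illustrate, for $n = 2$ the expansion has $2^2 = 4$ summands, of which the two with $s_1 = s_2$ vanish: $$ \det(AB) = \sum_{s_1,s_2=1}^{2} B_{s_1,1}B_{s_2,2}\det\begin{pmatrix}A\boldsymbol{e}_{s_1}&A\boldsymbol{e}_{s_2}\end{pmatrix} = B_{1,1}B_{2,2}\det\begin{pmatrix}A\boldsymbol{e}_{1}&A\boldsymbol{e}_{2}\end{pmatrix} + B_{2,1}B_{1,2}\det\begin{pmatrix}A\boldsymbol{e}_{2}&A\boldsymbol{e}_{1}\end{pmatrix} $$ The surviving selectors $(s_1,s_2) = (1,2)$ and $(2,1)$ are exactly the two permutations of $\{1,2\}$.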

As a result, the determinant of the product can now be expressed as a sum of precisely $n!$ terms indexed by permutations: $$ \det(AB) = \sum_{\sigma \in S_n} \left(\prod_{i=1}^n B_{\sigma(i),i}\right)\det\begin{pmatrix}A\boldsymbol{e}_{\sigma(1)}&\cdots& A\boldsymbol{e}_{\sigma(n)}\end{pmatrix} $$ where $S_n$ denotes the set of permutations of the numbers $1,\ldots, n$.

The determinant on the right hand side is the determinant of $A$ with permuted columns, and whenever we transpose two columns, the determinant changes by a factor $-1$. Indeed, take a column-block matrix $\begin{pmatrix}C_1&\cdots&C_n\end{pmatrix}$ and replace both its $i$th and $j$th columns by $C_i+C_j$: the resulting matrix has two equal columns, so its determinant is zero, and expanding it by linearity gives $$\begin{split} 0 = \det\begin{pmatrix}C_1\cdots C_i+C_j\cdots C_j+C_i\cdots C_n\end{pmatrix} &= \det\begin{pmatrix}C_1\cdots C_i\cdots C_j\cdots C_n\end{pmatrix} + \det\begin{pmatrix}C_1\cdots C_i\cdots C_i\cdots C_n\end{pmatrix}\\ &+ \det\begin{pmatrix}C_1\cdots C_j\cdots C_j\cdots C_n\end{pmatrix} + \det\begin{pmatrix}C_1\cdots C_j\cdots C_i\cdots C_n\end{pmatrix}\\ &= \det\begin{pmatrix}C_1\cdots C_i\cdots C_j\cdots C_n\end{pmatrix} + \det\begin{pmatrix}C_1\cdots C_j\cdots C_i\cdots C_n\end{pmatrix} \end{split}$$ Hence transposing two columns reverses the sign of the determinant: for the transposition $\tau_{ij}$, $\det\begin{pmatrix}C_1\cdots C_n\end{pmatrix} = -\det\begin{pmatrix}C_{\tau_{ij}(1)}\cdots C_{\tau_{ij}(n)}\end{pmatrix}$.
Since every permutation $\sigma\in S_n$ can be written as a product of transpositions, say $\sigma = \tau_m\cdots\tau_1$ for some transpositions $\tau_1,\ldots,\tau_m$, applying the above identity $m$ times yields $ \det(C_1\cdots C_n) = -\det(C_{\tau_1(1)}\cdots C_{\tau_1(n)}) = \det(C_{\tau_2\tau_1(1)}\cdots C_{\tau_2\tau_1(n)}) = \cdots = (-1)^m\det(C_{\sigma(1)}\cdots C_{\sigma(n)}) $.
The number $(-1)^m$ does not depend on the chosen decomposition of $\sigma$ into transpositions; it is called the signature of the permutation $\sigma$ (see the article about the signature of permutations) and is denoted by $\mathrm{sgn}(\sigma)$.
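For instance, the cycle $\sigma = (1\,2\,3) \in S_3$ (mapping $1\mapsto 2$, $2\mapsto 3$, $3\mapsto 1$) decomposes as $$ \sigma = (1\,3)\,(1\,2), \qquad \mathrm{sgn}(\sigma) = (-1)^2 = +1 $$ a product of $m = 2$ transpositions, while a single transposition has signature $-1$.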

It remains to apply suitable transpositions of columns to $A$ to get, for any permutation $\sigma$, the equality: $$\det \begin{pmatrix}A\boldsymbol{e}_{\sigma(1)}&\cdots& A\boldsymbol{e}_{\sigma(n)}\end{pmatrix} = \mathrm{sgn}(\sigma)\det\begin{pmatrix}A\boldsymbol{e}_{1}&\cdots& A\boldsymbol{e}_{n}\end{pmatrix} = \mathrm{sgn}(\sigma)\det(A) $$ Since $\det(A)$ is a constant, we can pull this factor out of the sum and write: $$\det(AB) = \det(A)\sum_{\sigma\in S_n} \mathrm{sgn}(\sigma)\prod_{i=1}^n B_{\sigma(i),i} $$ But the remaining sum is exactly the definition of $\det(B)$ by the Leibniz formula, and so: $$\det(AB) = \det(A)\det(B) $$ as desired. $\square$
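
As a closing remark, the $n = 2$ illustration above confirms this: since $\det\begin{pmatrix}A\boldsymbol{e}_2&A\boldsymbol{e}_1\end{pmatrix} = -\det(A)$, the two surviving summands combine as $$ \det(AB) = B_{1,1}B_{2,2}\det(A) - B_{2,1}B_{1,2}\det(A) = \det(A)\left(B_{1,1}B_{2,2} - B_{1,2}B_{2,1}\right) = \det(A)\det(B) $$ in agreement with the general formula.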