Determinant of Block Diagonal Matrix

Theorem
Let $\mathbf A$ be a block diagonal matrix of order $n$.

Let $\mathbf A_1,\ldots,\mathbf{A}_k$ be the square matrices on the diagonal, i.e.:
 * $\displaystyle \mathbf A = \begin{bmatrix} \mathbf A_1 & 0 & \cdots & 0 \\ 0 & \mathbf A_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \mathbf A_k \end{bmatrix}$

Then the determinant $\det \left({\mathbf A}\right)$ of $\mathbf A$ satisfies:
 * $\displaystyle \det \left({\mathbf A}\right) = \prod_{i \mathop = 1}^k \det \left({\mathbf A_i}\right)$
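As a numerical sanity check (not part of the proof), the identity can be verified with NumPy on random blocks; the `block_diag` helper below is written out only to keep the snippet self-contained:

```python
import numpy as np

rng = np.random.default_rng(0)

def block_diag(*blocks):
    """Assemble square blocks into a block diagonal matrix (illustrative helper)."""
    n = sum(b.shape[0] for b in blocks)
    M = np.zeros((n, n))
    i = 0
    for b in blocks:
        k = b.shape[0]
        M[i:i + k, i:i + k] = b
        i += k
    return M

# Random diagonal blocks A_1, A_2, A_3 of different orders.
blocks = [rng.standard_normal((m, m)) for m in (2, 3, 4)]
A = block_diag(*blocks)

lhs = np.linalg.det(A)
rhs = np.prod([np.linalg.det(b) for b in blocks])
assert np.isclose(lhs, rhs)  # det(A) equals the product of the block determinants
```

The two sides agree up to floating-point error for any choice of square blocks.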

Proof
To prove this fact, we first establish two auxiliary propositions.

$\textbf{Claim 1}$

The determinant of the block-diagonal matrix of type $M = \begin{pmatrix} A & 0 \\ 0 & I \\ \end{pmatrix}$ or $M = \begin{pmatrix} I & 0 \\ 0 & A \\ \end{pmatrix}$ equals $\det(A)$.

$\textbf{Proof}$

For this we utilize mathematical induction.

$\textbf{Base case}$ - $k = 1$, so $I = I_1 = (1)$. Expanding the determinant along the column containing the $I$ block (the first or the last column, depending on the position of the block), every entry of that column is zero except a single $1$, so $\det(M) = 1 \cdot \det(A)$.

$\textbf{Induction step}$

By the same reasoning - all entries of the relevant column are zero except a single $1$ - we have $\det(M_k) = 1 \cdot \det(M_{k-1})$, where $M_k$ is the block matrix discussed above with an $I_k$ block.

Thus, by induction, $\det(M_k) = \det(M_{k-1}) = \ldots = \det(M_1) = \det(A). \square$
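Claim 1 can likewise be checked numerically for both block placements (again a hedged illustration, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))  # arbitrary square block
k = 4                            # size of the identity block

# M with A in the top-left corner, and M with A in the bottom-right corner.
M1 = np.block([[A, np.zeros((3, k))], [np.zeros((k, 3)), np.eye(k)]])
M2 = np.block([[np.eye(k), np.zeros((k, 3))], [np.zeros((3, k)), A]])

assert np.isclose(np.linalg.det(M1), np.linalg.det(A))
assert np.isclose(np.linalg.det(M2), np.linalg.det(A))
```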

$\textbf{Claim 2}$ (determinant of an upper-triangular block matrix)

$M = \begin{pmatrix} A & B \\ 0 & D \\ \end{pmatrix}$, $\det(M) = \det(A)\det(D)$

$\textbf{Proof}$

$M = \begin{pmatrix} A & B \\ 0 & D \\ \end{pmatrix} = \begin{pmatrix} I & 0 \\ 0 & D \\ \end{pmatrix} \begin{pmatrix} I & B \\ 0 & I \\ \end{pmatrix} \begin{pmatrix} A & 0 \\ 0 & I \\ \end{pmatrix}$. Using the multiplicativity of the determinant, $\det(AB) = \det(A)\det(B)$, together with Claim 1 and the fact that $\det\begin{pmatrix} I & B \\ 0 & I \\ \end{pmatrix} = 1$ (it is a triangular matrix with all ones on the diagonal), we get $\det(M) = \det(D) \cdot 1 \cdot \det(A) = \det(A)\det(D). \square$
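Both the three-factor decomposition and the resulting determinant identity can be verified numerically (an illustrative check under randomly chosen blocks, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(2)
a, d = 3, 2  # sizes of the A and D blocks
A = rng.standard_normal((a, a))
B = rng.standard_normal((a, d))
D = rng.standard_normal((d, d))

# Upper-triangular block matrix M = [[A, B], [0, D]].
M = np.block([[A, B], [np.zeros((d, a)), D]])

# The three factors used in the proof of Claim 2.
F1 = np.block([[np.eye(a), np.zeros((a, d))], [np.zeros((d, a)), D]])
F2 = np.block([[np.eye(a), B], [np.zeros((d, a)), np.eye(d)]])
F3 = np.block([[A, np.zeros((a, d))], [np.zeros((d, a)), np.eye(d)]])

assert np.allclose(M, F1 @ F2 @ F3)                                   # decomposition holds
assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(D))  # det(M) = det(A) det(D)
```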

From the above propositions, write $\mathbf A = \begin{pmatrix} \mathbf A_1 & 0 \\ 0 & \mathbf A' \\ \end{pmatrix}$, where $\mathbf A'$ is the block diagonal matrix with diagonal blocks $\mathbf A_2, \ldots, \mathbf A_k$. This is the special case of an upper-triangular block matrix with $B = 0$, so by Claim 2, $\det(\mathbf A) = \det(\mathbf A_1)\det(\mathbf A')$. Since $\mathbf A'$ is again block diagonal, the same argument gives $\det(\mathbf A') = \det(\mathbf A_2)\det(\mathbf A'')$, where $\mathbf A''$ has diagonal blocks $\mathbf A_3, \ldots, \mathbf A_k$, and so on. Following this recursion, $\displaystyle \det \left({\mathbf A}\right) = \prod_{i \mathop = 1}^k \det \left({\mathbf A_i}\right). \blacksquare$

Also see

 * Determinant of Diagonal Matrix, a special case of this theorem.