Only 1 by 1 Matrices Generally Commute Under Multiplication


Theorem

Let $R$ be a commutative ring.

Let $\mathcal M_{m \times n}$ be the set of all $m \times n$ matrices over $R$.

Then under conventional matrix multiplication, $\mathbf {AB} = \mathbf {BA}$ for all $\mathbf A \in \mathcal M_{m \times n}, \; \mathbf B \in \mathcal M_{n \times p}$ if and only if $p = m = n = 1$.

Corollary

Conventional matrix multiplication is not in general commutative, whatever the dimensions of the matrices, unless both are $1 \times 1$ matrices, in which case the matrix product reduces to the ring product.

Proof

Consider $p \ne m$. Then $\mathbf {AB}$ is defined, but $\mathbf {BA}$ is not, as the number of columns $p$ of $\mathbf B$ does not match the number of rows $m$ of $\mathbf A$. Hence:

$p \ne m \implies \mathbf {AB} \ne \mathbf {BA}$

Now consider $p = m$ and $m \ne n$. Then $\mathbf {AB}$ is an $m \times m$ matrix, while $\mathbf {BA}$ is an $n \times n$ matrix, so the two products have different orders and cannot be equal. Combining this with the previous case:

$p \ne m \lor m \ne n \implies \mathbf {AB} \ne \mathbf {BA}$
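The dimension argument above can be checked directly. A minimal sketch in Python, where `mat_mul` is a hypothetical helper (not part of this article) multiplying matrices given as lists of rows:

```python
# mat_mul is a hypothetical helper: multiply a (m x n) by b (n x p),
# raising an error when the inner dimensions do not match.
def mat_mul(a, b):
    if len(a[0]) != len(b):
        raise ValueError("inner dimensions do not match")
    return [[sum(a[i][r] * b[r][j] for r in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

A = [[1, 2, 3],
     [4, 5, 6]]          # m x n = 2 x 3
B = [[1], [0], [1]]      # n x p = 3 x 1, so p != m
print(mat_mul(A, B))     # AB is defined: a 2 x 1 matrix
try:
    mat_mul(B, A)        # BA requires p = m, but 1 != 2
except ValueError:
    print("BA is undefined")
```

Here $\mathbf {AB}$ exists while $\mathbf {BA}$ does not, matching the case $p \ne m$.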

Now consider $p = m$, $m = n$ and $n \ne 1$. It is to be shown that there exist $\mathbf A$ and $\mathbf B$ such that $\mathbf {AB} \ne \mathbf {BA}$. This will be proved by mathematical induction on $n$.

Basis for the Induction

From Matrix Multiplication is Not Commutative, there exist $2 \times 2$ matrices that do not commute under matrix multiplication, thus proving the result for $n = 2$.
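The standard counterexample can be verified by direct computation; a quick Python check over the integers (a sample commutative ring), using the matrix units $E_{12}$ and $E_{21}$:

```python
# The classic non-commuting pair of 2 x 2 integer matrices.
A = [[0, 1],
     [0, 0]]
B = [[0, 0],
     [1, 0]]
AB = [[sum(A[i][r] * B[r][j] for r in range(2)) for j in range(2)] for i in range(2)]
BA = [[sum(B[i][r] * A[r][j] for r in range(2)) for j in range(2)] for i in range(2)]
assert AB == [[1, 0], [0, 0]]
assert BA == [[0, 0], [0, 1]]
assert AB != BA
```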

Induction Hypothesis

Assume that there exist two $n \times n$ matrices $\mathbf A$ and $\mathbf B$ such that $\mathbf {AB} \ne \mathbf {BA}$.

Induction Step

For any square matrix of order $n$, call it $\mathbf D$, let $\mathbf {D'}$ be a square matrix of order $n+1$ defined as

$D'_{ij} = \begin{cases} D_{ij} & :i \lt n+1 \land j \lt n+1 \\ 0 & :i = n+1 \lor j = n+1 \end{cases}$

That is, $\mathbf {D'}$ is just $\mathbf D$ with a zero row and a zero column appended at the ends, so that $\mathbf D$ is a submatrix of $\mathbf {D'}$. Then:

$\left(A'B'\right)_{ij} = \begin{cases} \displaystyle \sum_{r \mathop = 1}^{n+1} A'_{ir} B'_{rj} & :i \lt n+1 \land j \lt n+1 \\ 0 & :i = n+1 \lor j = n+1 \end{cases}$

But for $i \lt n+1$ and $j \lt n+1$, since $A'_{i \left({n+1}\right)} = 0$, and $A'_{ir} = A_{ir}$ and $B'_{rj} = B_{rj}$ for $r \le n$:

$\displaystyle \sum_{r \mathop = 1}^{n+1} A'_{ir} B'_{rj} = A'_{i \left({n+1}\right)} B'_{\left({n+1}\right) j} + \sum_{r \mathop = 1}^n A'_{ir} B'_{rj} = \sum_{r \mathop = 1}^n A_{ir} B_{rj}$

and so

$\mathbf {A'B'} = \left({\mathbf {AB}}\right)'$ and similarly $\mathbf {B'A'} = \left({\mathbf {BA}}\right)'$. Writing $\mathbf M \left({n+1; n+1}\right)$ for the submatrix of $\mathbf M$ obtained by deleting row $n+1$ and column $n+1$:

$\mathbf {A'B'} \left({n+1; n+1}\right) = \mathbf {AB} \ne \mathbf {BA} = \mathbf {B'A'} \left({n+1; n+1}\right)$

By Matrix Equivalence Implies Submatrix Equivalence and the Rule of Transposition, it follows that:

$\exists \mathbf {A'}, \mathbf {B'} \in \mathcal M_{n+1 \times n+1}: \mathbf {A'B'} \ne \mathbf {B'A'}$
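The padding construction $\mathbf D \mapsto \mathbf {D'}$ in the induction step can be sketched and checked for the $n = 2$ basis; `pad` and `mat_mul` below are hypothetical helper names, not from this article:

```python
# The D -> D' step: append a zero row and a zero column to a square matrix.
def pad(d):
    n = len(d)
    return [row + [0] for row in d] + [[0] * (n + 1)]

def mat_mul(a, b):
    return [[sum(a[i][r] * b[r][j] for r in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

A = [[0, 1], [0, 0]]
B = [[0, 0], [1, 0]]
Ap, Bp = pad(A), pad(B)

# Non-commutation survives the padding:
assert mat_mul(Ap, Bp) != mat_mul(Bp, Ap)
# The top-left n x n block of A'B' is exactly AB:
assert [row[:2] for row in mat_mul(Ap, Bp)[:2]] == mat_mul(A, B)
```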

Hence, by the Principle of Mathematical Induction, for every $n \ge 2$ there exist $n \times n$ matrices $\mathbf A$ and $\mathbf B$ with $\mathbf {AB} \ne \mathbf {BA}$.

The implication $p \ne m \lor m \ne n \implies \mathbf {AB} \ne \mathbf {BA}$ holds regardless of whether $n = 1$, so it is seen that:

$p \ne m \lor m \ne n \lor n \ne 1 \implies \exists \mathbf A \in \mathcal M_{m \times n}, \mathbf B \in \mathcal M_{n \times p}: \mathbf {AB} \ne \mathbf {BA} \dashv \vdash \\ \left(\forall \mathbf A \in \mathcal M_{m \times n}, \mathbf B \in \mathcal M_{n \times p}: \mathbf {AB} = \mathbf {BA}\right) \implies \lnot \left({p \ne m \lor m \ne n \lor n \ne 1}\right)$

by the Rule of Transposition. The existential quantifier is needed here because there do exist square matrices that commute under matrix multiplication, for example the unit matrix.

$\lnot \left({p \ne m \lor m \ne n \lor n \ne 1}\right) \dashv \vdash p = m \land m = n \land n = 1$

By a generalization of De Morgan's Law.

And finally, by transitivity of equality:

$\left(\forall \mathbf A \in \mathcal M_{m \times n}, \mathbf B \in \mathcal M_{n \times p}: \mathbf {AB} = \mathbf {BA}\right) \implies p = m = n = 1 \qquad (1)$

Now to prove the converse:

Let $\mathbf A \in \mathcal M_{m \times n}$ and $\mathbf B \in \mathcal M_{n \times p}$, and suppose that $p = m = n = 1$. Then, because $R$ is a commutative ring:

$\mathbf {AB} = A_{11} B_{11} = B_{11} A_{11} = \mathbf {BA}$

and so

$p = m = n = 1 \implies \left(\forall \mathbf A \in \mathcal M_{m \times n}, \mathbf B \in \mathcal M_{n \times p}: \mathbf {AB} = \mathbf {BA}\right) \qquad (2)$
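The converse direction amounts to a single ring multiplication. A minimal check over the integers as a sample commutative ring:

```python
# The 1 x 1 case: the matrix product reduces to the ring product.
A = [[3]]
B = [[7]]
AB = [[A[0][0] * B[0][0]]]
BA = [[B[0][0] * A[0][0]]]
assert AB == BA == [[21]]
```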

Combining (1) and (2) yields the result.

$\blacksquare$