Definition:Matrix Product (Conventional)


Definition

Let $\struct {R, +, \circ}$ be a ring.

Let $\mathbf A = \sqbrk a_{m n}$ be an $m \times n$ matrix over $R$.

Let $\mathbf B = \sqbrk b_{n p}$ be an $n \times p$ matrix over $R$.

Then the matrix product of $\mathbf A$ and $\mathbf B$ is written $\mathbf A \mathbf B$ and is defined as follows.

Let $\mathbf A \mathbf B = \mathbf C = \sqbrk c_{m p}$.


Then:

$\ds \forall i \in \closedint 1 m, j \in \closedint 1 p: c_{i j} = \sum_{k \mathop = 1}^n a_{i k} \circ b_{k j}$


Thus $\sqbrk c_{m p}$ is the $m \times p$ matrix where each entry $c_{i j}$ is built by forming the (ring) product of each entry in the $i$'th row of $\mathbf A$ with the corresponding entry in the $j$'th column of $\mathbf B$ and adding up all those products.


This operation is called matrix multiplication, and $\mathbf C$ is the matrix product of $\mathbf A$ with $\mathbf B$.


It follows that matrix multiplication is defined whenever the first matrix has the same number of columns as the second matrix has rows.
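
As an illustration of the definition (not part of the formal statement), the following Python sketch computes $\mathbf A \mathbf B$ directly from the formula above, assuming the entries are ordinary numbers so that the built-in + and * play the roles of the ring operations $+$ and $\circ$:

```python
def matrix_product(A, B):
    """Conventional matrix product C = A B.

    A is an m x n matrix and B is an n x p matrix, each given as a list
    of rows; entries are assumed numeric, so + and * stand in for the
    ring operations + and o.
    """
    m, n = len(A), len(A[0])
    if len(B) != n:
        raise ValueError("A and B are not conformable")
    p = len(B[0])
    # c[i][j] is the sum over k of a[i][k] * b[k][j]
    return [
        [sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
        for i in range(m)
    ]

# A 2 x 3 matrix times a 3 x 2 matrix gives a 2 x 2 matrix:
A = [[1, 2, 3],
     [4, 5, 6]]
B = [[7, 8],
     [9, 10],
     [11, 12]]
print(matrix_product(A, B))  # [[58, 64], [139, 154]]
```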


Pre-Multiplication

Let $\mathbf A \mathbf B$ be the product of $\mathbf A$ with $\mathbf B$.

Then $\mathbf B$ is pre-multiplied by $\mathbf A$.


Post-Multiplication

Let $\mathbf A \mathbf B$ be the product of $\mathbf A$ with $\mathbf B$.

Then $\mathbf A$ is post-multiplied by $\mathbf B$.


Using Einstein Summation Convention

The matrix product of $\mathbf A$ and $\mathbf B$ can be expressed using the Einstein summation convention as:

$c_{i j} := a_{i k} \circ b_{k j}$


The index which appears twice in the expression on the right hand side is $k$, which is the index summed over.
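
Written out in full, the convention abbreviates the explicit sum:

$c_{i j} = a_{i 1} \circ b_{1 j} + a_{i 2} \circ b_{2 j} + \cdots + a_{i n} \circ b_{n j}$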


Conformable Matrices

It needs to be emphasised that matrix product can be defined on $\mathbf A$ and $\mathbf B$ if and only if $\mathbf A$ and $\mathbf B$ are conformable.


That is, if and only if the number of columns of $\mathbf A$ is equal to the number of rows of $\mathbf B$.
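
For example, if $\mathbf A$ is a $2 \times 3$ matrix and $\mathbf B$ is a $3 \times 4$ matrix, then $\mathbf A \mathbf B$ is defined and is a $2 \times 4$ matrix, whereas $\mathbf B \mathbf A$ is not defined, since $\mathbf B$ has $4$ columns while $\mathbf A$ has only $2$ rows.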


Notation

To denote the matrix product of $\mathbf A$ with $\mathbf B$, the juxtaposition notation is always used:

$\mathbf C = \mathbf A \mathbf B$

We do not use $\mathbf A \times \mathbf B$ or $\mathbf A \cdot \mathbf B$ in this context, because they have specialised meanings.


Also known as

It is believed that some sources refer to this as the Cauchy product after Augustin Louis Cauchy.

Further rumours suggest that Jacques Philippe Marie Binet may also have lent his name to this concept.

However, corroboration has proven difficult to obtain.


Examples

$2 \times 2$ Real Matrices

Let $\mathbf A = \begin {pmatrix} p & q \\ r & s \end {pmatrix}$ and $\mathbf B = \begin {pmatrix} w & x \\ y & z \end {pmatrix}$ be order $2$ square matrices over the real numbers.


Then the matrix product of $\mathbf A$ with $\mathbf B$ is given by:

$\mathbf A \mathbf B = \begin {pmatrix} p w + q y & p x + q z \\ r w + s y & r x + s z \end {pmatrix}$
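
For instance, taking the arbitrary numerical values $p = 1$, $q = 2$, $r = 3$, $s = 4$ and $w = 5$, $x = 6$, $y = 7$, $z = 8$:

$\mathbf A \mathbf B = \begin {pmatrix} 1 & 2 \\ 3 & 4 \end {pmatrix} \begin {pmatrix} 5 & 6 \\ 7 & 8 \end {pmatrix} = \begin {pmatrix} 1 \times 5 + 2 \times 7 & 1 \times 6 + 2 \times 8 \\ 3 \times 5 + 4 \times 7 & 3 \times 6 + 4 \times 8 \end {pmatrix} = \begin {pmatrix} 19 & 22 \\ 43 & 50 \end {pmatrix}$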


$3 \times 3$ Matrix-Vector Multiplication Formula

The $3 \times 3$ matrix-vector multiplication formula is an instance of the matrix product operation:

$\mathbf A \mathbf v = \begin {bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end {bmatrix} \begin {bmatrix} x \\ y \\ z \end {bmatrix} = \begin {bmatrix} a_{11} x + a_{12} y + a_{13} z \\ a_{21} x + a_{22} y + a_{23} z \\ a_{31} x + a_{32} y + a_{33} z \end {bmatrix}$
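
This is the special case of the general definition in which $m = n = 3$ and $p = 1$: the column vector $\mathbf v$ is treated as a $3 \times 1$ matrix, and the $i$'th entry of $\mathbf A \mathbf v$ is $\ds \sum_{k \mathop = 1}^3 a_{i k} v_k$, where $v_1 = x$, $v_2 = y$ and $v_3 = z$.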


Cayley's Motivation

Let there be $3$ Cartesian coordinate systems:

$\tuple {x, y}$, $\tuple {x', y'}$, $\tuple {x'', y''}$


Let them be connected by:

$\begin {cases} x' = x + y \\ y' = x - y \end {cases}$

and:

$\begin {cases} x'' = -x' - y' \\ y'' = -x' + y' \end {cases}$


The relationship between $\tuple {x'', y''}$ and $\tuple {x, y}$ is given by:

$\begin {cases} x'' = -x' - y' = -\paren {x + y} - \paren {x - y} = -2 x \\ y'' = -x' + y' = -\paren {x + y} + \paren {x - y} = -2 y \end {cases}$


Arthur Cayley devised the compact notation that expressed the changes of coordinate systems by arranging the coefficients in an array:

$\begin {pmatrix} 1 & 1 \\ 1 & -1 \end {pmatrix} \begin {pmatrix} -1 & -1 \\ -1 & 1 \end {pmatrix} = \begin {pmatrix} -2 & 0 \\ 0 & -2 \end {pmatrix}$

As such, he can be considered as having invented matrix multiplication.


Also see

  • Results about (conventional) matrix multiplication can be found here.


Historical Note

This mathematical process defined by the (conventional) matrix product was first introduced by Jacques Philippe Marie Binet.


Linguistic Note

Some older sources use the term matric multiplication instead of matrix multiplication.

Strictly speaking it is more correct, as matric is the adjective formed from the noun matrix, but it is a little old-fashioned and is rarely found nowadays.

