# Definition:Relative Matrix of Linear Transformation

## Definition

Let $\struct {R, +, \circ}$ be a ring with unity.

Let $G$ be a free $R$-module of finite dimension $n > 0$.

Let $H$ be a free $R$-module of finite dimension $m > 0$.

Let $\sequence {a_n}$ be an ordered basis of $G$.

Let $\sequence {b_m}$ be an ordered basis of $H$.

Let $u : G \to H$ be a linear transformation.

The **matrix of $u$ relative to $\sequence {a_n}$ and $\sequence {b_m}$** is the $m \times n$ matrix $\sqbrk \alpha_{m n}$ where:

- $\ds \forall \tuple {i, j} \in \closedint 1 m \times \closedint 1 n: \map u {a_j} = \sum_{i \mathop = 1}^m \alpha_{i j} \circ b_i$

That is, the matrix whose columns are the coordinate vectors of the images of the elements of $\sequence {a_n}$ relative to the basis $\sequence {b_m}$.

The matrix of such a linear transformation $u$ relative to the ordered bases $\sequence {a_n}$ and $\sequence {b_m}$ is denoted:

- $\sqbrk {u; \sequence {b_m}, \sequence {a_n} }$
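For vector spaces over $\R$, this construction can be sketched numerically: column $j$ of the relative matrix is obtained by solving a linear system that expresses $\map u {a_j}$ in the codomain basis. The following is a minimal illustration; the map `u` and the bases are arbitrary choices made for this sketch, not part of the definition.

```python
import numpy as np

def relative_matrix(u, basis_a, basis_b):
    """Matrix of the linear map u relative to the ordered bases
    basis_a (of the domain) and basis_b (of the codomain).

    Column j holds the coordinates of u(a_j) in basis_b:
    it is the solution x of B x = u(a_j), where B has the b_i as columns.
    """
    B = np.column_stack(basis_b)
    columns = [np.linalg.solve(B, u(a)) for a in basis_a]
    return np.column_stack(columns)

# Illustrative linear map u : R^2 -> R^3 (an arbitrary choice)
u = lambda x: np.array([x[0], x[0] + x[1], 2 * x[1]])

basis_a = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # domain basis
basis_b = [np.array([1.0, 0.0, 0.0]),
           np.array([0.0, 1.0, 0.0]),
           np.array([0.0, 0.0, 1.0])]                    # codomain basis

M = relative_matrix(u, basis_a, basis_b)   # a 3 x 2 matrix
```

Note that $m = 3$ rows and $n = 2$ columns, matching the $m \times n$ shape in the definition, and that applying `M` to a coordinate vector reproduces the action of `u`.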

## Also denoted as

If $u$ is an automorphism of an $n$-dimensional module $G$, we can write $\sqbrk {u; \sequence {a_n}, \sequence {a_n} }$ as $\sqbrk {u; \sequence {a_n} }$.

Other notations include:

- $M_{u, B, A}$
- $\map { {M_B}^A} u$
- $\map {M_{B, A} } u$

## Warning

Consider the **matrix of $u$ relative to $\sequence {a_n}$ and $\sequence {b_m}$**:

- $\sqbrk {u; \sequence {b_m}, \sequence {a_n} }$

Note the order in which the bases are presented in this expression $\sqbrk {u; \sequence {b_m}, \sequence {a_n} }$.

The indication of the ordered basis for the domain, that is $\sequence {a_n}$, is given **after** that of the codomain, that is $\sequence {b_m}$.

Thus, the entries in the $j$th column of $\sqbrk {u; \sequence {b_m}, \sequence {a_n} }$ are the scalars occurring in the expression of $\map u {a_j}$ as a linear combination of the sequence $\tuple {b_1, \ldots, b_m}$.

A motivation for this choice is the intuitive cancellation law in Change of Coordinate Vectors Under Linear Mapping.
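The cancellation pattern alluded to can be made explicit. Assuming a further free $R$-module $K$ of finite dimension $p > 0$ with ordered basis $\sequence {c_p}$, and a linear transformation $v: H \to K$, the relative matrices compose as:

- $\sqbrk {v; \sequence {c_p}, \sequence {b_m} } \sqbrk {u; \sequence {b_m}, \sequence {a_n} } = \sqbrk {v \circ u; \sequence {c_p}, \sequence {a_n} }$

so the repeated basis $\sequence {b_m}$ "cancels" in the middle, which this ordering of the bases makes visually apparent.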

## Examples

### Differentiation

Let $\map {P_3} \R$ and $\map {P_4} \R$ denote the sets of real polynomials of degree less than $3$ and less than $4$ respectively.

Let $\sequence {I^j}_{0 \mathop \le j < n}$ denote the ordered bases of $\map {P_3} \R$ and $\map {P_4} \R$ where $n = 3$ and $n = 4$ respectively, and where $I^j$ denotes the $j$th power of the identity function $x \mapsto x^j$.

Let $D: p \mapsto D p$ denote the operation of differentiation on a real polynomial $p$.

Let $\mathbf D := \sqbrk {D; \sequence {I^j}_{0 \mathop \le j < 3}, \sequence {I^j}_{0 \mathop \le j < 4} }$ denote the matrix of the linear transformation $D$ relative to these bases.

Then:

- $\mathbf D = \begin {bmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \end {bmatrix}$
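This can be checked on coordinate vectors. The following minimal sketch applies $\mathbf D$ to the coordinates of $p = 5 - I + 4 I^2 + 2 I^3$ relative to $\tuple {I^0, I^1, I^2, I^3}$; the polynomial is an arbitrary choice for illustration.

```python
import numpy as np

# Relative matrix of differentiation from the example above
D = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0, 3.0]])

# Coordinates of p = 5 - I + 4 I^2 + 2 I^3 relative to (I^0, I^1, I^2, I^3)
p = np.array([5.0, -1.0, 4.0, 2.0])

# Coordinates of D p = -1 + 8 I + 6 I^2 relative to (I^0, I^1, I^2)
Dp = D @ p
```

Matrix multiplication by $\mathbf D$ thus agrees with termwise differentiation of the coefficients.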

## Also see

- Definition:Change of Basis Matrix
- Linear Transformation as Matrix Product
- Matrix Product as Linear Transformation

- Results about **relative matrices of linear transformations** can be found here.

## Sources

- 1965: Seth Warner: *Modern Algebra* ... (previous) ... (next): Chapter $\text {V}$: Vector Spaces: $\S 29$. Matrices