Product Rule for Divergence

Theorem
Let $\map {\mathbf V} {x_1, x_2, \ldots, x_n}$ be a vector space of $n$ dimensions.

Let $\mathbf A$ be a vector field over $\mathbf V$.

Let $U$ be a scalar field over $\mathbf V$.

Then:
 * $\map {\operatorname {div} } {U \mathbf A} = \map U {\operatorname {div} \mathbf A} + \mathbf A \cdot \grad U$

where
 * $\operatorname {div}$ denotes the divergence operator
 * $\grad$ denotes the gradient operator
 * $\cdot$ denotes dot product.
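The identity can be checked symbolically for concrete fields. Below is a minimal sketch in three dimensions using sympy; the particular fields $U$ and $\mathbf A$ are arbitrary choices for illustration, not part of the theorem:

```python
# Symbolic check of div(U A) = U div A + A . grad U in 3 dimensions.
# The example fields U and A are arbitrary illustrative choices.
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = (x, y, z)

U = x * sp.sin(y) + z**2            # an arbitrary scalar field
A = [x * y, y * z, sp.exp(x) * z]   # an arbitrary vector field

def div(F):
    """Divergence: sum of the partial derivatives of the components."""
    return sum(sp.diff(Fk, xk) for Fk, xk in zip(F, coords))

def grad(f):
    """Gradient: list of the partial derivatives of f."""
    return [sp.diff(f, xk) for xk in coords]

lhs = div([U * Ak for Ak in A])                             # div(U A)
rhs = U * div(A) + sum(a * g for a, g in zip(A, grad(U)))   # U div A + A . grad U

assert sp.simplify(lhs - rhs) == 0
```

Any other smooth choices of $U$ and $\mathbf A$ pass the same check, as the proof below establishes in general.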

Proof
From Divergence Operator on Vector Space is Dot Product of Del Operator and the definition of the gradient operator:

 * $\operatorname {div} \mathbf A = \nabla \cdot \mathbf A$
 * $\grad U = \nabla U$

where $\nabla$ denotes the del operator.

Hence we are to demonstrate that:
 * $\nabla \cdot \paren {U \, \mathbf A} = \map U {\nabla \cdot \mathbf A} + \paren {\nabla U} \cdot \mathbf A$

Let $\mathbf A$ be expressed as a vector-valued function on $\mathbf V$:


 * $\mathbf A := \tuple {\map {A_1} {\mathbf r}, \map {A_2} {\mathbf r}, \ldots, \map {A_n} {\mathbf r} }$

where $\mathbf r = \tuple {x_1, x_2, \ldots, x_n}$ is an arbitrary element of $\mathbf V$.

Let $\tuple {\mathbf e_1, \mathbf e_2, \ldots, \mathbf e_n}$ be the standard ordered basis of $\mathbf V$.

Then, by the definition of the divergence operator:

 * $\ds \nabla \cdot \paren {U \, \mathbf A} = \sum_{k \mathop = 1}^n \frac \partial {\partial x_k} \paren {U A_k}$

Applying the Product Rule for Derivatives to each summand:

 * $\ds \nabla \cdot \paren {U \, \mathbf A} = \sum_{k \mathop = 1}^n \paren {U \frac {\partial A_k} {\partial x_k} + A_k \frac {\partial U} {\partial x_k} } = U \sum_{k \mathop = 1}^n \frac {\partial A_k} {\partial x_k} + \sum_{k \mathop = 1}^n \frac {\partial U} {\partial x_k} A_k$

The first term is $\map U {\nabla \cdot \mathbf A}$ and the second is $\paren {\nabla U} \cdot \mathbf A$, whence:

 * $\nabla \cdot \paren {U \, \mathbf A} = \map U {\nabla \cdot \mathbf A} + \paren {\nabla U} \cdot \mathbf A$

$\blacksquare$

Also presented as
This result can also be presented as:


 * $\nabla \cdot \paren {U \, \mathbf A} = \map U {\nabla \cdot \mathbf A} + \paren {\nabla U} \cdot \mathbf A$

where it is presupposed that $\operatorname {div}$ and $\grad$ are implemented as operations using the del operator.
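Under this del-operator presentation, the componentwise argument of the proof can also be replayed symbolically for fully generic fields. The sketch below, assuming sympy, uses undetermined functions $U, A_1, \ldots, A_n$ so that the cancellation relies only on the one-variable product rule:

```python
# Componentwise check in n dimensions with generic (undetermined) fields:
# each d/dx_k (U * A_k) expands by the product rule, and the terms regroup
# into U div A + A . grad U.
import sympy as sp

n = 4
xs = sp.symbols(f'x1:{n + 1}')                          # (x1, ..., x4)
U = sp.Function('U')(*xs)                               # generic scalar field
A = [sp.Function(f'A{k + 1}')(*xs) for k in range(n)]   # generic vector field

lhs = sum(sp.diff(U * A[k], xs[k]) for k in range(n))   # nabla . (U A)
rhs = (U * sum(sp.diff(A[k], xs[k]) for k in range(n))  # U (nabla . A)
       + sum(A[k] * sp.diff(U, xs[k]) for k in range(n)))  # (nabla U) . A

assert sp.expand(lhs - rhs) == 0
```

Because the fields here are arbitrary symbolic functions rather than specific expressions, this mirrors the general proof rather than a single numerical instance.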