Definition:Divergence Operator

Definition
Let $\mathbf V \left({x_1, x_2, \ldots, x_n}\right)$ be a vector space of $n$ dimensions.

Let $\left({\mathbf e_1, \mathbf e_2, \ldots, \mathbf e_n}\right)$ be the standard ordered basis of $\mathbf V$.

Let $\mathbf f \left({f_1, f_2, \ldots, f_n}\right): \mathbf V \to \mathbf V$ be a vector-valued function on $\mathbf V$.

The divergence of $\mathbf f$ is defined as:

{{begin-eqn}}
{{eqn | l = \operatorname {div} \mathbf f
      | r = \sum_{k \mathop = 1}^n \frac {\partial f_k} {\partial x_k}
      | c = 
}}
{{end-eqn}}

Thus the divergence of $\mathbf f$ is a scalar field on $\mathbf V$.
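The defining sum $\sum_k \partial f_k / \partial x_k$ can be approximated numerically by central differences. The following is an illustrative sketch only, not part of the source; the sample field $\mathbf f \left({x, y, z}\right) = \left({x^2, x y, z}\right)$ and the step size `h` are assumptions chosen for the example. For this field the exact divergence is $2 x + x + 1 = 3 x + 1$.

```python
def divergence(f, x, h=1e-5):
    """Estimate sum_k d f_k / d x_k at the point x by central differences.

    f maps a list of n coordinates to a tuple of n components.
    """
    total = 0.0
    for k in range(len(x)):
        xp = list(x); xp[k] += h   # shift k-th coordinate forward
        xm = list(x); xm[k] -= h   # shift k-th coordinate backward
        total += (f(xp)[k] - f(xm)[k]) / (2 * h)
    return total

# Hypothetical sample field: f(x, y, z) = (x^2, x*y, z); exact divergence 3x + 1.
f = lambda p: (p[0] ** 2, p[0] * p[1], p[2])
print(divergence(f, [1.0, 2.0, 3.0]))  # close to 3*1 + 1 = 4
```

At $\left({1, 2, 3}\right)$ the estimate agrees with the exact value $4$ up to floating-point error, since central differences are exact for polynomials of degree at most $2$.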

Real Cartesian Space
The usual context in which the divergence operator is encountered is the real Cartesian space $\R^3$, where:

{{begin-eqn}}
{{eqn | l = \operatorname {div} \mathbf f
      | r = \frac {\partial f_1} {\partial x} + \frac {\partial f_2} {\partial y} + \frac {\partial f_3} {\partial z}
      | c = 
}}
{{end-eqn}}
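As an illustrative worked example (not part of the source), take the vector-valued function $\mathbf f \left({x, y, z}\right) = \left({x^2, x y, z}\right)$ on $\R^3$; applying the defining sum term by term:

```latex
\operatorname {div} \mathbf f
  = \frac {\partial} {\partial x} \left({x^2}\right)
  + \frac {\partial} {\partial y} \left({x y}\right)
  + \frac {\partial} {\partial z} \left({z}\right)
  = 2 x + x + 1
  = 3 x + 1
```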

Also known as
The divergence of $\mathbf f$ is usually vocalised div $\mathbf f$.