Definition:Summation
Definition
Let $\struct {S, +}$ be an algebraic structure where the operation $+$ is an operation derived from, or arising from, the addition operation on the natural numbers.
Let $\tuple {a_1, a_2, \ldots, a_n} \in S^n$ be an ordered $n$-tuple in $S$.
Definition by Index
The composite is called the summation of $\tuple {a_1, a_2, \ldots, a_n}$, and is written:
- $\ds \sum_{j \mathop = 1}^n a_j = \paren {a_1 + a_2 + \cdots + a_n}$
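The definition by index can be sketched computationally; a minimal illustration, using a hypothetical tuple of values:

```python
# Summation by index: a_1 + a_2 + ... + a_n for a hypothetical tuple a.
# Python lists are 0-indexed, so a[j - 1] corresponds to a_j.
a = [3, 1, 4, 1, 5]
n = len(a)

total = sum(a[j - 1] for j in range(1, n + 1))  # runs j from 1 to n inclusive
```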
Definition by Inequality
The summation of $\tuple {a_1, a_2, \ldots, a_n}$ can be written:
- $\ds \sum_{1 \mathop \le j \mathop \le n} a_j = \paren {a_1 + a_2 + \cdots + a_n}$
Definition by Propositional Function
Let $\map R j$ be a propositional function of $j$.
Then we can write the summation as:
- $\ds \sum_{\map R j} a_j = \text{ The sum of all $a_j$ such that $\map R j$ holds}$.
If more than one propositional function is written under the summation sign, they must all hold.
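A sketch of summation governed by a propositional function, where both the values $a_j$ and the predicate $R$ are hypothetical examples:

```python
# Summation over a propositional function R(j): only those a_j for
# which R(j) holds contribute to the sum.
a = {1: 3, 2: 1, 3: 4, 4: 1, 5: 5}  # hypothetical family a_j

def R(j):
    return j % 2 == 1  # example predicate: "j is odd"

total = sum(a_j for j, a_j in a.items() if R(j))  # a_1 + a_3 + a_5
```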
Iverson's Convention
Let $\ds \sum_{\map R j} a_j$ be the summation over all $a_j$ such that $j$ satisfies $R$.
This can also be expressed:
- $\ds \sum_{j \mathop \in \Z} a_j \sqbrk {\map R j}$
where $\sqbrk {\map R j}$ is Iverson's convention.
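The equivalence between the two notations can be illustrated directly: the Iverson bracket turns the predicate into a factor of $1$ or $0$, so multiplying each $a_j$ by $\sqbrk {\map R j}$ and summing over all $j$ gives the same result as restricting the sum to those $j$ satisfying $R$. A sketch with hypothetical values:

```python
# Iverson bracket [R(j)]: 1 when R(j) holds, 0 otherwise.
def iverson(p):
    return 1 if p else 0

a = {1: 3, 2: 1, 3: 4, 4: 1, 5: 5}  # hypothetical family a_j
R = lambda j: j % 2 == 1             # example predicate

restricted = sum(a_j for j, a_j in a.items() if R(j))
bracketed = sum(a_j * iverson(R(j)) for j, a_j in a.items())
# The two formulations agree.
```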
Summation over Finite Subset
Let $\struct {G, +}$ be a commutative monoid.
Let $F \subseteq G$ be a finite subset of $G$.
Let $\set {e_1, e_2, \ldots, e_n}$ be a finite enumeration of $F$.
Let $\tuple {e_1, e_2, \ldots, e_n}$ be the ordered tuple formed from the bijection $e: \closedint 1 n \to F$.
The summation over $F$, denoted $\ds \sum_{g \mathop \in F} g$, is defined as the summation over $\tuple {e_1, e_2, \ldots, e_n}$:
- $\ds \sum_{g \mathop \in F} g = \sum_{i \mathop = 1}^n e_i$
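Because $+$ is commutative and associative, the result does not depend on which enumeration of $F$ is chosen. A sketch checking this for a hypothetical finite subset, by summing every possible enumeration:

```python
import itertools

# Summation over a finite subset F: every enumeration (ordering) of F
# yields the same sum, since addition is commutative and associative.
F = {2, 7, 11}  # hypothetical finite subset

totals = {sum(perm) for perm in itertools.permutations(F)}
# All enumerations collapse to a single value.
```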
Summation over Finite Index
Let $\struct {G, +}$ be a commutative monoid.
Let $\family {g_i}_{i \mathop \in I}$ be an indexed subset of $G$ where the indexing set $I$ is finite.
Let $\set {e_1, e_2, \ldots, e_n}$ be a finite enumeration of $I$.
Let $\tuple {g_{e_1}, g_{e_2}, \ldots, g_{e_n} }$ be the ordered tuple formed from the composite mapping $g \circ e: \closedint 1 n \to G$.
The summation over $I$, denoted $\ds \sum_{i \mathop \in I} g_i$, is defined as the summation over $\tuple {g_{e_1}, g_{e_2}, \ldots, g_{e_n} }$:
- $\ds \sum_{i \mathop \in I} g_i = \sum_{k \mathop = 1}^n g_{e_k}$
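In contrast to summation over a finite subset, an indexed family may repeat values, since $g$ need not be injective. A sketch with a hypothetical family and indexing set:

```python
# Summation over a finite indexing set I: compose an enumeration e of I
# with the family g, then sum the resulting ordered tuple of values.
g = {'x': 5, 'y': -2, 'z': 9}  # hypothetical indexed family g_i with I = {'x', 'y', 'z'}
e = list(g)                    # one particular enumeration of I

total = sum(g[e_k] for e_k in e)  # g composed with e, summed term by term
```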
Infinite
Let an infinite number of values of $j$ satisfy the propositional function $\map R j$.
Then the precise meaning of $\ds \sum_{\map R j} a_j$ is:
- $\ds \sum_{\map R j} a_j = \paren {\lim_{n \mathop \to \infty} \sum_{\substack {\map R j \\ -n \mathop \le j \mathop < 0}} a_j} + \paren {\lim_{n \mathop \to \infty} \sum_{\substack {\map R j \\ 0 \mathop \le j \mathop \le n} } a_j}$
provided that both limits exist.
If either limit fails to exist, then the infinite summation does not exist.
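The two-limit definition can be sketched by summing the negative-index part and the non-negative-index part separately and letting $n$ grow. A hypothetical example where both limits exist: $a_j = 1 / 2^{\size j}$ for all integers $j$, whose two-sided sum is $1 + 2 \paren {\frac 1 2 + \frac 1 4 + \cdots} = 3$.

```python
# Two-sided infinite summation: take the limit of the partial sums
# over -n <= j < 0 and over 0 <= j <= n separately, then add them.
def a(j):
    return 1 / 2 ** abs(j)  # hypothetical summand a_j = 1 / 2^|j|

def partial(n):
    neg = sum(a(j) for j in range(-n, 0))      # -n <= j < 0
    nonneg = sum(a(j) for j in range(0, n + 1))  # 0 <= j <= n
    return neg + nonneg

approx = partial(60)  # converges towards 3 as n grows
```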
Finite
Let the set of values of $j$ which satisfy the propositional function $\map R j$ be finite.
Then the summation $\ds \sum_{\map R j} a_j$ is described as being a finite summation.
Index Variable
Consider the summation, in any of the three forms:
- $\ds \sum_{j \mathop = 1}^n a_j \qquad \sum_{1 \mathop \le j \mathop \le n} a_j \qquad \sum_{\map R j} a_j$
The variable $j$, an example of a bound variable, is known as the index variable of the summation.
Summand
The set of elements $\set {a_j \in S: 1 \le j \le n, \map R j}$ is called the summand.
Vacuous Summation
Take the summation:
- $\ds \sum_{\map \Phi j} a_j$
where $\map \Phi j$ is a propositional function of $j$.
Suppose that there are no values of $j$ for which $\map \Phi j$ is true.
Then $\ds \sum_{\map \Phi j} a_j$ is defined as being $0$.
This summation is called a vacuous summation.
This is because:
- $\forall a: a + 0 = a$
where $a$ is a number.
Hence for all $j$ for which $\map \Phi j$ is false, the sum is unaffected.
This is most frequently seen in the form:
- $\ds \sum_{j \mathop = m}^n a_j = 0$
where $m > n$.
In this case, $j$ cannot simultaneously be greater than or equal to $m$ and less than or equal to $n$.
Some sources consider such a treatment as abuse of notation.
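A sketch of the vacuous case with hypothetical bounds $m > n$, noting that Python likewise defines the sum of an empty collection to be $0$:

```python
# Vacuous summation: no index j satisfies m <= j <= n when m > n,
# so the sum is the empty sum, which is 0.
a = [3, 1, 4]  # hypothetical values a_1, a_2, a_3
m, n = 5, 2    # m > n, so the index range is empty

total = sum(a_j for j, a_j in enumerate(a, start=1) if m <= j <= n)
```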
Also known as
It is common for the term sum to be used for summation.
However, although summation is longer, and arguably a less aesthetic neologism than sum, the term summation is preferred on $\mathsf{Pr} \infty \mathsf{fWiki}$ for immediate clarity.
Notation
The sign $\sum$ is called the summation sign and is sometimes referred to as sigma, that being the name of the Greek letter it is formed from.
Also see
- Results about summations can be found here.
Historical Note
The notation $\sum$ for a summation was famously introduced by Joseph Fourier in $1820$:
- Le signe $\ds \sum_{i \mathop = 1}^{i \mathop = \infty}$ indique que l'on doit donner au nombre entier $i$ toutes les valeurs $1, 2, 3, \ldots$, et prendre la somme des termes.
- (The sign $\ds \sum_{i \mathop = 1}^{i \mathop = \infty}$ indicates that one must give to the whole number $i$ all the values $1, 2, 3, \ldots$, and take the sum of the terms.)
- -- 1820: Refroidissement séculaire du globe terrestre (Bulletin des Sciences par la Société Philomathique de Paris Vol. 3, 7: pp. 58 – 70)
However, some sources suggest that it was in fact first introduced by Euler.
Sources
- 1997: Donald E. Knuth: The Art of Computer Programming: Volume 1: Fundamental Algorithms (3rd ed.) ... (previous) ... (next): $\S 1.2.3$: Sums and Products: $(1)$