Definition:Big-O Notation

Definition
Big-O notation occurs in a variety of contexts, for example for sequences, real functions and complex functions. In the real-function setting, $f = \mathcal O \left({g}\right)$ (as $x \to \infty$) means that there exist a constant $c > 0$ and a real number $x_0$ such that:
 * $\forall x \ge x_0: \left\lvert{f \left({x}\right)}\right\rvert \le c \left\lvert{g \left({x}\right)}\right\rvert$
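As an informal illustration of the bounding-constant definition (not a proof), the following sketch spot-checks the assumed witnesses $c = 4$, $x_0 = 5$ for the example estimate $3 x^2 + 5 x = \mathcal O \left({x^2}\right)$; the functions and constants here are chosen purely for illustration:

```python
# Spot-check the witnesses c = 4, x0 = 5 for f(x) = 3x^2 + 5x = O(x^2).
# (Illustrative only: a finite check is evidence, not a proof.)

def f(x):
    return 3 * x**2 + 5 * x

def g(x):
    return x**2

c, x0 = 4, 5

# |f(x)| <= c |g(x)| for all sampled x >= x0,
# since 5x <= x^2 exactly when x >= 5.
assert all(abs(f(x)) <= c * abs(g(x)) for x in range(x0, 10_000))
```

The inequality $5 x \le x^2$ holds precisely for $x \ge 5$, which is why $x_0 = 5$ suffices for this choice of $c$.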

Also denoted as
In number theory, sometimes Vinogradov's notation $f \ll g$ is used to mean $f = \mathcal O \left({g}\right)$.

This notation is often clearer for estimates whose error terms are typographically complex.

Some sources use an ordinary $O$:
 * $f = O \left({g}\right)$

Also defined as
The statement $f = \mathcal O \left({g}\right)$ is sometimes defined as:
 * $\displaystyle \exists \alpha \in \R, \alpha \ge 0: \lim_{x \to \infty} \frac{f \left({x}\right)}{g \left({x}\right)} = \alpha$

But requiring that the limit exists is in many cases too restrictive: for example, $x \mapsto \left({2 + \sin x}\right) x$ is $\mathcal O \left({x}\right)$, yet the quotient $2 + \sin x$ oscillates and has no limit as $x \to \infty$. Moreover, this definition does not generalize to arbitrary normed vector spaces, where the quotient $f / g$ need not be defined.
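A minimal numerical sketch of why the limit-based definition is too restrictive, using the illustrative function $f \left({x}\right) = \left({2 + \sin x}\right) x$: the ratio $f \left({x}\right) / x$ stays bounded (so $f = \mathcal O \left({x}\right)$ under the bounding-constant definition), but it keeps oscillating between $1$ and $3$, so the limit required here does not exist:

```python
import math

def f(x):
    return (2 + math.sin(x)) * x

# f = O(x) in the bounded sense: |f(x)| <= 3|x| for x >= 1,
# since 2 + sin(x) never exceeds 3.
assert all(abs(f(x)) <= 3 * abs(x) for x in range(1, 10_000))

# ...yet f(x)/x = 2 + sin(x) oscillates between 1 and 3,
# so lim_{x -> oo} f(x)/x does not exist.
ratios = [f(x) / x for x in range(1, 10_000)]
assert max(ratios) > 2.9 and min(ratios) < 1.1
```

A finite sample cannot prove non-existence of a limit, but it makes the persistent oscillation visible; the exact argument is that $2 + \sin x$ takes values arbitrarily close to both $1$ and $3$ for arbitrarily large $x$.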

Also see

 * Transitivity of Big-O Estimates