Definition:Big-O Notation

Definition
Big-O notation occurs in a variety of contexts. In its most general form, let $f$ and $g$ be functions with values in a normed vector space. Then $f = \mathcal O \left({g}\right)$ means that there exists a constant $c \ge 0$ such that:
 * $\Vert f(x) \Vert \leq c \cdot \Vert g(x) \Vert$
for all $x$ in some (unspecified) neighborhood of the limit point under consideration.
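
As a simple illustration (this particular example is not part of the definition itself): $x^2 + 3 x = \mathcal O \left({x^2}\right)$ as $x \to \infty$, since taking $c = 4$, for all $x \ge 1$:
 * $\left\vert{x^2 + 3 x}\right\vert \le x^2 + 3 x^2 = 4 \left\vert{x^2}\right\vert$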

Also defined as
Some authors require that the inequality $\Vert f(x) \Vert \leq c \cdot \Vert g(x) \Vert$ be valid on a specified domain (usually the whole domain of definition) rather than merely on some unspecified neighborhood. That is, under this convention the Big-O estimate is by definition uniform on the specified domain.
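
Under this convention one has, for example, $\sin x = \mathcal O \left({x}\right)$ uniformly on all of $\R$, with $c = 1$:
 * $\left\vert{\sin x}\right\vert \le \left\vert{x}\right\vert$ for all $x \in \R$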

The statement $f = \mathcal O \left({g}\right)$ is sometimes defined as:
 * $\displaystyle \exists \alpha \in \R_{\ge 0}: \lim_{x \to \infty} \frac{f \left({x}\right)}{g \left({x}\right)} = \alpha$
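
For example, under this definition $x^2 + 3 x = \mathcal O \left({x^2}\right)$ as $x \to \infty$, since:
 * $\displaystyle \lim_{x \to \infty} \frac{x^2 + 3 x}{x^2} = 1$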

But requiring that the limit exist is in many cases too restrictive, as the example below shows. Moreover, this definition does not generalize to arbitrary normed vector spaces.
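
Let $f \left({x}\right) = \left({2 + \sin x}\right) x$. Then $\left\vert{f \left({x}\right)}\right\vert \le 3 \left\vert{x}\right\vert$ for all $x \in \R$, so $f = \mathcal O \left({x}\right)$ as $x \to \infty$, but:
 * $\displaystyle \frac{f \left({x}\right)}{x} = 2 + \sin x$
has no limit as $x \to \infty$.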

Also see

 * Definition:Little-O Notation
 * Transitivity of Big-O Estimates