Definition:Big-O Notation

Definition
Given two functions $f$ and $g$, the statement
 * $f = \mathcal O \left({g}\right)$

is equivalent to the statement:
 * $\displaystyle \exists \alpha \in \R, \alpha \ge 0: \lim_{x \to \infty} \frac{f(x)}{g(x)} = \alpha$.
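
For instance, taking $f(x) = 3x^2 + 5x$ and $g(x) = x^2$ (an illustrative choice, not part of the definition):
 * $\displaystyle \lim_{x \to \infty} \frac{3x^2 + 5x}{x^2} = \lim_{x \to \infty} \left({3 + \frac 5 x}\right) = 3$

so $3x^2 + 5x = \mathcal O \left({x^2}\right)$.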

From the definition of the limit of a function, it can be seen that this implies the following condition, which is the form most often taken as the definition (it does not require the limit to exist):
 * $\exists c \in \R, c > 0: \exists k \ge 0: \forall x > k: f(x) \le c \, g(x) \qquad (1)$

For some fixed $k$ (appropriate to the function under consideration), the infimum of all such $c$ is called the implied constant.
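Condition $(1)$ can be checked numerically for concrete functions. The sketch below uses the illustrative pair $f(n) = 3n^2 + 5n$ and $g(n) = n^2$ (an assumption chosen for this example, not taken from the definition): here $c = 4$ and $k = 5$ are admissible witnesses, since $3n^2 + 5n \le 4n^2$ exactly when $5n \le n^2$, that is, when $n \ge 5$.

```python
def f(n):
    return 3 * n**2 + 5 * n

def g(n):
    return n**2

# Witnesses for condition (1): for all n > k, f(n) <= c * g(n).
c, k = 4, 5

# Sanity-check the inequality over a finite range of n.
assert all(f(n) <= c * g(n) for n in range(k + 1, 10_000))

# For this fixed k, the implied constant is the infimum of admissible c:
# the ratio f(n)/g(n) = 3 + 5/n is largest at n = 6, giving 23/6,
# and decreases toward 3 as n grows.
print(max(f(n) / g(n) for n in range(k + 1, 10_000)))  # ~ 23/6
```

A finite check like this cannot prove the bound for all $n$, of course; it only corroborates the algebraic argument given in the lead-in.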

This statement is voiced "$f$ is big-O of $g$", or simply "$f$ is big-O $g$".

In number theory, the Vinogradov notation $f \ll g$ is sometimes used to mean $f = \mathcal O \left({g}\right)$. It is clearer in estimates whose error terms are typographically complex.

Notation
Sometimes an ordinary $O$ is used: $f = O \left({g}\right)$.