Definition:O Notation

Definition
O-notation is a type of order notation, typically used in computer science for comparing the running times of algorithms, or in analysis for comparing the growth rates of functions.

Big O-Notation
Given two functions $f$ and $g$, the statement
 * $f = O \left({g}\right)$

is equivalent to the statement:
 * $\displaystyle \exists \alpha \in \R, \alpha \ge 0: \lim_{x \to \infty} \frac {f(x)} {g(x)} = \alpha$.

From the definition of the limit of a function, it can be seen that this implies the weaker condition, which is often taken as the definition of $f = O \left({g}\right)$ since it does not require the limit of $f / g$ to exist:
 * $\exists c > 0, \exists k \ge 0: \forall x > k: f(x) \le c g(x) \qquad (1)$.

For some fixed $k$ (appropriate to the function under consideration), the infimum of such $c$ is called the implied constant.
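As a simple illustration of $(1)$ and the implied constant, take $f(x) = 3 x^2 + 5 x$ and $g(x) = x^2$. Then:
 * $\displaystyle \lim_{x \to \infty} \frac {3 x^2 + 5 x} {x^2} = 3$

so $3 x^2 + 5 x = O \left({x^2}\right)$. Taking $k = 5$, condition $(1)$ holds for every $c \ge 4$, since $5 x \le x^2$ for $x \ge 5$; more generally, for a fixed $k > 0$ the infimum of admissible $c$ is $3 + \dfrac 5 k$, which approaches the value of the limit as $k$ grows.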

This statement is voiced $f$ is big-O of $g$, or simply $f$ is big-O $g$.

In number theory, sometimes the notation $f \ll g$ is used to mean $f = O \left({g}\right)$. This is clearer for estimates leading to typographically complex error terms.
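For instance, writing $E \left({x}\right)$ for a generic error term, an estimate of the form $E \left({x}\right) = O \left({x^{1/2} \log^3 x}\right)$ can be recorded more compactly as $E \left({x}\right) \ll x^{1/2} \log^3 x$.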

Little O-Notation
Given two functions $f$ and $g$, the statement
 * $f = o(g)$

is equivalent to the statement:
 * $\displaystyle \lim_{x \to \infty} \frac {f(x)} {g(x)} = 0$.

This statement is voiced $f$ is little-o of $g$, or simply $f$ is little-o $g$.
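For example, $x = o \left({x^2}\right)$, since:
 * $\displaystyle \lim_{x \to \infty} \frac x {x^2} = \lim_{x \to \infty} \frac 1 x = 0$

By contrast, $3 x^2 + 5 x = O \left({x^2}\right)$ but $3 x^2 + 5 x \ne o \left({x^2}\right)$, since the corresponding limit is $3$, not $0$.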