Definition:O Notation

O-notation is a type of order notation, typically used in computer science for comparing the running times of algorithms, or in analysis for comparing the growth rates of functions.

Big O-Notation
Given two functions $$f \ $$ and $$g \ $$, the statement
 * $$f = O \left({g}\right) \ $$

is equivalent to the statement (where $$g \ $$ is nonzero for all sufficiently large $$x$$):
 * $$\limsup_{x \to \infty} \left|{\frac{f(x)}{g(x)}}\right| < \infty \ $$.

From the definition of the limit superior, it can be seen that this is also equivalent to:
 * $$\exists c \in \R, c > 0: \exists k \ge 0: \forall n > k: \left|{f(n)}\right| \le c \left|{g(n)}\right|$$.

This statement is voiced $$f$$ is big-O of $$g$$, or simply $$f$$ is big-O $$g$$.
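The constant-multiple characterisation above can be checked numerically. The following sketch uses the hypothetical example $$f(n) = 3 n^2 + 5 n$$ and $$g(n) = n^2$$, with candidate witnesses $$c = 4$$ and $$k = 5$$; these particular functions and constants are illustrative choices, not part of the definition, and a finite check is of course an illustration rather than a proof.

```python
# Numerical illustration (not a proof) of the big-O condition
# "there exist c > 0 and k >= 0 such that f(n) <= c * g(n) for all n > k",
# for the hypothetical example f(n) = 3n^2 + 5n and g(n) = n^2.

def f(n):
    return 3 * n**2 + 5 * n

def g(n):
    return n**2

# Candidate witnesses: c = 4, k = 5.
# For n > 5: 3n^2 + 5n <= 4n^2 is equivalent to 5n <= n^2, i.e. n >= 5.
c, k = 4, 5

def bound_holds(n):
    return f(n) <= c * g(n)

print(all(bound_holds(n) for n in range(k + 1, 10_000)))  # → True
print(bound_holds(4))  # → False: the bound may fail below k
```

Note that the choice of witnesses is not unique: any larger $$c$$ or $$k$$ also works, which is why the definition only asserts existence.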

Little O-Notation
Given two functions $$f \ $$ and $$g \ $$, the statement
 * $$f = o(g) \ $$

is equivalent to the statement
 * $$\lim_{x \to \infty} \frac{f(x)}{g(x)} = 0 \ $$

This statement is voiced $$f$$ is little-o of $$g$$, or simply $$f$$ is little-o $$g$$.
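The limit condition for little-o can likewise be illustrated numerically. The sketch below uses the hypothetical example $$f(n) = n$$ and $$g(n) = n^2$$, for which the ratio $$f(n)/g(n) = 1/n$$ tends to $$0$$, so $$f = o(g)$$; again, evaluating the ratio at a few points illustrates the limit but does not prove it.

```python
# Numerical illustration (not a proof) of little-o for the
# hypothetical example f(n) = n, g(n) = n^2: the ratio
# f(n)/g(n) = 1/n can be made smaller than any eps > 0
# by taking n large enough.

def f(n):
    return n

def g(n):
    return n**2

def ratio(n):
    return f(n) / g(n)

# The ratio shrinks by a factor of 10 each time n grows tenfold.
for n in (10, 100, 1000, 10_000):
    print(n, ratio(n))  # ratios: 0.1, 0.01, 0.001, 0.0001
```

By contrast, for the big-O example $$f(n) = 3 n^2 + 5 n$$ and $$g(n) = n^2$$ the ratio tends to $$3 \ne 0$$, so that $$f = O(g)$$ holds but $$f = o(g)$$ does not: little-o is strictly stronger than big-O.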