Definition:O Notation


Definition

O notation is a type of order notation, typically used in computer science for comparing the run-times of algorithms, or in analysis for comparing the growth rates of two functions.


Big-O Notation

Let $f$ and $g$ be real-valued or complex-valued functions defined on a neighborhood of $+ \infty$ in $\R$.


The statement:

$\map f x = \map {\mathcal O} {\map g x}$ as $x \to \infty$

is equivalent to:

$\exists c \in \R_{\ge 0}: \exists x_0 \in \R: \forall x \in \R: \paren {x \ge x_0 \implies \size {\map f x} \le c \cdot \size {\map g x} }$


That is:

$\size {\map f x} \le c \cdot \size {\map g x}$

for $x$ sufficiently large.


This statement is voiced $f$ is big-O of $g$ or simply $f$ is big-O $g$.
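
For example (an illustrative choice of functions, not part of the definition itself), take $\map f x = 2 x^2 + 3 x$ and $\map g x = x^2$. Then:

$\size {2 x^2 + 3 x} \le 3 \cdot \size {x^2}$ for all $x \ge 3$

so taking $c = 3$ and $x_0 = 3$ shows that $2 x^2 + 3 x = \map {\mathcal O} {x^2}$ as $x \to \infty$.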


Little-O Notation

Let $f$ and $g$ be as above, and let $\map g x \ne 0$ for $x$ sufficiently large.


The statement:

$\map f x = \map o {\map g x}$ as $x \to \infty$

is equivalent to:

$\displaystyle \lim_{x \to \infty} \frac {\map f x} {\map g x} = 0$


This statement is voiced $f$ is little-o of $g$.
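
For example, with $\map f x = x$ and $\map g x = x^2$ (again an illustrative choice):

$\displaystyle \lim_{x \to \infty} \frac x {x^2} = \lim_{x \to \infty} \frac 1 x = 0$

so $x = \map o {x^2}$ as $x \to \infty$. The converse relationship does not hold: $x^2 = \map {\mathcal O} {x^2}$ (take $c = 1$), but $x^2 \ne \map o {x^2}$, since the quotient is constantly $1$.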