Definition:Algorithm/Analysis/Complexity

Definition
The complexity of an algorithm is defined as the number of discrete steps needed to terminate as a function of the number of inputs.

An algorithm may be analysed in the context of its complexity.

This is typically done by expressing its run-time using big-$\OO$, big-$\Omega$ or $\Theta$ notation.

Complexity can be investigated through 3 main types of analysis:


 * 1) Average-Case Analysis, using a distribution of problem instances to find the average run-time of the algorithm for analysis
 * 2) Best-Case Analysis, using the best possible problem instance of the algorithm for analysis
 * 3) Worst-Case Analysis, using the worst possible problem instance of the algorithm for analysis
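The three types of analysis can be illustrated with a simple step count. The sketch below (an illustrative example, not part of the definition; the function name and data are chosen for illustration) counts the comparisons made by a linear search, then evaluates the best, worst and average case over a concrete input:

```python
def linear_search_steps(items, target):
    """Count comparisons made by a linear search: a simple measure of discrete steps."""
    steps = 0
    for value in items:
        steps += 1                 # one comparison per element examined
        if value == target:
            break                  # found: stop counting
    return steps

data = [3, 1, 4, 1, 5, 9, 2, 6]

best = linear_search_steps(data, 3)      # target is first: 1 step
worst = linear_search_steps(data, 7)     # target absent: n = 8 steps
# average over a (uniform) distribution of present targets
average = sum(linear_search_steps(data, t) for t in data) / len(data)
```

Here the best case is $1$ step, the worst case is $n$ steps, and the average depends on the assumed distribution of inputs, matching the three analyses listed above.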