Definition:Algorithm

Context

 * Computer Science
 * Mathematical Logic

Definition
An algorithm is a finite set of instructions or rules that defines a sequence of operations for solving a particular problem, and which does so for every instance of the problem in some specified problem set.

The word ultimately derives from the name of Muhammad ibn Musa al-Khwarizmi, almost certainly by way of linguistic evolution of the archaic term algorism, with which "algorithm" should not be confused.

An algorithm must have the following properties:

Finiteness
An algorithm must terminate after a finite number of steps.

Definiteness
Each step of an algorithm must be precisely defined.

Inputs
An algorithm has zero or more inputs, which are values supplied either:
 * before the algorithm starts;
 * as the algorithm runs.

These inputs are taken from specified sets of objects.

Outputs
An algorithm has one or more outputs. These are values which are specifically determined by the inputs.

Effectiveness
An algorithm must be effective.

That is, its operations must be basic enough that they can be carried out exactly, and in a finite length of time, by somebody using pencil and paper, for example.
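These properties can be illustrated with Euclid's algorithm for the greatest common divisor. The following minimal Python sketch (not part of the original text) is finite, definite, takes two inputs, produces one output, and is effective:

```python
def gcd(a, b):
    """Euclid's algorithm: greatest common divisor of two positive integers."""
    # Inputs: two positive integers, supplied before the algorithm starts.
    while b != 0:
        # Definiteness: each step is precisely defined -- replace (a, b) with (b, a mod b).
        a, b = b, a % b
    # Finiteness: b strictly decreases toward 0, so the loop terminates.
    # Output: a single value determined entirely by the inputs.
    return a

print(gcd(48, 18))  # → 6
```

Each operation here (comparison, remainder, assignment) could in principle be carried out by hand, which is exactly the effectiveness requirement.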

Analysis
There are two primary methods for analyzing algorithms formally:
 * Correctness
 * Complexity

Correctness
The primary method of validating an algorithm is a Proof of Correctness.

This is a formal verification that the algorithm accomplishes its purpose and terminates after a finite number of steps.
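The shape of such a proof can be sketched informally in code: state a loop invariant, show it holds before the loop, is preserved by each iteration, and yields the desired result at termination. A hypothetical Python illustration (the `assert` merely checks the invariant at runtime; it is not itself a formal proof):

```python
import math

def factorial(n):
    """Iterative factorial, annotated with the invariant a correctness proof would establish."""
    result = 1
    i = 0
    # Invariant: result == i!  (holds initially, since 1 == 0!)
    while i < n:
        i += 1
        result *= i
        # The invariant is preserved by each iteration.
        assert result == math.factorial(i)
    # Termination: i increases by 1 per iteration, so the loop runs exactly n times.
    # At exit i == n, and the invariant gives result == n!.
    return result
```

A formal proof would establish the same three facts (initialization, preservation, termination) by mathematical argument rather than by runtime checks.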

Complexity
Complexity is typically measured through bounds on run time, expressed using Big-O, Big-Omega, or Big-Theta notation.

Complexity can be assessed through three main types of analysis:

 * Average-Case Analysis (using a distribution of problem instances to find the average runtime of the algorithm)
 * Best-Case Analysis (using the best possible problem instance of the algorithm for analysis)
 * Worst-Case Analysis (using the worst possible problem instance of the algorithm for analysis)
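The three kinds of analysis can be made concrete with linear search, counting comparisons rather than wall-clock time. A hypothetical Python sketch:

```python
def comparisons_for_linear_search(items, target):
    """Number of comparisons linear search makes before finding target."""
    for count, item in enumerate(items, start=1):
        if item == target:
            return count
    return len(items)  # target absent: every item was compared

data = list(range(1, 101))  # search space of n = 100 items

best = comparisons_for_linear_search(data, 1)     # best case: target is first, 1 comparison
worst = comparisons_for_linear_search(data, 100)  # worst case: target is last, n comparisons
# Average case: assuming each position is equally likely, (n + 1) / 2 comparisons.
average = sum(comparisons_for_linear_search(data, t) for t in data) / len(data)

print(best, worst, average)  # → 1 100 50.5
```

The best case is constant, the worst case grows linearly with n, and the average case (under a uniform distribution over instances) is (n + 1) / 2, also linear.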

Why not just execute algorithms on machines to find their runtimes?
A common misconception is that measuring the run time of an algorithm executing on a machine is just as effective as analyzing its complexity when demonstrating one algorithm to be more efficient than another. In fact, implementing the algorithm on a machine is the final step of algorithm design, not the first. Behaviours that are easy to see when examining complexity are very hard to spot in an implementation, and factors such as CPU speed and other physical characteristics of the machine distort the measured run times. Comparing implementation speeds is therefore a poor substitute for proper complexity analysis.

An example of this is Quicksort. Theoretically it runs in quadratic time in the worst case, but on typical inputs an implementation runs in $$O(n \log n)$$ time, matching its best case.
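This gap between worst-case and typical behaviour can be seen without any wall-clock timing at all, by counting comparisons. A hypothetical Python sketch of Quicksort with a first-element pivot, for which an already-sorted input triggers the quadratic worst case:

```python
import random

def quicksort(items):
    """Quicksort with first-element pivot; returns (sorted list, comparison count)."""
    if len(items) <= 1:
        return list(items), 0
    pivot, rest = items[0], items[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    sorted_left, c_left = quicksort(left)
    sorted_right, c_right = quicksort(right)
    # len(rest) comparisons against the pivot were made at this level.
    return sorted_left + [pivot] + sorted_right, len(rest) + c_left + c_right

n = 200
sorted_input = list(range(n))
shuffled = sorted_input[:]
random.shuffle(shuffled)

_, worst = quicksort(sorted_input)  # sorted input: every split is maximally unbalanced
_, typical = quicksort(shuffled)    # random input: splits are usually balanced

print(worst)    # n*(n-1)/2 = 19900 comparisons: the quadratic worst case
print(typical)  # far fewer, on the order of n log n
```

The comparison counts expose the quadratic worst case directly, independent of CPU speed or any other property of the machine the code happens to run on.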