Definition:Computational Method/Computational Sequence
Definition
Consider the computational method $\left({Q, I, \Omega, f}\right)$ in which:
- $Q$ represents the set of states of the computation
- $I \subseteq Q$ represents the set of inputs to the computation
- $\Omega \subseteq Q$ represents the set of outputs from the computation
- $f: Q \to Q$ represents the computational rule, which leaves every output fixed: $\forall q \in \Omega: f \left({q}\right) = q$
Each $x \in I$ defines a computational sequence $x_0, x_1, x_2, \ldots$ as follows:
- $x_0 = x$
- $\forall k \ge 0: x_{k+1} = f \left({x_k}\right)$
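The recursive definition above can be sketched directly: a minimal Python illustration in which the computational rule is the squaring map on integers, a choice made purely for the sake of example and not part of the definition.

```python
def computational_sequence(x, f, steps):
    """Generate x_0, x_1, ..., x_steps where x_0 = x and x_{k+1} = f(x_k)."""
    seq = [x]
    for _ in range(steps):
        seq.append(f(seq[-1]))
    return seq

# Illustration only: take f(q) = q * q as the computational rule
print(computational_sequence(2, lambda q: q * q, 4))  # [2, 4, 16, 256, 65536]
```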
Termination
A computational sequence $x_0, x_1, x_2, \ldots$ is said to terminate in $k$ steps if $k$ is the smallest non-negative integer for which $x_k \in \Omega$.
In this case, it produces the output $x_k$ from $x$.
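As an illustration of termination, Euclid's algorithm (the running example of $\S 1.1$ of the source cited below) can be cast as a computational method. In this sketch the states are pairs $\left({m, n}\right)$, $\Omega$ is the set of states with $n = 0$, and the rule $f$ performs one division step; the particular encoding of states is an assumption for illustration.

```python
def f(state):
    """One step of Euclid's algorithm: (m, n) -> (n, m mod n)."""
    m, n = state
    return (n, m % n) if n != 0 else state  # f fixes states in Omega

def run(x):
    """Follow the computational sequence from x_0 = x until x_k lies in Omega,
    returning the output x_k and the number of steps k."""
    k, xk = 0, x
    while xk[1] != 0:   # x_k is in Omega exactly when n = 0
        xk = f(xk)
        k += 1
    return xk, k

print(run((119, 544)))  # ((17, 0), 5): terminates in 5 steps, gcd is 17
```

Note that if $x \in \Omega$ already, the sequence terminates in $0$ steps and produces $x$ itself as output.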
Sources
- 1997: Donald E. Knuth: The Art of Computer Programming: Volume 1: Fundamental Algorithms (3rd ed.): $\S 1.1$: Algorithms