Markov Chain: Also known as
A Markov chain is also known by some sources as a Markov process.
However, the latter properly denotes a stochastic process whose transitions may take place in continuous time, and whose state space may also be continuous.
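As a rough illustration of the distinction (the notation here is ours, not drawn from the cited source): a Markov chain satisfies the Markov property over a discrete index set:

$$\Pr \left({X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0}\right) = \Pr \left({X_{n+1} = j \mid X_n = i}\right), \qquad n \in \mathbb{N}$$

whereas a Markov process may be indexed by continuous time:

$$\Pr \left({X_{t+s} \in A \mid X_u, \, 0 \le u \le t}\right) = \Pr \left({X_{t+s} \in A \mid X_t}\right), \qquad s, t \ge 0$$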
Some sources use the spelling Markoff chain.
Sources
- 1989: Ephraim J. Borowski and Jonathan M. Borwein: Dictionary of Mathematics: chain: 3. Markov Chain.