Definition:Markov Chain/Also known as


Markov Chain: Also known as

A Markov chain is also known by some sources as a Markov process.

However, the latter term properly refers to a stochastic process whose transitions may take place on a continuous timescale, and whose state space may also be continuous.
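As a minimal illustrative sketch (not part of the definition itself), the following Python snippet shows what the narrower sense of a Markov chain looks like: a discrete timescale and a discrete (here finite) state space, with each step depending only on the current state through a transition matrix. The two-state weather model and its probabilities are hypothetical.

```python
import random

# Hypothetical two-state model: finite state space and row-stochastic transition matrix.
states = ["sunny", "rainy"]
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Advance one discrete time step using only the current state (Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return state  # guard against floating-point rounding

# Simulate ten discrete steps from an initial state.
chain = ["sunny"]
for _ in range(10):
    chain.append(step(chain[-1]))
print(chain)
```

A Markov process in the wider sense would instead allow transitions at arbitrary real-valued times and possibly a continuous state space, which this discrete-step simulation does not capture.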


Some sources use the spelling Markoff chain.
