
Markov Process: Also known as

A Markov process is also known as a Markov chain.

However, the latter term is usually reserved for a stochastic process whose transition times form a countable sequence (that is, a discrete-time process) and whose state space is also countable.
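As an illustrative aside (this display is a standard formulation, not taken from this page), the defining Markov property of such a discrete-time chain on a countable state space $S$ is conventionally written:

$$\Pr \left( X_{n + 1} = j \mid X_n = i_n, X_{n - 1} = i_{n - 1}, \ldots, X_0 = i_0 \right) = \Pr \left( X_{n + 1} = j \mid X_n = i_n \right)$$

for all $n \in \mathbb N$ and all states $i_0, \ldots, i_n, j \in S$: the conditional distribution of the next state depends only on the present state, not on the earlier history.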


Some sources use the spelling Markoff process.
