Definition:Markov Process/Also known as
A Markov process is also known as a Markov chain.
However, the latter term is usually reserved for a stochastic process whose transition times form a countable sequence and whose state space is also countable.
Some sources use the spelling Markoff process.
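The distinction above can be illustrated with a short sketch: a Markov chain in the narrower sense evolves at the countable transition times $0, 1, 2, \ldots$ over a countable state space. The two-element state space and transition probabilities below are illustrative assumptions, not taken from any source.

```python
import random

# Illustrative (assumed) transition probabilities on the countable
# (here finite) state space {"A", "B"}.
P = {
    "A": [("A", 0.7), ("B", 0.3)],
    "B": [("A", 0.4), ("B", 0.6)],
}

def step(state):
    """Sample the next state from the transition row of `state`."""
    u = random.random()
    acc = 0.0
    for nxt, p in P[state]:
        acc += p
        if u < acc:
            return nxt
    return P[state][-1][0]  # guard against floating-point rounding

def run(start, n_steps, seed=0):
    """Trajectory of the chain over n_steps countable transition times."""
    random.seed(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path
```

A general Markov process, by contrast, may have a continuous time parameter and an uncountable state space, so it cannot be simulated step-by-step in this way.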