Definition:Markov Process
Definition
Let $X$ be a continuous random variable which varies over time.
Let $X_t$ denote the value of $X$ at time $t$.
Then $X$ is a Markov process if and only if the future behaviour of $X$ depends only upon the value of $X$ at the present time, and not upon its earlier history.
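This condition can be stated in terms of conditional probabilities. The following is a sketch of the usual formalization, with notation not taken from the source: for all times $t_1 < t_2 < \cdots < t_n < t_{n+1}$ and all values $x, x_1, \ldots, x_n$:

$\Pr \left({X_{t_{n+1}} \le x \mid X_{t_1} = x_1, X_{t_2} = x_2, \ldots, X_{t_n} = x_n}\right) = \Pr \left({X_{t_{n+1}} \le x \mid X_{t_n} = x_n}\right)$

That is, conditioning on the whole observed history gives the same distribution for the future value as conditioning on the present value alone.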
Also known as
A Markov process is also known as a Markov chain.
However, the latter term is usually reserved for a stochastic process whose transition times form a countable sequence and whose state space is also countable.
Some sources use the spelling Markoff process.
Also see
- Results about Markov processes can be found here.
Source of Name
This entry was named for Andrey Andreyevich Markov.
Sources
- 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.): Markov chain
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.): Markov chain