Definition:Markov Process


Definition

Let $X$ be a random variable which varies over continuous time.

Let $X_t$ denote the value of $X$ at time $t$.


Then $X$ is a Markov process if and only if the future behaviour of $X$ depends only upon the value of $X$ at the present time, and not upon how $X$ arrived at that value.
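One standard way to express this condition symbolically (offered here as an illustrative sketch rather than as part of the definition above, and assuming a state space $S$ with the appropriate measurability conditions) is the Markov property:

$\Pr \left({X_{t + s} \in A \mid X_r, \ r \le t}\right) = \Pr \left({X_{t + s} \in A \mid X_t}\right)$

for all times $t$, all $s > 0$ and all measurable subsets $A \subseteq S$.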




Also known as

A Markov process is also known as a Markov chain.

However, the latter term is usually reserved for a stochastic process whose transition times consist of a countable sequence and whose state space is also countable.
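For such a Markov chain, with countable state space $S$ and transition times indexed by $n \in \N$, the condition above is commonly written (again as an illustrative sketch) as:

$\Pr \left({X_{n + 1} = j \mid X_n = i, X_{n - 1} = i_{n - 1}, \ldots, X_0 = i_0}\right) = \Pr \left({X_{n + 1} = j \mid X_n = i}\right)$

for all $n \in \N$ and all states $i, j, i_0, \ldots, i_{n - 1} \in S$.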


Some sources use the spelling Markoff process.


Also see

  • Results about Markov processes can be found here.


Source of Name

This entry was named for Andrey Andreyevich Markov.

