Definition:Markov Chain

Definition
Let $\sequence {X_n}_{n \mathop \ge 0}$ be a sequence of random variables taking values in a countable set $S$.

Let $\map \Pr A$ denote the probability of the event $A$.

Let $\sequence {X_n}_{n \mathop \ge 0}$ satisfy the Markov property:

* $\map \Pr {X_{n + 1} = i_{n + 1} \mid X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n} = \map \Pr {X_{n + 1} = i_{n + 1} \mid X_n = i_n}$

for all $n \ge 0$ and all $i_0, i_1, \ldots, i_{n + 1} \in S$.

Then $\sequence {X_n}_{n \mathop \ge 0}$ is a Markov chain.
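
The following is a minimal sketch, in Python, of how such a chain can be simulated. The two-element state space $S = \set {0, 1}$ and the transition probabilities in the matrix `P` are hypothetical, chosen only to illustrate that each step depends on the current state alone.

```python
import random

# A minimal sketch: a Markov chain on the state space S = {0, 1}.
# P[i][j] is a hypothetical transition probability
# Pr(X_{n+1} = j | X_n = i); each row sums to 1.
P = [
    [0.9, 0.1],  # transitions out of state 0
    [0.4, 0.6],  # transitions out of state 1
]

def step(i):
    """Sample X_{n+1} given X_n = i.

    The next state depends only on the current state i, not on the
    earlier history X_0, ..., X_{n-1}: this is the Markov property.
    """
    return 0 if random.random() < P[i][0] else 1

def simulate(x0, n):
    """Generate the sequence X_0, X_1, ..., X_n starting from X_0 = x0."""
    chain = [x0]
    for _ in range(n):
        chain.append(step(chain[-1]))
    return chain

print(simulate(0, 10))  # one random realization of X_0, ..., X_10
```

Note that `simulate` only ever consults the current state when sampling the next one, which is exactly the content of the Markov property above.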

Also known as
A Markov chain is also known as a Markov process, although that term is often reserved for the more general setting in which the time parameter and the state space are not required to be discrete.