Definition:Markov Chain

Definition
Let $\left\langle{X_n}\right\rangle_{n \mathop \ge 0}$ be a sequence of random variables taking values in a countable set $S$.

Let $\Pr \left({A}\right)$ denote the probability of an event $A$.

Let $\left\langle{X_n}\right\rangle_{n \mathop \ge 0}$ satisfy the Markov property:

 * $\Pr \left({X_{n+1} = i_{n+1} \mid X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n}\right) = \Pr \left({X_{n+1} = i_{n+1} \mid X_n = i_n}\right)$

for all $n \ge 0$ and all $i_0, i_1, \ldots, i_{n+1} \in S$ such that $\Pr \left({X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n}\right) > 0$.

Then $\left\langle{X_n}\right\rangle_{n \mathop \ge 0}$ is a Markov chain on the state space $S$.
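
To make the definition concrete, here is a minimal Python sketch (illustrative only, not part of the formal definition) simulating a Markov chain on the hypothetical finite state space $S = \left\{{0, 1, 2}\right\}$ with an assumed transition matrix $P$, where $P_{i j} = \Pr \left({X_{n+1} = j \mid X_n = i}\right)$. The point of the sketch is that the next state is drawn from a distribution depending only on the current state, never on earlier states, which is exactly the Markov property.

```python
import random

# Hypothetical transition matrix on S = {0, 1, 2}:
# P[i][j] = Pr(X_{n+1} = j | X_n = i); each row sums to 1.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]

def step(i):
    # Draw X_{n+1} given X_n = i.  The distribution depends only on i,
    # not on any earlier states: this is the Markov property.
    return random.choices(range(len(P)), weights=P[i], k=1)[0]

def simulate(x0, n):
    # Generate a sample path X_0, X_1, ..., X_n starting from X_0 = x0.
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate(0, 10))  # e.g. [0, 0, 1, 1, 2, 0, 2, 0, 1, 1, 1]
```

A time-homogeneous chain like this one, where the transition probabilities do not depend on $n$, is the most common special case; the definition above also admits chains whose transition probabilities vary with $n$.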