# Category:Definitions/Markov Chains

This category contains definitions related to Markov Chains.
Related results can be found in Category:Markov Chains.

Let $\sequence {X_n}_{n \mathop \ge 0}$ be a stochastic process over a countable set $S$.

Let $\map \Pr X$ denote the probability of the random variable $X$.

Let $\sequence {X_n}_{n \mathop \ge 0}$ satisfy the Markov property:

$\condprob {X_{n + 1} = i_{n + 1} } {X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n} = \condprob {X_{n + 1} = i_{n + 1} } {X_n = i_n}$

for all $n \ge 0$ and all $i_0, i_1, \ldots, i_{n + 1} \in S$.

That is, such that the conditional probability of $X_{n + 1}$ is dependent only upon $X_n$ and upon no earlier values of $\sequence {X_n}$.

That is, the state of $\sequence {X_n}$ in the future is unaffected by its history.

Then $\sequence {X_n}_{n \mathop \ge 0}$ is a Markov chain.
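The defining property can be illustrated with a small simulation. The two-state chain below, its state set and its transition probabilities, are illustrative assumptions and not part of the definition; the point is only that the sampler for $X_{n + 1}$ looks at $X_n$ alone and at no earlier state:

```python
import random

# Hypothetical example chain over the countable set S = {"sunny", "rainy"}.
# P[i][j] plays the role of Pr(X_{n+1} = j | X_n = i).
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Sample X_{n+1} given X_n = state.

    Note the Markov property: the next state is a function of
    `state` and fresh randomness only, never of earlier history.
    """
    u = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if u < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding in the cumulative sum

def simulate(x0, n, seed=0):
    """Generate the trajectory X_0, X_1, ..., X_n starting from X_0 = x0."""
    rng = random.Random(seed)
    chain = [x0]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain
```

Because `step` receives only the current state, the simulated process satisfies the conditional-probability identity above by construction.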

## Subcategories

This category has only the following subcategory.

## Pages in category "Definitions/Markov Chains"

The following 8 pages are in this category, out of 8 total.