Category:Markov Chains
This category contains results about Markov Chains.
Definitions specific to this category can be found in Definitions/Markov Chains.
Let $\sequence {X_n}_{n \mathop \ge 0}$ be a stochastic process over a countable set $S$.
Let $\map \Pr E$ denote the probability of an event $E$, and let $\condprob A B$ denote the conditional probability of $A$ given $B$.
Let $\sequence {X_n}_{n \mathop \ge 0}$ satisfy the Markov property:
- $\condprob {X_{n + 1} = i_{n + 1} } {X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n} = \condprob {X_{n + 1} = i_{n + 1} } {X_n = i_n}$
for all $n \ge 0$ and all $i_0, i_1, \ldots, i_{n + 1} \in S$.
That is, such that the conditional probability of $X_{n + 1}$ depends only upon $X_n$, and upon no earlier values of $\sequence {X_n}$.
That is, given its present state, the future behaviour of $\sequence {X_n}$ is conditionally independent of its past history.
Then $\sequence {X_n}_{n \mathop \ge 0}$ is a Markov chain.
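As an illustration only (not part of the formal definition), the following Python sketch simulates a Markov chain on a finite state space $S = \set {0, 1, 2}$: each new state is drawn using only the current state, exactly as the Markov property requires. The transition matrix `P` and the function names are hypothetical choices for this sketch.

```python
import random

# Hypothetical transition matrix: P[i][j] = Pr(X_{n+1} = j | X_n = i).
# Each row must sum to 1.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]

def step(state: int) -> int:
    """Draw X_{n+1} given X_n = state.

    Only the current state is consulted: earlier values of the
    sequence are irrelevant, which is the Markov property.
    """
    return random.choices(range(len(P)), weights=P[state])[0]

def simulate(x0: int, n: int) -> list[int]:
    """Return a sample path (X_0, X_1, ..., X_n) starting from X_0 = x0."""
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

if __name__ == "__main__":
    print(simulate(x0=0, n=10))
```

Note that `step` receives only the current state, not the whole path: this is where the sketch encodes the Markov property.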