Category:Definitions/Markov Chains
This category contains definitions related to Markov Chains.
Related results can be found in Category:Markov Chains.
Let $\sequence {X_n}_{n \mathop \ge 0}$ be a stochastic process over a countable set $S$.
$\sequence {X_n}_{n \mathop \ge 0}$ is a Markov chain if and only if it satisfies the Markov property:
- $\condprob {X_{n + 1} = i_{n + 1} } {X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n} = \condprob {X_{n + 1} = i_{n + 1} } {X_n = i_n}$
for all $n \ge 0$ and all $i_0, i_1, \ldots, i_{n + 1} \in S$.
That is, the conditional probability of $X_{n + 1}$ depends only upon $X_n$, and upon no earlier values of $\sequence {X_n}$.
That is, given its present state, the future behaviour of $\sequence {X_n}$ is unaffected by its history.
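As an illustrative sketch (not part of the original page), the following Python snippet simulates a homogeneous Markov chain on a small finite state space using a hypothetical row-stochastic transition matrix $P$, where $P[i][j]$ plays the role of $\condprob {X_{n + 1} = j} {X_n = i}$; at each step the next state is sampled from the current state's row only, which is exactly the Markov property above.

```python
import random

# Hypothetical transition matrix on S = {0, 1, 2}; each row sums to 1,
# and P[i][j] = Pr(X_{n+1} = j | X_n = i).
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]

def step(state, P):
    """Sample the next state using only the current state's row of P."""
    return random.choices(range(len(P)), weights=P[state])[0]

def simulate(x0, P, n):
    """Return a sample path (X_0, X_1, ..., X_n)."""
    chain = [x0]
    for _ in range(n):
        chain.append(step(chain[-1], P))
    return chain

print(simulate(0, P, 10))
```

Note that `step` receives only the current state, never the earlier ones, mirroring the fact that the conditional distribution of $X_{n + 1}$ depends on the history only through $X_n$.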
Subcategories
This category has the following 3 subcategories, out of 3 total.
Pages in category "Definitions/Markov Chains"
The following 19 pages are in this category, out of 19 total.
M
- Definition:Markoff Chain
- Definition:Markov Chain
- Definition:Markov Chain/Also defined as
- Definition:Markov Chain/Also known as
- Definition:Markov Chain/Homogeneous
- Definition:Markov Chain/State
- Definition:Markov Chain/State Space
- Definition:Markov Chain/Transition Matrix
- Definition:Markov Chain/Transition Probability
- Definition:Markov Chain/Transition Time