Definition:Markov Chain/State Space
Definition
Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a countable set $S$.
The set $S$ is called the state space of the Markov chain.
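For example (an illustrative sketch, not taken from the cited source), a simple random walk on the integers is a Markov chain with state space $S = \Z$, while a two-state chain on $S = \set {0, 1}$ with transition matrix:
:$\mathbf P = \begin{pmatrix} 1 - \alpha & \alpha \\ \beta & 1 - \beta \end{pmatrix}$
for some $\alpha, \beta \in \closedint 0 1$ has the finite state space $\set {0, 1}$.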
Sources
- 2014: Geoffrey Grimmett and Dominic Welsh: Probability: An Introduction (2nd ed.): $\S 12$: Markov chains