# Definition:Markov Chain/State Space

Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a countable set $S$.
Then $S$ is called the **state space** of $\sequence {X_n}_{n \mathop \ge 0}$.
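
The definition can be illustrated with a small simulation. The two-state chain, its transition probabilities, and all names below are illustrative assumptions, not part of the source definition:

```python
import random

# A minimal sketch: a Markov chain whose (finite, hence countable)
# state space is S = {"sunny", "rainy"}. The transition probabilities
# are made up for illustration.
S = ["sunny", "rainy"]
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Move one step according to the transition row for `state`."""
    r = rng.random()
    cum = 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(x0, n, seed=0):
    """Return the trajectory X_0, ..., X_n starting from x0."""
    rng = random.Random(seed)
    traj = [x0]
    for _ in range(n):
        traj.append(step(traj[-1], rng))
    return traj

traj = simulate("sunny", 10)
# Every state the chain visits lies in the state space S.
assert set(traj) <= set(S)
```

The point of the example is only that each $X_n$ takes values in $S$; the particular transition probabilities are irrelevant to the definition.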