Definition:Markov Chain/State Space

Definition
Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a countable set $S$.

The set $S$ is called the state space of the Markov chain.
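The definition can be illustrated with a short simulation sketch. Everything below is an illustrative assumption, not part of the source: the two-element state space `S`, the transition probabilities `P`, and the helper `simulate` are all made up for the example. The point is only that every realised value $X_n$ lies in the state space $S$.

```python
import random

# Illustrative sketch (assumed, not from the source): a Markov chain
# whose state space is the finite (hence countable) set S.
S = ['sunny', 'rainy']

# Made-up transition probabilities; each row sums to 1.
P = {
    'sunny': {'sunny': 0.8, 'rainy': 0.2},
    'rainy': {'sunny': 0.4, 'rainy': 0.6},
}

def simulate(n, start='sunny', seed=0):
    """Return the states X_0, X_1, ..., X_n of one sample path."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        states, weights = zip(*P[path[-1]].items())
        path.append(rng.choices(states, weights=weights)[0])
    return path

path = simulate(20)
# Every value the chain takes belongs to the state space S.
assert all(x in S for x in path)
```

A chain on a countably infinite state space, such as a random walk on $\Z$, works the same way: the state space is simply the set of all values the chain can take.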