Definition:Markov Chain/State Space

Definition
Let $\left\langle{X_n}\right\rangle_{n \mathop \ge 0}$ be a Markov chain on a countable set $S$.

The set $S$ is called the state space of the Markov chain.
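For illustration, here is a minimal sketch (not from the source; the state space, transition probabilities, and all names are assumptions chosen for the example) of a Markov chain whose state space is the countable set $S = \{0, 1\}$. It simulates a path $\left\langle{X_n}\right\rangle_{n \mathop \ge 0}$ and checks that every visited state lies in $S$:

```python
import random

# Hypothetical transition probabilities for a two-state chain on S = {0, 1}.
# Each entry maps a state to its list of (successor, probability) pairs.
P = {
    0: [(0, 0.9), (1, 0.1)],  # from state 0: stay w.p. 0.9, move w.p. 0.1
    1: [(0, 0.5), (1, 0.5)],  # from state 1: each successor w.p. 0.5
}

def step(state, rng):
    """Sample the next state X_{n+1} given the current state X_n."""
    u = rng.random()
    cumulative = 0.0
    for successor, p in P[state]:
        cumulative += p
        if u < cumulative:
            return successor
    return P[state][-1][0]  # guard against floating-point rounding

def sample_path(x0, n, seed=0):
    """Return the path <X_0, ..., X_n>, a list of n + 1 states."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

S = {0, 1}
path = sample_path(0, 20)
assert set(path) <= S  # by construction, the chain never leaves its state space
```

The state space need not be finite: replacing $S$ with, say, $\Z$ (a random walk) still satisfies the definition, since only countability of $S$ is required.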