Definition:Markov Chain/Homogeneous

Definition
Let $\left\langle{X_n}\right\rangle_{n \mathop \ge 0}$ be a Markov chain on a countable set $S$.

$\left\langle{X_n}\right\rangle_{n \mathop \ge 0}$ is homogeneous if and only if $\Pr \left({X_{n+1} = j \mid X_n = i}\right)$ does not depend on the value of $n$ for all $i, j \in S$.
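As an illustrative sketch (not part of the definition above), a homogeneous Markov chain can be simulated by applying one fixed transition matrix at every time step; the matrix $P$, the state space $\{0, 1\}$, and the helper names below are all assumptions chosen for the example. Because the same $P$ governs every step, $\Pr \left({X_{n+1} = j \mid X_n = i}\right)$ is independent of $n$:

```python
import random

# Hypothetical two-state homogeneous Markov chain on S = {0, 1}.
# Homogeneity: the same matrix P is used at every time step n, so
# Pr(X_{n+1} = j | X_n = i) = P[i][j] regardless of n.
P = [
    [0.9, 0.1],  # transition probabilities from state 0
    [0.4, 0.6],  # transition probabilities from state 1
]

def step(i, rng):
    """One transition from state i, using the same matrix P at every n."""
    return 0 if rng.random() < P[i][0] else 1

def estimate_transition(n_samples, i, rng):
    """Empirical estimate of Pr(X_{n+1} = 0 | X_n = i)."""
    hits = sum(step(i, rng) == 0 for _ in range(n_samples))
    return hits / n_samples

rng = random.Random(0)
est = estimate_transition(100_000, 0, rng)
print(est)  # should be close to P[0][0] = 0.9
```

A chain whose transition probabilities varied with $n$ (say, a `step(i, n, rng)` that consulted a different matrix for each $n$) would instead be inhomogeneous.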