Definition:Markov Chain/State


Definition

Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a countable set $S$.


The state of the Markov chain $\sequence {X_n}_{n \mathop \ge 0}$ at a given time $k$ is the value of $X_k$.
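As an informal illustration (not part of the definition above), the following minimal Python sketch simulates a two-state Markov chain on $S = \set {0, 1}$ with hypothetical transition probabilities; the state at time $k$ is simply the realised value $X_k$ of the simulated sequence.

```python
import random

# Hypothetical transition matrix: P[i][j] = Pr(X_{n+1} = j | X_n = i)
P = {
    0: {0: 0.7, 1: 0.3},
    1: {0: 0.4, 1: 0.6},
}

def simulate(n_steps, start=0, seed=0):
    """Return the realised values X_0, X_1, ..., X_{n_steps}."""
    random.seed(seed)
    x = start
    path = [x]
    for _ in range(n_steps):
        # Choose the next value according to the row P[x] of the transition matrix
        x = random.choices(list(P[x]), weights=list(P[x].values()))[0]
        path.append(x)
    return path

path = simulate(10)
k = 4
# The state of the chain at time k is the value X_k
print(f"X_{k} = {path[k]}")
```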

