Definition:Markov Chain/State
Definition
Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a countable set $S$.
The state of the Markov chain $\sequence {X_n}_{n \mathop \ge 0}$ at a given time $k$ is the value of $X_k$, which is an element of $S$.
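For example, suppose $S = \set {0, 1}$ and a particular realization of the chain satisfies $X_0 = 0$, $X_1 = 1$ and $X_2 = 1$. Then the state of the chain at time $2$ is $1$.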
Sources
- 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.): Markov chain
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.): Markov chain