Definition:Markov Chain/Transition Probability


Definition

Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a countable set $S$ satisfying the Markov property:

$\condprob {X_{n + 1} = i_{n + 1} } {X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n} = \condprob {X_{n + 1} = i_{n + 1} } {X_n = i_n}$

for all $n \ge 0$ and all $i_0, i_1, \ldots, i_{n + 1} \in S$.
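
For example (an illustrative reading, not part of the formal definition), with $S = \set {0, 1, 2}$ the Markov property gives:

$\condprob {X_3 = 2} {X_0 = 0, X_1 = 1, X_2 = 0} = \condprob {X_3 = 2} {X_2 = 0}$

that is, once the current state $X_2$ is known, the earlier values $X_0$ and $X_1$ have no further influence on the distribution of $X_3$.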


Let $p_{r s}$ denote the probability that, given $X_n = r$, we have $X_{n + 1} = s$, that is:

$p_{r s} = \condprob {X_{n + 1} = s} {X_n = r}$

where this probability is assumed not to depend on $n$.

Then $p_{r s}$ is called the transition probability from $r$ to $s$.

Hence $p_{r r}$ is the probability that, given $X_n = r$, the chain remains in state $r$ at time $n + 1$.
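
As an illustration (the particular values $\alpha$ and $\beta$ below are arbitrary, not part of the definition), a Markov chain on the two-element set $S = \set {0, 1}$ is specified by the transition probabilities:

$p_{0 1} = \alpha, \quad p_{0 0} = 1 - \alpha, \quad p_{1 0} = \beta, \quad p_{1 1} = 1 - \beta$

for some $\alpha, \beta$ with $0 \le \alpha, \beta \le 1$; note that for each $r \in S$ the probabilities $p_{r s}$ over all $s \in S$ sum to $1$.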

