Chapman-Kolmogorov Equation


Theorem

Let $X$ be a discrete state-space Markov chain with $n$-step transition probability matrix:

$\mathbf P^{\left({n} \right)} = \left[{p^{\left({n}\right)} \left({j, k}\right)}\right]_{j, k \mathop \in S}$

where:

$p^{\left({n}\right)} \left({j, k}\right) = \mathbb P \left[{X_{m + n} = k \mid X_m = j}\right] = {p_{j k} }^{\left({n}\right)}$ is the $n$-step transition probability.


Then:

$\mathbf P^{\left({n + m}\right)} = \mathbf P^{\left({n}\right)} \mathbf P^{\left({m}\right)}$

or equivalently:

$\displaystyle {p_{i j} }^{\left({n + m}\right)} = \sum_{k \mathop \in S} {p_{i k} }^{\left({n}\right)} {p_{k j} }^{\left({m}\right)}$
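
As a numerical sanity check, the identity can be illustrated on a small finite chain. The following sketch assumes NumPy and a hypothetical $3$-state one-step transition matrix `P`; it uses the standard fact (itself obtained by iterating this theorem) that for a time-homogeneous chain $\mathbf P^{\left({n}\right)}$ is the $n$th power of the one-step matrix, and checks both the matrix and element-wise forms:

```python
import numpy as np

# A hypothetical 3-state transition matrix (each row sums to 1);
# any row-stochastic matrix would do for this sanity check.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

n, m = 4, 3

# For a time-homogeneous chain, the n-step matrix P^(n) is the
# n-th matrix power of the one-step matrix P.
P_n = np.linalg.matrix_power(P, n)
P_m = np.linalg.matrix_power(P, m)
P_nm = np.linalg.matrix_power(P, n + m)

# Matrix form: P^(n+m) = P^(n) P^(m).
assert np.allclose(P_nm, P_n @ P_m)

# Element-wise form: p_ij^(n+m) = sum_k p_ik^(n) p_kj^(m).
i, j = 0, 2
lhs = P_nm[i, j]
rhs = sum(P_n[i, k] * P_m[k, j] for k in range(P.shape[0]))
assert np.isclose(lhs, rhs)
print(lhs, rhs)
```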


Proof

We consider the conditional probability on the left-hand side, noting that ${p_{i j} }^{\left({n + m}\right)} = \mathbb P \left[{X_{n + m} = j \mid X_0 = i}\right]$:

$\displaystyle \mathbb P \left[{X_{n + m} = j \mid X_0 = i}\right]$
$\displaystyle = \mathbb P \left[{\left({\bigcup_{k \mathop \in S} \left[{X_{n + m} = j, X_n = k}\right]}\right)} \ \middle \vert \ {X_0 = i}\right]$ as the events $\left[{X_n = k}\right]$, $k \in S$, partition the sample space
$\displaystyle = \sum_{k \mathop \in S} \mathbb P \left[{X_{n + m} = j, X_n = k \mid X_0 = i}\right]$ by countable additivity
$\displaystyle = \sum_{k \mathop \in S} \mathbb P \left[{X_{n + m} = j \mid X_n = k, X_0 = i}\right] \times \mathbb P \left[{X_n = k \mid X_0 = i}\right]$ by the definition of conditional probability
$\displaystyle = \sum_{k \mathop \in S} \mathbb P \left[{X_{n + m} = j \mid X_n = k}\right] \times \mathbb P \left[{X_n = k \mid X_0 = i}\right]$ by the Markov property
$\displaystyle = \sum_{k \mathop \in S} {p_{i k} }^{\left({n}\right)} {p_{k j} }^{\left({m}\right)}$ by definition of the stationary transition probabilities

This establishes the element-wise form. Since the sum on the right-hand side is exactly the $\left({i, j}\right)$ entry of $\mathbf P^{\left({n}\right)} \mathbf P^{\left({m}\right)}$, the matrix form $\mathbf P^{\left({n + m}\right)} = \mathbf P^{\left({n}\right)} \mathbf P^{\left({m}\right)}$ follows from the definition of matrix multiplication.

$\blacksquare$
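
The key step of the proof, conditioning on the intermediate state $X_n = k$, can also be illustrated by simulation. The following rough Monte Carlo sketch (again assuming NumPy, with the same hypothetical $3$-state chain as above) estimates $\mathbb P \left[{X_{n + m} = j \mid X_0 = i}\right]$ from sample paths and compares it with $\sum_{k \mathop \in S} {p_{i k} }^{\left({n}\right)} {p_{k j} }^{\left({m}\right)}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# The same hypothetical 3-state chain as above.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

n, m = 4, 3
i, j = 0, 2
trials = 20_000  # rough estimate; accuracy is limited by Monte Carlo error

# Simulate X_0 = i, ..., X_{n+m} and count how often X_{n+m} = j,
# estimating P[X_{n+m} = j | X_0 = i] directly from sample paths.
hits = 0
for _ in range(trials):
    state = i
    for _ in range(n + m):
        state = rng.choice(P.shape[0], p=P[state])
    if state == j:
        hits += 1
empirical = hits / trials

# The sum over intermediate states k used in the proof.
P_n = np.linalg.matrix_power(P, n)
P_m = np.linalg.matrix_power(P, m)
predicted = sum(P_n[i, k] * P_m[k, j] for k in range(P.shape[0]))

print(empirical, predicted)  # should agree up to Monte Carlo error
```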


Source of Name

This entry was named for Sydney Chapman and Andrey Nikolaevich Kolmogorov.