Definition:Markov Chain/Transition Matrix

Definition

Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a finite set $S$ of cardinality $N$.

The transition probabilities for $\sequence {X_n}$ can be expressed in the form of a square matrix of order $N$, whose entry $p_{i j}$ is the probability of transition from state $i$ to state $j$, thus:

$\begin {pmatrix} p_{1 1} & p_{1 2} & p_{1 3} & \cdots & p_{1 N} \\ p_{2 1} & p_{2 2} & p_{2 3} & \cdots & p_{2 N} \\ p_{3 1} & p_{3 2} & p_{3 3} & \cdots & p_{3 N} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ p_{N 1} & p_{N 2} & p_{N 3} & \cdots & p_{N N} \end {pmatrix}$

such that:

$\forall i: \ds \sum_{j \mathop = 1}^N p_{i j} = 1$

This square matrix is known as the transition matrix for $\sequence {X_n}$.
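A concrete illustration (not part of the formal definition, with the state labels and numerical values chosen purely for the example): for a Markov chain on a state space of cardinality $N = 3$, a possible transition matrix is:

$\begin {pmatrix} 0.5 & 0.3 & 0.2 \\ 0.1 & 0.6 & 0.3 \\ 0 & 0.4 & 0.6 \end {pmatrix}$

Here, for instance, $p_{1 2} = 0.3$ is the probability that the chain moves from state $1$ to state $2$ in one step, and each row sums to $1$ as required.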

