Definition:Markov Chain


Definition

Let $\sequence {X_n}_{n \mathop \ge 0}$ be a stochastic process over a countable set $S$.


$\sequence {X_n}_{n \mathop \ge 0}$ is a Markov chain if and only if it satisfies the Markov property:

$\condprob {X_{n + 1} = i_{n + 1} } {X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n} = \condprob {X_{n + 1} = i_{n + 1} } {X_n = i_n}$

for all $n \ge 0$ and all $i_0, i_1, \ldots, i_{n + 1} \in S$.


That is, the conditional probability of $X_{n + 1}$ depends only upon $X_n$, and upon no earlier values of $\sequence {X_n}$.

That is, given its present state, the future behaviour of $\sequence {X_n}$ is unaffected by its earlier history.
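The following is a minimal illustrative sketch in Python, not part of this definition, of a process satisfying the Markov property: the next state is sampled using only the current state, never the earlier history. The state space and all transition probabilities used here are made-up values chosen purely for illustration.

import random

# hypothetical state space S = {0, 1, 2}; p[r][s] is a made-up probability
# of moving from state r to state s in one step
p = {
    0: {0: 0.5, 1: 0.3, 2: 0.2},
    1: {0: 0.1, 1: 0.6, 2: 0.3},
    2: {0: 0.4, 1: 0.4, 2: 0.2},
}

def step(current):
    # sample X_{n+1} given only X_n = current: this is the Markov property
    states = list(p[current])
    weights = [p[current][s] for s in states]
    return random.choices(states, weights=weights)[0]

# simulate X_0, X_1, ..., X_10 starting from X_0 = 0
x = 0
path = [x]
for _ in range(10):
    x = step(x)          # only the current value of x is consulted
    path.append(x)
print(path)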


State

The state of the Markov chain $\sequence {X_n}_{n \mathop \ge 0}$ at a given index $k$ is the value of $X_k$.


State Space

The set $S$ is called the state space of the Markov chain.


Transition Time

The transition time of a given term $X_i$ is the instant of time corresponding to the index $i$.


Transition Probability

Let $p_{r s}$ denote the probability that if $X_n = r$ then $X_{n + 1} = s$.

Then $p_{r s}$ is called the transition probability from $r$ to $s$.

Hence $p_{r r}$ is the probability that, if $X_n = r$, the chain remains in state $r$.


Transition Matrix

Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a finite set $S$ of cardinality $n$.

The transition probabilities for $\sequence {X_n}$ can be expressed in the form of a square matrix of order $n$, with the states labelled $0, 1, \ldots, n - 1$, thus:

$\begin {pmatrix} p_{00} & p_{01} & p_{02} & \cdots & p_{0, n - 1} \\ p_{10} & p_{11} & p_{12} & \cdots & p_{1, n - 1} \\ p_{20} & p_{21} & p_{22} & \cdots & p_{2, n - 1} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ p_{n - 1, 0} & p_{n - 1, 1} & p_{n - 1, 2} & \cdots & p_{n - 1, n - 1} \end {pmatrix}$

such that:

$\forall i: \ds \sum_{j \mathop = 0}^{n - 1} p_{i j} = 1$

This square matrix is known as the transition matrix for $\sequence {X_n}$.
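As an illustrative sketch, assuming the numpy library (which is not part of this definition), the following builds a transition matrix for a chain on three states labelled $0, 1, 2$ from made-up probabilities and checks the row-sum condition above.

import numpy as np

# made-up transition probabilities; P[r, s] plays the role of p_{rs}
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# each row must sum to 1: from any state r, the chain moves to *some* state in S
assert np.allclose(P.sum(axis=1), 1.0)

print(P[0, 2])   # the transition probability p_{02} from state 0 to state 2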


Homogeneity

$\sequence {X_n}_{n \mathop \ge 0}$ is homogeneous if and only if $\condprob {X_{n + 1} = j} {X_n = i}$ does not depend on the value of $n$ for all $i, j \in S$.

That is: the transition probabilities are constant.
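A minimal sketch of the distinction, on a two-state space with made-up probabilities: a homogeneous chain uses the same transition probabilities at every time $n$, whereas a non-homogeneous chain may use probabilities that vary with $n$.

import random

def step_homogeneous(current):
    # the probability of moving to state 1 is the same for every n
    p_to_1 = {0: 0.3, 1: 0.7}[current]
    return 1 if random.random() < p_to_1 else 0

def step_inhomogeneous(current, n):
    # here the probability of moving to state 1 drifts with n, so the
    # transition probabilities are not constant: the chain is not homogeneous
    base = {0: 0.3, 1: 0.7}[current]
    p_to_1 = min(1.0, base + 0.01 * n)
    return 1 if random.random() < p_to_1 else 0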


Also defined as

The pure mathematical definition of a Markov chain is as a sequence with probabilistic properties.

However, a Markov chain is frequently defined in terms of a time series, where each of the indices directly corresponds to a time instant.

Hence one may discuss a Markov chain using the notation $\sequence {X_{t_n} }_{n \mathop \ge 0}$ where $t_0, t_1, \ldots, t_n, \ldots$ are time instants.

The usual situation is that the time instants are equally spaced, but that is not strictly necessary.

Hence, with this definition, the term transition time refers directly to one of the $t_n$ at which the Markov chain may change state.


Examples

Simplest Markov Chain

The simplest Markov chain $\sequence {X_B}$ is the case where the state space $S$ is a Boolean domain, say $S = \set {0, 1}$.

Its transition probabilities are hence $p_{00}$, $p_{01}$, $p_{10}$ and $p_{11}$.

Hence for example:

$p_{01}$ is the probability that $\sequence {X_B}$ changes from state $0$ to state $1$
$p_{11}$ is the probability that $\sequence {X_B}$ remains in state $1$ if it is already there.


Its transition matrix is of the form:

$\begin {pmatrix} p_{00} & p_{01} \\ p_{10} & p_{11} \end {pmatrix}$

where:

$p_{00} + p_{01} = 1$
$p_{10} + p_{11} = 1$
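A minimal Python sketch of this two-state chain, using the made-up values $p_{01} = 0.3$ and $p_{10} = 0.4$ (so that $p_{00} = 0.7$ and $p_{11} = 0.6$), which simulates the chain and reports the fraction of time it spends in state $1$.

import random

p01, p10 = 0.3, 0.4              # made-up transition probabilities
P = [[1 - p01, p01],             # row 0: p_00, p_01
     [p10, 1 - p10]]             # row 1: p_10, p_11

# each row sums to 1, as required above
assert abs(sum(P[0]) - 1) < 1e-12 and abs(sum(P[1]) - 1) < 1e-12

x, visits_to_1, steps = 0, 0, 100_000
for _ in range(steps):
    x = 1 if random.random() < P[x][1] else 0   # move according to the current row
    visits_to_1 += x
print(visits_to_1 / steps)       # fraction of time spent in state 1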


Also known as

A Markov chain is also known by some sources as a Markov process.

However, the latter is properly a stochastic process whose transitions may take place on a continuous timescale, and which may also have a continuous state space.


Some sources use the spelling Markoff chain.


Also see

  • Results about Markov chains can be found here.


Source of Name

This entry was named for Andrey Andreyevich Markov.


Historical Note

The concept of the Markov chain was introduced by Andrey Andreyevich Markov in $1906$.

