Definition:Random Walk
This page is about random walk in the context of probability theory. For other uses, see walk.
Definition
One-Dimensional Random Walk
Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain whose state space is the set of integers $\Z$.
Let $\sequence {X_n}$ be such that, for each $n \ge 0$, $X_{n + 1}$ is an element of the set $\set {X_n + 1, X_n, X_n - 1}$.
Then $\sequence {X_n}$ is a one-dimensional random walk.
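The following is a minimal simulation sketch of this definition in Python. The definition above only constrains each step to land in $\set {X_n + 1, X_n, X_n - 1}$; the particular probabilities assigned to the three moves (here equal by default), the function name and the starting state are illustrative assumptions, not part of the definition.

import random

def one_dimensional_random_walk(steps, p_up=1/3, p_stay=1/3, start=0):
    """Simulate a one-dimensional random walk on the integers.

    At each step the chain moves from X_n to one of
    X_n + 1, X_n, X_n - 1. The move probabilities are
    assumptions; the definition only restricts the allowed moves.
    """
    x = start
    path = [x]
    for _ in range(steps):
        u = random.random()
        if u < p_up:
            x += 1          # X_{n+1} = X_n + 1
        elif u < p_up + p_stay:
            pass            # X_{n+1} = X_n
        else:
            x -= 1          # X_{n+1} = X_n - 1
        path.append(x)
    return path

# Example: a 10-step walk starting at 0
print(one_dimensional_random_walk(10))

Each element of the returned list is an integer, and consecutive elements differ by at most 1, matching the transition constraint in the definition.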