Definition:Random Walk/One Dimension
Definition
Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain whose state space is the set of integers $\Z$.
Let $\sequence {X_n}$ be such that $X_{n + 1}$ is an element of the set $\set {X_n + 1, X_n, X_n - 1}$.
Then $\sequence {X_n}$ is a one-dimensional random walk.
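As a minimal sketch of this definition, the walk below moves up $1$ with probability `p_up`, down $1$ with probability `p_down`, and stays put otherwise; the function name and parameters are illustrative, not part of the definition.

```python
import random

def one_dim_random_walk(n_steps, p_up=0.4, p_down=0.4, x0=0, seed=None):
    """Simulate a one-dimensional random walk on the integers.

    At each step the state moves to X_n + 1, X_n - 1, or stays at X_n,
    matching the definition above.
    """
    rng = random.Random(seed)
    xs = [x0]
    for _ in range(n_steps):
        u = rng.random()
        if u < p_up:
            step = 1
        elif u < p_up + p_down:
            step = -1
        else:
            step = 0  # the walk may also remain where it is
        xs.append(xs[-1] + step)
    return xs

walk = one_dim_random_walk(1000, seed=42)
# every increment lies in {-1, 0, +1}, as the definition requires
assert all(b - a in (-1, 0, 1) for a, b in zip(walk, walk[1:]))
```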
Examples
Gambling Game
Consider the gambling game in which a player with initial capital $k$ wins $1$ unit with probability $p$ and loses $1$ unit with probability $1 - p$.
This is a one-dimensional random walk with absorbing state $0$: once the player's capital reaches $0$, play stops.
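The gambling game above can be sketched as follows; the function name, the `max_steps` cutoff (needed so a simulation terminates), and the parameters are illustrative assumptions, not part of the example.

```python
import random

def gamble(k, p, max_steps, seed=None):
    """Play the gambling game: start with capital k, win 1 unit with
    probability p, lose 1 unit with probability 1 - p.

    The state 0 is absorbing: once the capital hits 0, play stops.
    max_steps is a simulation cutoff, not part of the game itself.
    """
    rng = random.Random(seed)
    capital = k
    for _ in range(max_steps):
        if capital == 0:  # absorbing state: the player is ruined
            break
        capital += 1 if rng.random() < p else -1
    return capital

# With p = 0 the player loses every round and is ruined after k steps.
assert gamble(5, 0.0, 100) == 0
```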
Also known as
A one-dimensional random walk is also known as a simple random walk.
It can also be seen unhyphenated: one dimensional random walk.
Also see
- Results about one-dimensional random walks can be found here.
Sources
- 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.) ... (previous) ... (next): random walk
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.) ... (previous) ... (next): random walk
- 2014: Christopher Clapham and James Nicholson: The Concise Oxford Dictionary of Mathematics (5th ed.) ... (previous) ... (next): random walk