Category:Definitions/Random Walks
This category contains definitions related to Random Walks.
Related results can be found in Category:Random Walks.
One-Dimensional Random Walk
Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain whose state space is the set of integers $\Z$.
Let $\sequence {X_n}$ be such that, for each $n \ge 0$, $X_{n + 1}$ is an element of the set $\set {X_n - 1, X_n, X_n + 1}$.
Then $\sequence {X_n}$ is a one-dimensional random walk.
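The definition above can be sketched as a simulation. The following is a minimal illustration in Python, not part of the source definition: the function name, parameters and the choice of step probabilities (`p_up`, `p_down`, with the remainder assigned to staying put) are all assumptions made here for concreteness.

```python
import random

def random_walk(n_steps, p_up=0.5, p_down=0.5, start=0, seed=None):
    """Simulate one path of a one-dimensional random walk on the integers.

    At each step the chain moves to X_n + 1 with probability p_up,
    to X_n - 1 with probability p_down, and stays at X_n otherwise.
    (Probabilities and names are illustrative, not from the definition.)
    """
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        u = rng.random()
        if u < p_up:
            step = 1            # move up: X_n + 1
        elif u < p_up + p_down:
            step = -1           # move down: X_n - 1
        else:
            step = 0            # stay: X_n
        path.append(path[-1] + step)
    return path
```

Each successive state differs from the previous one by at most $1$, matching the requirement that $X_{n + 1} \in \set {X_n - 1, X_n, X_n + 1}$.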
Subcategories
This category has the following 2 subcategories, out of 2 total.
A
- Definitions/Absorbing States (4 P)
Pages in category "Definitions/Random Walks"
The following 4 pages are in this category, out of 4 total.