Definition:Absorbing State
Definition
Let $\sequence {X_n}_{n \mathop \ge 0}$ be a Markov chain on a state space $S$.
Let $i \in S$ be an element of the state space $S$.
Then $i$ is an absorbing state of $\sequence {X_n}$ if and only if:
- $X_k = i \implies X_{k + 1} = i$
That is, $i$ is an element of $S$ such that once $\sequence {X_n}$ reaches $i$, it remains there.
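For a homogeneous Markov chain with transition matrix $\mathbf P = \begin{pmatrix} p_{i j} \end{pmatrix}$, this condition amounts to:
- $p_{i i} = 1$
As an illustrative example (not drawn from the sources below), consider the $3$-state chain with transition matrix:
- $\mathbf P = \begin{pmatrix} 1 & 0 & 0 \\ \dfrac 1 2 & 0 & \dfrac 1 2 \\ 0 & 0 & 1 \end{pmatrix}$
Here states $1$ and $3$ are absorbing, while state $2$ is not: once the chain enters state $1$ or state $3$, it stays there with probability $1$.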
Also known as
An absorbing state is also referred to as an absorbing barrier.
Also defined as
Some sources define an absorbing state only in the context of a random walk.
Some sources separate the definitions of absorbing state and absorbing barrier, using the former for a Markov chain and the latter for a random walk.
Also see
- Results about absorbing states can be found here.
Sources
- 1989: Ephraim J. Borowski and Jonathan M. Borwein: Dictionary of Mathematics: absorbing state
- 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.): random walk
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.): absorbing state
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.): Markov chain
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.): random walk
- 2014: Christopher Clapham and James Nicholson: The Concise Oxford Dictionary of Mathematics (5th ed.): absorbing state
- 2014: Christopher Clapham and James Nicholson: The Concise Oxford Dictionary of Mathematics (5th ed.): random walk
- 2021: Richard Earl and James Nicholson: The Concise Oxford Dictionary of Mathematics (6th ed.): absorbing state