Definition:Convergence in Probability

Definition
Let $\sequence {X_n}_{n \ge 1}$ be a sequence of random variables.

Let $b$ be a real number.

We say that $\sequence {X_n}$ converges in probability to $b$ if:


 * $\displaystyle \lim_{n \to \infty} \map \Pr {\size {X_n - b} < \varepsilon} = 1$

for all real $\varepsilon > 0$.

This is written:


 * $X_n \xrightarrow p b$
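
The definition can be illustrated numerically. Below is a minimal sketch (not part of the source definition) that takes $X_n = b + Z / n$ with $Z$ standard normal, an arbitrary illustrative choice, and empirically estimates $\map \Pr {\size {X_n - b} < \varepsilon}$ for increasing $n$; the estimate approaches $1$, as the definition requires.

```python
import random

def estimate_prob(n, b=0.0, eps=0.1, trials=10000, seed=42):
    """Empirically estimate P(|X_n - b| < eps), where X_n = b + Z/n
    and Z is standard normal (an illustrative choice of sequence)."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(trials)
        if abs((b + rng.gauss(0.0, 1.0) / n) - b) < eps
    )
    return hits / trials

# For fixed eps, the estimated probability grows toward 1 as n increases.
for n in (1, 10, 100):
    print(n, estimate_prob(n))
```

Note that $\varepsilon$ is fixed before $n$ is sent to infinity: the limit must equal $1$ for every choice of $\varepsilon > 0$, however small.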