Definition:Convergence in Probability

Definition
Let $\sequence {X_n}_{n \mathop \ge 1}$ be a sequence of random variables.

Let $X$ be a random variable.

We say that $\sequence {X_n}$ '''converges in probability''' to $X$ if and only if:


 * $\ds \forall \epsilon \in \R_{>0}: \lim_{n \mathop \to \infty} \map \Pr {\size {X_n - X} < \epsilon} = 1$

This is written:


 * $X_n \xrightarrow p X$
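
The definition above can be illustrated numerically. The following is a minimal Monte Carlo sketch, not part of the definition: it takes the sample mean of $n$ independent $\operatorname{Uniform} \left({0, 1}\right)$ draws as the sequence $X_n$ and the constant $X = 0.5$ as the limit, then estimates $\map \Pr {\size {X_n - X} < \epsilon}$ for increasing $n$. The function name `prob_within` and the choice of distribution are illustrative assumptions.

```python
import random

def prob_within(n, eps, trials=2000, seed=0):
    """Monte Carlo estimate of P(|X_n - 0.5| < eps), where X_n is the
    sample mean of n independent Uniform(0, 1) draws (illustrative choice)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() for _ in range(n)) / n
        if abs(mean - 0.5) < eps:
            hits += 1
    return hits / trials

# As n grows, the estimated probability approaches 1 for any fixed eps > 0,
# which is exactly the limit in the definition (here a consequence of the
# weak law of large numbers).
for n in (1, 10, 100, 1000):
    print(n, prob_within(n, eps=0.05))
```

Convergence in probability does not require $X_n$ to be close to $X$ for every outcome, only that the probability of a deviation larger than any fixed $\epsilon$ vanishes as $n \to \infty$.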