Spacing Limit Theorem

Theorem
Let $X_{\left({i}\right)}$ be the $i$th order statistic of $N$ independent samples from a continuous distribution with probability density function $f_X \left({x}\right)$.

Then, conditional on $X_{\left({i}\right)}$, the scaled spacing between consecutive order statistics converges in distribution to an exponential distribution:


 * $N \left({X_{\left({i + 1}\right)} - X_{\left({i}\right)} }\right) \xrightarrow D \exp \left({ \dfrac 1 { f_X \left({ X_{\left({i}\right)} }\right)} }\right)$

as $N \to \infty$ for $i = 1, 2, 3, \dotsc, N - 1$, where $\exp \left({\theta}\right)$ denotes the exponential distribution with mean $\theta$, that is, rate $f_X \left({ X_{\left({i}\right)} }\right)$.
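The statement can be checked numerically. Below is a minimal Monte Carlo sketch, not part of the proof, assuming samples from $U \left({0, 1}\right)$, where $f_X = 1$, so the scaled spacing should have mean close to $1$:

```python
import random

def scaled_spacings(n: int, i: int, trials: int) -> list[float]:
    """Draw realizations of N*(X_(i+1) - X_(i)) for the i-th
    (1-based) order statistic of n samples from U(0, 1)."""
    out = []
    for _ in range(trials):
        xs = sorted(random.random() for _ in range(n))
        out.append(n * (xs[i] - xs[i - 1]))  # xs[i-1] is X_(i)
    return out

random.seed(0)
samples = scaled_spacings(n=500, i=250, trials=2000)
mean = sum(samples) / len(samples)
# For U(0, 1), f_X(x) = 1, so the limiting mean should be close to 1.
print(round(mean, 2))
```

The function name and the parameter choices ($N = 500$, $i = 250$) are illustrative; any interior $i$ behaves similarly.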

Proof
Given $i$ and $N$, the ordered statistic $X_{\left({i}\right)}$ has the probability density function:
 * $f_{X_{\left({i}\right)} } \left({x \mid i, N}\right) = \dfrac {N!} {\left({i - 1}\right)! \left({N - i}\right)!} F_X \left({x}\right)^{i-1} \left({1 - F_X \left({x}\right)}\right)^{N - i} f_X \left({x}\right)$

where $F_X \left({x}\right)$ is the cumulative distribution function of $X$.
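As a sanity check of this density, the following sketch (assuming the uniform case, where $F_X \left({x}\right) = x$ and $f_X \left({x}\right) = 1$ on $\left({0, 1}\right)$) verifies numerically that it integrates to $1$:

```python
from math import factorial

def order_stat_pdf_uniform(x: float, i: int, n: int) -> float:
    """Density of the i-th order statistic of n U(0,1) samples:
    N!/((i-1)!(N-i)!) * F(x)^(i-1) * (1-F(x))^(N-i) * f(x),
    with F(x) = x and f(x) = 1."""
    c = factorial(n) / (factorial(i - 1) * factorial(n - i))
    return c * x ** (i - 1) * (1 - x) ** (n - i)

# Midpoint Riemann sum over (0, 1): a valid density integrates to 1.
n, i, steps = 10, 4, 100_000
total = sum(order_stat_pdf_uniform((k + 0.5) / steps, i, n)
            for k in range(steps)) / steps
print(round(total, 4))
```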

Let $Y_i = N \left({X_{\left({i + 1}\right)} - X_{\left({i}\right)} }\right)$ denote the scaled spacing for $i = 1, 2, 3, \dotsc, N - 1$; since the order statistics are increasing, $Y_i$ is always positive.

Changing variables from $\left({X_{\left({i}\right)}, X_{\left({i + 1}\right)} }\right)$ to $\left({X_{\left({i}\right)}, Y_i}\right)$, which contributes a Jacobian factor of $\dfrac 1 N$, the joint density function of $X_{\left({i}\right)}$ and $Y_i$ is:
 * $f_{X_{\left({i}\right)}, Y_i} \left({x, y \mid i, N}\right) = \dfrac {\left({N - 1}\right)!} {\left({i - 1}\right)! \left({N - i - 1}\right)!} F_X \left({x}\right)^{i - 1} \left({1 - F_X \left({x + \dfrac y N}\right)}\right)^{N - i - 1} f_X \left({x}\right) f_X \left({x + \dfrac y N}\right)$

The conditional density function of $Y_i$ given $X_{\left({i}\right)}$ is:
 * $f_{Y_i} = \dfrac {f_{X_{\left({i}\right)}, Y_i} } {f_{X_{\left({i}\right)} } }$

which turns into:
 * $f_{Y_i} \left({y \mid x = X_{\left({i}\right)}, i, N}\right) = \dfrac {N-i} N \dfrac {\left({1 - F_X \left({x + \dfrac y N}\right)}\right)^{N - i - 1} } {\left({1 - F_X \left({x}\right)}\right)^{N - i} } f_X \left({x + \dfrac y N}\right)$
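The prefactor $\dfrac {N - i} N$ arises from dividing the two factorial coefficients. This can be confirmed exactly with a short sketch using Python's `fractions`:

```python
from fractions import Fraction
from math import factorial

def joint_coeff(n: int, i: int) -> Fraction:
    # Prefactor of the joint density of (X_(i), Y_i): (N-1)!/((i-1)!(N-i-1)!)
    return Fraction(factorial(n - 1), factorial(i - 1) * factorial(n - i - 1))

def marginal_coeff(n: int, i: int) -> Fraction:
    # Prefactor of the marginal density of X_(i): N!/((i-1)!(N-i)!)
    return Fraction(factorial(n), factorial(i - 1) * factorial(n - i))

# Dividing the coefficients should give exactly (N - i)/N.
ok = all(joint_coeff(n, i) / marginal_coeff(n, i) == Fraction(n - i, n)
         for n in range(3, 25) for i in range(1, n - 1))
print(ok)
```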

The conditional cumulative function of $Y_i$ given $X_{\left({i}\right)}$ is:
 * $F_{Y_i} \left({y \mid x = X_{\left({i}\right)}, i, N}\right) = 1 - \left({\dfrac {1 - F_X \left({x + \dfrac y N}\right)} {1 - F_X \left({x}\right)} }\right)^{N - i}$

A first-order Taylor expansion of $F_X \left({x + \dfrac y N}\right)$ about $x$ gives:
 * $F_X \left({x + \dfrac y N}\right) = F_X \left({x}\right) + f_X \left({x}\right) \dfrac y N + O \left({N^{-2} }\right)$

Inserting this produces:
 * $F_{Y_i} \left({y \mid x = X_{\left({i}\right)}, i, N}\right) = 1 - \left({1 - \dfrac {f_X \left({x}\right) y} {N \left({1 - F_X \left({x}\right)}\right)} + O \left({N^{-2} }\right)}\right)^{N - i}$

As $N$ grows large, applying the standard limit $\left({1 + \dfrac a N}\right)^N \to e^a$ yields:
 * $F_{Y_i} \left({y \mid x = X_{\left({i}\right)}, i, N}\right) = 1 -e^{-f_X \left({x}\right) y \dfrac {1 - \dfrac i N} {1 - F_X \left({x}\right)} } + O \left({N^{-1} }\right)$
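The limit step relies on $\left({1 - \dfrac a N}\right)^{N - i} \to e^{-a \left({1 - i / N}\right)}$. A quick numerical sketch of the convergence, with illustrative values $a = 0.5$ and $i = N / 2$:

```python
from math import exp

# Check that (1 - a/N)^(N - i) approaches exp(-a * (1 - i/N)) as N grows.
a = 0.5
errors = []
for n in (100, 10_000, 1_000_000):
    i = n // 2
    lhs = (1 - a / n) ** (n - i)
    rhs = exp(-a * (1 - i / n))
    errors.append(abs(lhs - rhs))
    print(n, abs(lhs - rhs))

# The approximation error shrinks roughly like O(1/N),
# consistent with the O(N^{-1}) remainder above.
```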

By the Probability Integral Transform, $F_X \left({X}\right)$ is uniformly distributed:
 * $F_X \left({X}\right) \sim U \left({0, 1}\right)$

Hence $F_X \left({X_{\left({i}\right)} }\right)$ is the $i$th order statistic of a uniform sample, which concentrates around $\dfrac i N$ for large $N$:
 * $F_X \left({X_{\left({i}\right)} }\right) \approx \dfrac i N$

Substituting $\dfrac i N \approx F_X \left({x}\right)$ cancels the factor $\dfrac {1 - \dfrac i N} {1 - F_X \left({x}\right)}$, so the limit of $F_{Y_i}$ is:
 * $\displaystyle \lim_{N \to \infty} F_{Y_i} \left({y \mid x = X_{\left({i}\right)}, i, N}\right) = 1 - e^{-f_X \left({x}\right) y}$

This is the cumulative distribution function of an exponential distribution with mean $\dfrac 1 {f_X \left({x}\right)}$, as claimed.
$\blacksquare$
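A concluding Monte Carlo sketch for a non-uniform case: assuming standard exponential samples, where $f_X \left({x}\right) = e^{-x}$, the order statistic at $i = N / 2$ lies near the median $\ln 2$, where $f_X \approx \dfrac 1 2$, so the scaled spacing should have mean about $2$:

```python
import random
from math import log

def middle_spacing_mean(n: int, trials: int) -> float:
    """Average of N*(X_(i+1) - X_(i)) at i = N/2 for n samples
    from the standard exponential distribution."""
    i = n // 2
    acc = 0.0
    for _ in range(trials):
        # Inverse-transform sampling: -log(1 - U) ~ Exp(1).
        xs = sorted(-log(1.0 - random.random()) for _ in range(n))
        acc += n * (xs[i] - xs[i - 1])
    return acc / trials

random.seed(1)
m = middle_spacing_mean(n=1000, trials=2000)
# Near the median of Exp(1), f_X(x) ~ 1/2, so the mean should be near 2.
print(round(m, 2))
```

The sample sizes are illustrative; for the exponential distribution the spacing $X_{\left({i + 1}\right)} - X_{\left({i}\right)}$ is in fact exactly exponential with rate $N - i$, so the scaled mean at $i = N / 2$ is exactly $2$ for every $N$.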