
Theorem
Let $x \in \Z_{>1}$ be a positive integer greater than one.

Let $m \in \Z_{>1}$ be a positive integer greater than one and let $A \in \Z_{>0}$ be a positive integer coprime to $m$.

Then:
 * $\displaystyle a \equiv A^{x} \pmod m$

is an $n^{th}$ root of unity modulo $m$, where:
 * $\displaystyle n = \frac{\phi(m)}{\gcd(\phi(m), x)}$

Here $\phi(m)$ is Euler's totient function of the modulus and $\gcd(\phi(m), x)$ is the greatest common divisor of the totient and the exponent.
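For instance (values chosen purely for illustration): with $m = 7$, $x = 4$ and $A = 3$, we have $\phi(7) = 6$, $\gcd(6, 4) = 2$ and $n = 3$; indeed $a \equiv 3^4 \equiv 4 \pmod 7$ and $4^3 = 64 \equiv 1 \pmod 7$, so $a$ is a cube root of unity modulo $7$.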

Proof
Let $\alpha = \frac{x}{\gcd(\phi(m), x)} \in \Z_{>0}$, which is a positive integer since $\gcd(\phi(m), x)$ divides $x$. Since $\gcd(A, m) = 1$, Euler's theorem gives $A^{\phi(m)} \equiv 1 \pmod m$, so:
 * $\displaystyle a^n \equiv A^{xn} \equiv A^{\alpha \phi(m)} \equiv \left(A^{\phi(m)}\right)^{\alpha} \equiv 1 \pmod m$
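The theorem is easy to check numerically. Below is a minimal sketch (Python; the values of $m$, $x$ and $A$ are arbitrary examples, and the totient helper is a naive implementation for illustration):

    from math import gcd

    def totient(m):
        # Euler's totient by direct count; fine for small m
        return sum(1 for k in range(1, m + 1) if gcd(k, m) == 1)

    m, x, A = 13, 4, 2          # example values with gcd(A, m) = 1
    phi = totient(m)            # phi(13) = 12
    n = phi // gcd(phi, x)      # n = 12 / gcd(12, 4) = 3
    a = pow(A, x, m)            # a = 2^4 mod 13 = 3
    assert pow(a, n, m) == 1    # a is an n-th root of unity modulo m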

Square Root of Unity Addition Theorem
Let $p$ be a prime number greater than three. Let $A, B, C \in \Z_{>0}$ be positive integers coprime to $p$ and let $x, y, z \in \Z_{>1}$ be positive integers greater than one for which:
 * $2\gcd(p-1, x) = 2\gcd(p-1, y) = 2\gcd(p-1, z) = p - 1$

Then:
 * $A^x + B^y \not\equiv C^z \pmod p$

Proof
Suppose, for contradiction, that $A^x + B^y \equiv C^z \pmod p$. Squaring both sides and expanding:
 * $A^{2x} + 2A^xB^y + B^{2y} \equiv C^{2z} \pmod p$

By the theorem above, the condition on $x$, $y$ and $z$ gives $n = \frac{p-1}{\gcd(p-1, x)} = 2$, and likewise for $y$ and $z$, so $A^x$, $B^y$ and $C^z$ are all square roots of unity modulo $p$:
 * $A^{2x} \equiv B^{2y} \equiv C^{2z} \equiv 1 \pmod p$

The squared congruence therefore reduces to:
 * $2A^xB^y \equiv -1 \pmod p$

Since $A^xB^y \equiv \pm 1 \pmod p$, this forces:
 * $1 \pm 2 \equiv 0 \pmod p$

The case $1 - 2 \equiv 0 \pmod p$ is impossible, as no prime divides $1$. The case $1 + 2 \equiv 0 \pmod p$ requires:
 * $p = 3$

which contradicts $p > 3$.
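A brute-force check is straightforward. The sketch below (Python; the prime, exponents and search bound are arbitrary example choices) confirms the claim for one admissible configuration:

    from math import gcd

    def holds(p, x, y, z, bound=30):
        # verify A^x + B^y is never congruent to C^z mod p
        # for all A, B, C in 1..bound coprime to p
        units = [v for v in range(1, bound + 1) if gcd(v, p) == 1]
        return all((pow(A, x, p) + pow(B, y, p)) % p != pow(C, z, p)
                   for A in units for B in units for C in units)

    p = 13                    # p - 1 = 12
    x = y = z = 6             # 2 * gcd(12, 6) = 12 = p - 1
    print(holds(p, x, y, z))  # expected output: True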

Theorem
Let $X_1, X_2, X_3, \ldots, X_N$ be a sequence of independent, identically distributed random variables drawn from a distribution with probability density function $f_X(x)$.

Let $X_{(1)}, X_{(2)}, X_{(3)}, \ldots, X_{(N)}$ be these variables arranged in increasing order (the order statistics).

Then, for sufficiently large samples, the gap between consecutive order statistics converges in distribution to an exponential distribution:
 * $\displaystyle N\left(X_{(i+1)}-X_{(i)}\right)\xrightarrow{D} \operatorname{Exp}\left(\frac{1}{f_X\left(X_{(i)}\right)}\right)$ as $N \to \infty$

for $i = 1, 2, 3, \ldots, N-1$, where $\operatorname{Exp}(\theta)$ denotes the exponential distribution with mean $\theta$. This result allows the shape of the density function to be identified easily from the random sample.
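The convergence is easy to see in simulation. A quick sketch (Python with NumPy; the standard normal distribution and the index window are arbitrary illustrative choices): near the sample median of a standard normal, $f_X(x) \approx \frac{1}{\sqrt{2\pi}}$, so the mean scaled gap should be close to $\sqrt{2\pi} \approx 2.507$.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000
    sample = np.sort(rng.standard_normal(N))  # order statistics of the sample
    gaps = N * np.diff(sample)                # Y_i = N * (X_(i+1) - X_(i))

    # indices around the median, where X_(i) is near 0
    mid = slice(N // 2 - 500, N // 2 + 500)
    print(gaps[mid].mean(), np.sqrt(2 * np.pi))  # both close to 2.5066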

Proof
Given $i$ and $N$, the order statistic $X_{(i)}$ has the probability density function:
 * $\displaystyle f_{X_{(i)}}(x|i,N)=\frac{N!}{(i-1)!(N-i)!}F_X(x)^{i-1}(1-F_X(x))^{N-i}f_X(x)$

where $F_X(x)$ is the cumulative distribution function of $X$.

Let $Y_i = N\left(X_{(i+1)} - X_{(i)}\right)$ for $i = 1, 2, 3, \ldots, N-1$ be the scaled gap between consecutive order statistics; each $Y_i$ is strictly positive.

The joint density function of $X_{(i)}$ and $Y_i$, obtained from the joint density of the consecutive order statistics $X_{(i)}$ and $X_{(i+1)}$ by the change of variable $X_{(i+1)} = x + \frac y N$ (Jacobian $\frac 1 N$), is then
 * $\displaystyle h_{X_{(i)},Y_i}(x,y|i,N)=\frac{(N-1)!}{(i-1)!(N-i-1)!}F_X(x)^{i-1}\left(1-F_X\left(x+\frac y N\right)\right)^{N-i-1}f_X(x)f_X\left(x+\frac y N\right)$

The conditional density function of $Y_{i}$ given $X_{(i)}$ is $\frac{h_{X_{(i)},Y_i}}{f_{X_{(i)}}}$:
 * $\displaystyle f_{Y_i}(y|x=X_{(i)},i,N)=\frac{N-i}{N}\frac{\left(1-F_X\left(x+\frac y N\right)\right)^{N-i-1}}{\left(1-F_X(x)\right)^{N-i}}f_X\left(x+\frac y N\right)$

The conditional cumulative function of $Y_{i}$ given $X_{(i)}$ is:
 * $\displaystyle F_{Y_i}(y|x=X_{(i)},i,N)=1-\left(\frac{1-F_X\left(x+\frac y N\right)}{1-F_X(x)}\right)^{N-i}$
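This can be verified by integrating the conditional density: substituting $t = F_X\left(x + \frac u N\right)$, so that $\mathrm d t = \frac 1 N f_X\left(x + \frac u N\right) \mathrm d u$:
 * $\displaystyle \int_0^y f_{Y_i}\left(u|x=X_{(i)},i,N\right)\mathrm d u = \frac{N-i}{\left(1-F_X(x)\right)^{N-i}} \int_{F_X(x)}^{F_X\left(x+\frac y N\right)} (1-t)^{N-i-1}\,\mathrm d t = 1-\left(\frac{1-F_X\left(x+\frac y N\right)}{1-F_X(x)}\right)^{N-i}$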

Using the Taylor expansion of $F_X$ about $x$:
 * $\displaystyle F_X\left(x+\frac y N\right)=F_X(x)+f_X(x)\frac y N + O\left(N^{-2}\right)$

produces
 * $\displaystyle F_{Y_i}(y|x=X_{(i)},i,N)=1-\left(1-\frac{f_X(x)y}{N(1-F_X(x))}+ O\left(N^{-2}\right)\right)^{N-i}$

Raising to the power $N - i$ and using $\left(1 + \frac a N + O\left(N^{-2}\right)\right)^{N-i} = e^{a\left(1 - \frac i N\right)} + O\left(N^{-1}\right)$ gives:
 * $\displaystyle F_{Y_i}(y|x=X_{(i)},i,N)=1-e^{-f_X(x)y\frac{1-\frac i N}{1-F_X(x)}}+ O\left(N^{-1}\right)$

Conditional on $X_{(i)} = x$, the empirical distribution function converges to $F_X$, so $\frac i N \to F_X(x)$ as $N \to \infty$ and the factor $\frac{1 - \frac i N}{1 - F_X(x)}$ tends to $1$. The limit of $F_{Y_i}$ is then
 * $\displaystyle \lim_{N\rightarrow\infty} F_{Y_i}(y|x=X_{(i)},i,N)=1-e^{-f_X(x)y}$

which is the cumulative distribution function of the exponential distribution with mean $\frac 1 {f_X(x)}$.