User:Kip/Sandbox

Theorem
Let $x\in\Z_{>1}$ be a positive integer greater than one.

Let $m\in\Z_{>1}$ be an integer greater than one, and let $A\in\Z_{>0}$ be a positive integer coprime to $m$.

Then the residue $a$ defined by:
 * $\displaystyle A^{x}\equiv a \pmod m$

is an $n^{th}$ root of unity modulo $m$, where:
 * $\displaystyle n=\frac{\phi(m)}{\gcd(\phi(m),x)}$

Here $\phi(m)$ is Euler's totient function of the modulus and $\gcd(\phi(m),x)$ is the greatest common divisor of the totient and the exponent.

Proof
Let $\alpha=\dfrac{x}{\gcd(\phi(m),x)}\in\Z_{>0}$, which is a positive integer since $\gcd(\phi(m),x)$ divides $x$.

Then $x n = \alpha \phi(m)$, so by Euler's theorem (using $\gcd(A,m)=1$):
 * $\displaystyle a^n \equiv A^{x n} = A^{\alpha \phi(m)} = \left({A^{\phi(m)}}\right)^\alpha \equiv 1 \pmod m$
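The statement can be checked numerically. The following is a minimal Python sketch; the helper names `totient` and `root_of_unity_order` are illustrative, not from the text, and the totient is computed naively:

```python
from math import gcd

def totient(m):
    """Euler's totient phi(m): count of 1 <= k <= m with gcd(k, m) == 1 (naive)."""
    return sum(1 for k in range(1, m + 1) if gcd(k, m) == 1)

def root_of_unity_order(A, x, m):
    """Return (a, n) with a = A^x mod m and n = phi(m) // gcd(phi(m), x)."""
    phi = totient(m)
    n = phi // gcd(phi, x)
    return pow(A, x, m), n

# Example: m = 35, A = 2 (coprime to 35), x = 9:
# phi(35) = 24, gcd(24, 9) = 3, so n = 8 and a = 2^9 mod 35 = 22.
a, n = root_of_unity_order(2, 9, 35)
assert pow(a, n, 35) == 1  # a is an 8th root of unity modulo 35
```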

Square Root of Unity Addition Theorem
Let $p\in\Z_{>0}$ be a prime number. Let $x,y,z\in\Z_{>1}$ be positive integers greater than one for which:
 * $2\gcd(p-1,x)=2\gcd(p-1,y)=2\gcd(p-1,z)=p-1$

Let $A,B,C\in\Z_{>0}$ be positive integers coprime to $p$.

Then either:
 * $A^x+B^y\not\equiv C^z \pmod p$

or $p=3$.

Proof

Suppose to the contrary that $A^x+B^y\equiv C^z \pmod p$. Squaring both sides and expanding by the binomial theorem:
 * $A^{2x}+2A^xB^y+B^{2y}\equiv C^{2z} \pmod p$

By hypothesis $2\gcd(p-1,x)=p-1$, so by the theorem above $A^x$ is a square root of unity modulo $p$; the same holds for $B^y$ and $C^z$. Hence:
 * $A^{2x}\equiv B^{2y}\equiv C^{2z}\equiv 1 \pmod p$
 * $2A^xB^y\equiv -1 \pmod p$

Since $p$ is prime, the only square roots of unity modulo $p$ are $\pm 1$, so $A^xB^y\equiv\pm 1 \pmod p$ and:
 * $1\pm 2\equiv 0 \pmod p$

The case $1-2\equiv 0 \pmod p$ is impossible, since no prime divides $1$; the case $1+2\equiv 0 \pmod p$ gives:
 * $p=3$
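The claim can be tested by exhaustive search for small primes. A brute-force Python sketch (the function names and the exponent search bound `limit` are my own choices, not from the text):

```python
from math import gcd

def exponents_with_square_root_condition(p, limit=50):
    """Exponents e > 1 with 2*gcd(p - 1, e) == p - 1,
    i.e. A^e is a square root of unity mod p for A coprime to p."""
    return [e for e in range(2, limit) if 2 * gcd(p - 1, e) == p - 1]

def addition_theorem_holds(p):
    """Brute-force check that A^x + B^y is never congruent to C^z mod p
    over all qualifying exponents below the bound and all A, B, C coprime to p."""
    exps = exponents_with_square_root_condition(p)
    units = range(1, p)  # p prime, so all of 1..p-1 are coprime to p
    for x in exps:
        for y in exps:
            for z in exps:
                for A in units:
                    for B in units:
                        for C in units:
                            if (pow(A, x, p) + pow(B, y, p)) % p == pow(C, z, p):
                                return False
    return True
```

For $p=5$ and $p=7$ the search finds no solution, while $p=3$ admits $1+1\equiv 2^3 \pmod 3$, matching the exceptional case in the proof.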

Theorem
Let $X$ be a continuous random variable following the exponential distribution with parameter $\beta$.

Then the expectation of $X$ is given by:
 * $E \left({X}\right) = \beta$
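This can be verified directly from the definition below, assuming the parametrization in which the probability density function is $\dfrac 1 \beta e^{-x/\beta}$ for $x \ge 0$ (the one consistent with $E \left({X}\right) = \beta$); integrating by parts:
 * $\displaystyle E \left({X}\right) = \int_0^\infty x \, \frac 1 \beta e^{-x/\beta} \, \mathrm{d}x = \Big[{-x e^{-x/\beta}}\Big]_0^\infty + \int_0^\infty e^{-x/\beta} \, \mathrm{d}x = 0 + \beta = \beta$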

Definition
Let $X$ be a continuous random variable.

The expectation of $X$ is written $E \left({X}\right)$, and is defined as:
 * $\displaystyle E \left({X}\right) := \int_{-\infty}^{\infty} x\, \mathrm{d}\Pr \left({X < x}\right)$

whenever the integral is absolutely convergent, i.e. when:
 * $\displaystyle \int_{-\infty}^{\infty}\left| x \right|\, \mathrm{d}\Pr \left({X < x}\right) < \infty$

where $\Pr \left({X < x}\right)$ is the cumulative distribution function of $X$.

Also, from the definition of probability density function, we see it can also be written:
 * $\displaystyle E \left({X}\right) := \int_{x \in \Omega_X} x\, f\left(x\right)\mathrm{d}x$
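For the exponential theorem above, the expectation can also be estimated by simulation. A hedged Python sketch (the function name is illustrative; note that `random.expovariate` takes the rate $1/\beta$, not $\beta$ itself):

```python
import random

def mc_expectation_exponential(beta, n=200_000, seed=0):
    """Monte Carlo estimate of E(X) for X exponentially distributed with mean beta.

    random.expovariate expects the rate lambd = 1/beta; the mean of the
    resulting distribution is 1/lambd = beta.
    """
    rng = random.Random(seed)
    return sum(rng.expovariate(1.0 / beta) for _ in range(n)) / n

est = mc_expectation_exponential(2.0)
assert abs(est - 2.0) < 0.05  # sample mean should be close to beta = 2
```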

Also known as
The expectation of $X$ is also called the expected value of $X$ or the mean of $X$, and (for a given continuous random variable) is often denoted $\mu$.

The terminology is appropriate, as it can be seen that an expectation is an example of a normalized weighted mean. This follows from the fact that a probability density function is a normalized weight function.

Also see
It can also be seen that the expectation of a continuous random variable is its first moment.