Markov's Inequality

Theorem
Given a measure space $\displaystyle(X, \Sigma, \mu)$, a set $\displaystyle A \in \Sigma$, and a $\displaystyle\Sigma$-measurable function $\displaystyle f$, then:


 * $\displaystyle \mu(\{x\in A : |f(x)| \geq t\}) \leq \frac{1}{t}\int_{A} |f| \mathrm d\mu$ for any positive $t \in \mathbb{R}$.

Proof
Pick any $t > 0$ and define $B = \{x\in A : |f(x)| \geq t\}$.

Let $\displaystyle 1_{B}$ denote the indicator function of $B$ on $A$.

For any $x\in A$, either $x\in B$ or $x\notin B$.

In the first case, $t 1_{B}(x) = t\cdot 1 = t \leq |f(x)|$, since $|f(x)| \geq t$ by the definition of $B$.

In the second case, $t 1_{B}(x) = t\cdot 0 = 0 \leq |f(x)|$.

Hence $\forall x \in A$, $t1_{B}(x) \leq |f(x)|$.

By the monotonicity of the Lebesgue integral, $\displaystyle \int_{A} t1_{B} \mathrm d\mu \leq \int_{A} |f|\mathrm d\mu$.

But by the linearity of the Lebesgue integral, $\displaystyle\int_{A} t1_{B}\mathrm d\mu = t\int_{A} 1_{B}\mathrm d\mu = t\mu (B)$.

Hence, dividing through by $t$ (which is positive), we get $\displaystyle \mu(B)\leq \frac{1}{t}\int_{A} |f| \mathrm d\mu$.
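As a quick sanity check (a minimal sketch, not part of the proof), the inequality can be verified on a finite set with the counting measure, where the integral of $|f|$ reduces to a sum of the $|f(x)|$:

```python
def markov_check(values, t):
    """For f given by `values` on a finite set with counting measure,
    return (mu(B), bound) where B = {x : |f(x)| >= t} and
    bound = (1/t) * integral of |f|."""
    assert t > 0
    mu_B = sum(1 for v in values if abs(v) >= t)   # mu({x : |f(x)| >= t})
    bound = sum(abs(v) for v in values) / t        # (1/t) * sum of |f(x)|
    return mu_B, bound

values = [0.5, -2.0, 3.0, 1.5, -0.25]
for t in (0.5, 1.0, 2.0, 3.0):
    mu_B, bound = markov_check(values, t)
    assert mu_B <= bound   # Markov's inequality holds
```

The helper name `markov_check` and the sample values are illustrative choices, not anything fixed by the theorem.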

Markov's Inequality in Probability
Given a probability space $(\Omega, \Sigma, \Pr)$, Markov's inequality asserts that for a random variable $X$, $\Pr(|X| \geq t) \leq \dfrac{\mathrm E(|X|)}{t}$ for any $t > 0$. This is the theorem above with $A = \Omega$, $\mu = \Pr$, and $\displaystyle\int_{\Omega} |X| \mathrm d\Pr = \mathrm E(|X|)$.
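The probabilistic form can be illustrated empirically; the sketch below draws samples from an Exponential(1) distribution (an arbitrary choice for illustration) and checks the bound against the empirical distribution, for which Markov's inequality holds exactly:

```python
import random

# Empirical illustration of Markov's inequality.
# X ~ Exponential(1), so E(|X|) = 1.
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]
mean_abs = sum(abs(x) for x in samples) / n   # empirical E(|X|)

for t in (0.5, 1.0, 2.0):
    pr = sum(1 for x in samples if abs(x) >= t) / n   # empirical Pr(|X| >= t)
    assert pr <= mean_abs / t   # Markov bound holds
```

Note that the bound can be quite loose: for $t = 2$, the true probability is $e^{-2} \approx 0.135$ while the bound is $0.5$.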

It can then be used to derive the probabilistic form of Chebyshev's inequality.
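As a sketch of that derivation: applying Markov's inequality to the non-negative random variable $(X - \mathrm E(X))^2$ with threshold $t^2$ gives

```latex
% Chebyshev's inequality from Markov's inequality:
\Pr\bigl(|X - \mathrm{E}(X)| \geq t\bigr)
  = \Pr\bigl((X - \mathrm{E}(X))^2 \geq t^2\bigr)
  \leq \frac{\mathrm{E}\bigl((X - \mathrm{E}(X))^2\bigr)}{t^2}
  = \frac{\mathrm{Var}(X)}{t^2}.
```

where the first equality holds because squaring is monotone on non-negative reals.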