Jensen's Inequality (Measure Theory)

Theorem
Let $\left({X, \Sigma, \mu}\right)$ be a measure space.

Let $f: X \to \R$ be a $\mu$-integrable function such that $f \ge 0$ pointwise and $\int_X f \d\mu > 0$.

Let $g: X \to \R$ be measurable, and let $V: \R \to \R$ be convex, such that $g$ and $V \circ g$ are integrable with respect to the measure $\d\nu := \dfrac{f \d\mu}{\int_X f \d\mu}$.

Then:


 * $\displaystyle V \left({\int_{X} g \d\nu}\right) \leq \int_{X} V(g) \d\nu.$

Proof for convex case
First, define the measure of unit total mass $\d\nu(x):=\dfrac{f(x)}{\int_{X}f(s)\d\mu(s) }\d\mu(x)$, which is well defined because $\int_X f \d\mu > 0$.
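As a sanity check (an added verification, not part of the original argument), $\nu$ indeed has unit total mass:

```latex
\nu(X)
= \int_X \frac{f(x)}{\int_X f(s) \d\mu(s)} \d\mu(x)
= \frac{\int_X f(x) \d\mu(x)}{\int_X f(s) \d\mu(s)}
= 1
```

so $\nu$ is a probability measure on $\left({X, \Sigma}\right)$.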

Let $x_{0}:=\int_{X} g(x) \d\nu(x)\in \mathbb{R}$.

By convexity of $V$, there is a line through the point $\left({x_{0}, V(x_{0})}\right)$ that stays below the graph of $V$ (a supporting line).

Concretely, there exist constants $a, b \in \R$ such that


 * $\displaystyle ax_{0}+b=V(x_{0})\text{ and }ax+b\leq V(x),\forall x\in \mathbb{R}$.
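For a concrete illustration (an added example, not part of the original proof), take $V(x) = x^2$; then $a = 2x_0$ and $b = -x_0^2$ give such a supporting line:

```latex
a x_0 + b = 2 x_0^2 - x_0^2 = x_0^2 = V(x_0),
\qquad
a x + b = 2 x_0 x - x_0^2 = x^2 - (x - x_0)^2 \leq x^2 = V(x)
\quad \forall x \in \mathbb{R}.
```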

So in particular, taking $x:=g(s)$ for any $s\in X$, we obtain


 * $\displaystyle V(g(s))\geq a g(s)+b.$

Using integral monotonicity we find


 * $\displaystyle \int_{X} V(g(s))\d\nu(s)\geq a\int_{X} g(s)\d\nu(s)+b=ax_{0}+b=V(x_{0}).$

Therefore we have shown that


 * $\displaystyle \int_{X} V(g(s))\d\nu(s)\geq V \left({\int_{X} g(s)\d\nu(s)}\right).$
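The inequality can also be checked numerically. The following is a sketch in a hypothetical discrete setting ($\mu$ the counting measure on four points, with illustrative choices of $f$, $g$, and convex $V$), not part of the proof:

```python
import math

# Hypothetical discrete example: X = {0, 1, 2, 3}, mu = counting measure.
f = [1.0, 2.0, 3.0, 4.0]    # nonnegative density f, total mass > 0
g = [0.5, -1.0, 2.0, 0.0]   # arbitrary real values of g
V = math.exp                # a convex function V

# Normalised measure nu: weights f / integral of f, so sum(nu) == 1.
total = sum(f)
nu = [w / total for w in f]

lhs = sum(p * V(gx) for p, gx in zip(nu, g))   # integral of V(g) d nu
rhs = V(sum(p * gx for p, gx in zip(nu, g)))   # V(integral of g d nu)
assert lhs >= rhs  # Jensen's inequality for convex V
```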

Proof for concave case
Let $\Lambda: \R \to \R$ be concave. The proof is similar, with the inequality reversed at the first step (the supporting line now lies above the graph of $\Lambda$), and hence throughout:


 * $\displaystyle ax_{0}+b=\Lambda(x_{0})\text{ and }ax+b\geq \Lambda(x),\forall x\in \mathbb{R}$.

This yields:


 * $\displaystyle \int_{X} \Lambda(g(s))\d\nu(s)\leq \Lambda \left({\int_{X} g(s)\d\nu(s)}\right).$

Also see

 * Jensen's Inequality (Real Analysis), a similar result in the context of real analysis.