# Jensen's Inequality (Real Analysis)

This proof concerns Jensen's Inequality in the context of real analysis.

## Theorem

Let $I$ be a real interval.

Let $\phi: I \to \R$ be a convex function.

Let $x_1, x_2, \ldots, x_n \in I$.

Let $\lambda_1, \lambda_2, \ldots, \lambda_n \ge 0$ be real numbers, at least one of which is non-zero.

Then:

$\displaystyle \phi \left({\frac {\sum_{k \mathop = 1}^n \lambda_k x_k} {\sum_{k \mathop = 1}^n \lambda_k} }\right) \le \frac {\sum_{k \mathop = 1}^n \lambda_k \phi \left({x_k}\right) } {\sum_{k \mathop = 1}^n \lambda_k}$

For $\phi$ strictly convex, equality holds if and only if $x_1 = x_2 = \cdots = x_n$.
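As a quick numerical sanity check (a sketch, not part of the proof), the inequality can be verified for the convex function $\phi \left({x}\right) = x^2$ with randomly chosen points and non-negative weights:

```python
import random

def phi(x):
    # A sample convex function on the reals.
    return x * x

def jensen_sides(xs, lams):
    """Return (LHS, RHS) of Jensen's inequality for points xs and weights lams."""
    s = sum(lams)  # normalizing factor: sum of the weights
    mean = sum(l * x for l, x in zip(lams, xs)) / s  # weighted mean of the points
    lhs = phi(mean)                                  # phi of the weighted mean
    rhs = sum(l * phi(x) for l, x in zip(lams, xs)) / s  # weighted mean of phi values
    return lhs, rhs

random.seed(0)
xs = [random.uniform(-5, 5) for _ in range(10)]
lams = [random.uniform(0, 1) for _ in range(10)]
lhs, rhs = jensen_sides(xs, lams)
assert lhs <= rhs  # Jensen's inequality for the convex function x^2
```

Since $x \mapsto x^2$ is strictly convex, the two sides agree only when all the $x_k$ coincide.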

### Corollary

Let $I$ be a real interval.

Let $\phi: I \to \R$ be a concave function.

Let $x_1, x_2, \ldots, x_n \in I$.

Let $\lambda_1, \lambda_2, \ldots, \lambda_n \ge 0$ be real numbers, at least one of which is non-zero.

Then:

$\displaystyle \phi \left({\frac {\sum_{k \mathop = 1}^n \lambda_k x_k} {\sum_{k \mathop = 1}^n \lambda_k} }\right) \ge \frac {\sum_{k \mathop = 1}^n \lambda_k \phi \left({x_k}\right) } {\sum_{k \mathop = 1}^n \lambda_k}$

For $\phi$ strictly concave, equality holds if and only if $x_1 = x_2 = \cdots = x_n$.
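This corollary follows directly from the main theorem: if $\phi$ is concave, then $-\phi$ is convex, so the theorem yields:

$\displaystyle -\phi \left({\frac {\sum_{k \mathop = 1}^n \lambda_k x_k} {\sum_{k \mathop = 1}^n \lambda_k} }\right) \le \frac {\sum_{k \mathop = 1}^n \lambda_k \left({-\phi \left({x_k}\right)}\right) } {\sum_{k \mathop = 1}^n \lambda_k}$

Multiplying both sides by $-1$ reverses the inequality, giving the claim; the equality condition for strictly concave $\phi$ carries over in the same way.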

## Proof

The proof proceeds by mathematical induction on $n$.

For all $n \in \N_{> 0}$, let $P \left({n}\right)$ be the proposition:

$\displaystyle \phi \left({\frac {\sum_{k \mathop = 1}^n \lambda_k x_k} {\sum_{k \mathop = 1}^n \lambda_k} }\right) \le \frac {\sum_{k \mathop = 1}^n \lambda_k \phi \left({x_k}\right) } {\sum_{k \mathop = 1}^n \lambda_k}$

$P \left({1}\right)$ is true, as this just says:

$\displaystyle \phi \left({\frac {\lambda_1 x_1} {\lambda_1} }\right) \le \frac {\lambda_1 \phi \left({x_1}\right) } {\lambda_1}$

which holds with equality, since:

$\displaystyle \phi \left({\frac {\lambda_1 x_1} {\lambda_1} }\right) = \phi \left({x_1}\right) = \frac {\lambda_1 \phi \left({x_1}\right) } {\lambda_1}$

This is our basis for the induction.

### Induction Hypothesis

Now we need to show that, if $P \left({r}\right)$ is true, where $r \ge 1$, then it logically follows that $P \left({r + 1}\right)$ is true.

So this is our induction hypothesis:

$\displaystyle \phi \left({\frac {\sum_{k \mathop = 1}^r \lambda_k x_k} {\sum_{k \mathop = 1}^r \lambda_k} }\right) \le \frac {\sum_{k \mathop = 1}^r \lambda_k \phi \left({x_k}\right) } {\sum_{k \mathop = 1}^r \lambda_k}$

where $\lambda_1, \lambda_2, \ldots, \lambda_r \ge 0$ are real numbers, at least one of which is non-zero.

Then we need to show:

$\displaystyle \phi \left({\frac {\sum_{k \mathop = 1}^{r + 1} \lambda_k x_k} {\sum_{k \mathop = 1}^{r + 1} \lambda_k} }\right) \le \frac {\sum_{k \mathop = 1}^{r + 1} \lambda_k \phi \left({x_k}\right) } {\sum_{k \mathop = 1}^{r + 1} \lambda_k}$

where $\lambda_1, \lambda_2, \ldots, \lambda_{r + 1} \ge 0$ are real numbers, at least one of which is non-zero.

### Induction Step

This is our induction step:

If either $\lambda_r = 0$ or $\lambda_{r + 1} = 0$, the statement reduces to the induction hypothesis (after relabeling the weights if necessary).

So, without loss of generality, assume that $\lambda_r$ and $\lambda_{r + 1}$ are both non-zero.

Define:

$y := \dfrac {\lambda_r x_r + \lambda_{r + 1} x_{r + 1} } {\lambda_r + \lambda_{r + 1}}$

Then:

$\displaystyle \begin{aligned} \phi \left({\frac {\sum_{k \mathop = 1}^{r + 1} \lambda_k x_k} {\sum_{k \mathop = 1}^{r + 1} \lambda_k} }\right) &= \phi \left({\frac {\sum_{k \mathop = 1}^{r - 1} \lambda_k x_k + \left({\lambda_r + \lambda_{r + 1} }\right) y} {\sum_{k \mathop = 1}^{r - 1} \lambda_k + \left({\lambda_r + \lambda_{r + 1} }\right)} }\right) && \text{by definition of } y \\ &\le \frac {\sum_{k \mathop = 1}^{r - 1} \lambda_k \phi \left({x_k}\right) + \left({\lambda_r + \lambda_{r + 1} }\right) \phi \left({y}\right)} {\sum_{k \mathop = 1}^{r - 1} \lambda_k + \left({\lambda_r + \lambda_{r + 1} }\right)} && \text{by the induction hypothesis} \\ &\le \frac {\sum_{k \mathop = 1}^{r + 1} \lambda_k \phi \left({x_k}\right)} {\sum_{k \mathop = 1}^{r + 1} \lambda_k} && \text{Definition of Convex Real Function} \end{aligned}$
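The final inequality in the chain above spells out as follows: applying the definition of a convex real function to the two points $x_r$ and $x_{r + 1}$ with weights $\dfrac {\lambda_r} {\lambda_r + \lambda_{r + 1} }$ and $\dfrac {\lambda_{r + 1} } {\lambda_r + \lambda_{r + 1} }$ gives:

$\displaystyle \phi \left({y}\right) = \phi \left({\frac {\lambda_r} {\lambda_r + \lambda_{r + 1} } x_r + \frac {\lambda_{r + 1} } {\lambda_r + \lambda_{r + 1} } x_{r + 1} }\right) \le \frac {\lambda_r \phi \left({x_r}\right) + \lambda_{r + 1} \phi \left({x_{r + 1} }\right)} {\lambda_r + \lambda_{r + 1} }$

so that $\left({\lambda_r + \lambda_{r + 1} }\right) \phi \left({y}\right) \le \lambda_r \phi \left({x_r}\right) + \lambda_{r + 1} \phi \left({x_{r + 1} }\right)$.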

Note that in the case that $\phi$ is strictly convex, equality holds if and only if $x_1 = x_2 = \cdots = x_r = x_{r + 1}$.

So $P \left({r}\right) \implies P \left({r + 1}\right)$ and the result follows by the Principle of Mathematical Induction.

Therefore:

$\displaystyle \forall n \in \N_{>0}: \phi \left({\frac {\sum_{k \mathop = 1}^n \lambda_k x_k} {\sum_{k \mathop = 1}^n \lambda_k} }\right) \le \frac {\sum_{k \mathop = 1}^n \lambda_k \phi \left({x_k}\right) } {\sum_{k \mathop = 1}^n \lambda_k}$

$\blacksquare$

## Source of Name

This entry was named for Johan Ludwig William Valdemar Jensen.