Cauchy-Bunyakovsky-Schwarz Inequality

= Inner Product Spaces =

(AKA: Schwarz inequality or Cauchy–Schwarz inequality)

Theorem
Let $$V$$ be an inner-product space over $$\mathbb{K}$$ where $$\mathbb{K} = \R$$ or $$\C$$.

Let $$x, y$$ be vectors in $$V$$.

Then $$\left|{\left \langle {x, y} \right \rangle}\right|^2 \leq \left\|{x}\right\|^2 \times \left\|{y}\right\|^2$$.

Proof
Let $$\lambda \in \mathbb{K}$$. Since the norm on an inner-product space is generated by the inner product, we may expand $$\left\|{x - \lambda y}\right\|^2$$ as follows:

$$0 \le \left\|{x - \lambda y}\right\|^2 = \left \langle {x - \lambda y, x - \lambda y} \right \rangle = \left \langle {x, x} \right \rangle - \lambda^* \left \langle {x, y} \right \rangle - \lambda \left \langle {y, x} \right \rangle + \lambda \lambda^* \left \langle {y, y} \right \rangle$$

where $$\lambda^*$$ is the complex conjugate of $$\lambda$$.

(If $$\mathbb{K} = \R$$, then $$\lambda^* = \lambda$$.)

If $$y = 0$$ the inequality holds trivially, so suppose $$y \ne 0$$. If we let $$\lambda = \left \langle {x, y} \right \rangle \times \left \langle {y, y} \right \rangle^{-1}$$ then we obtain:

$$0 \le \left \langle {x, x} \right \rangle - \left|{\left \langle {x, y} \right \rangle}\right|^2 \times \left \langle {y, y} \right \rangle^{-1} $$

Multiplying through by $$\left \langle {y, y} \right \rangle > 0$$ and rearranging, we see that

$$\left|{\left \langle {x, y} \right \rangle}\right|^2 \le \left \langle {x, x} \right \rangle \times \left \langle {y, y} \right \rangle = \left\|{x}\right\|^2 \times \left\|{y}\right\|^2$$

as desired.
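As a sanity check (separate from the proof itself), the theorem can be verified numerically on random complex vectors. The Python sketch below uses NumPy and assumes the convention $$\left \langle {x, y} \right \rangle = \sum_i x_i \overline{y_i}$$; the small tolerance guarding against floating-point rounding is our addition:

```python
import numpy as np

# Illustrative spot-check of |<x, y>|^2 <= ||x||^2 ||y||^2
# for random complex vectors; not part of the proof.
rng = np.random.default_rng(0)

for _ in range(1000):
    n = int(rng.integers(1, 10))
    x = rng.normal(size=n) + 1j * rng.normal(size=n)
    y = rng.normal(size=n) + 1j * rng.normal(size=n)
    inner = np.vdot(y, x)                 # <x, y> = sum_i x_i * conj(y_i)
    lhs = abs(inner) ** 2                 # |<x, y>|^2
    rhs = np.vdot(x, x).real * np.vdot(y, y).real   # ||x||^2 ||y||^2
    assert lhs <= rhs + 1e-9, (lhs, rhs)
```

Equality holds exactly when $$x$$ and $$y$$ are linearly dependent, which random vectors avoid almost surely.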

= Cauchy's Inequality =

The special case of the Cauchy-Schwarz Inequality in a Euclidean space is called Cauchy's Inequality. It was Cauchy who first published this result in 1821.

It is usually stated and proved as follows:

Theorem
$$\sum {r_i^2} \sum {s_i^2} \ge \left({\sum {r_i s_i}}\right)^2$$ where all of $$r_i, s_i \in \R$$.

Proof 1
Define the function $$f: \R \to \R$$ by:

$$f \left({\lambda}\right) = \sum {\left({r_i + \lambda s_i}\right)^2}$$.

Now $$f \left({\lambda}\right) \ge 0$$ because it is the sum of squares of real numbers.

Hence $$\forall \lambda \in \R: f \left(\lambda\right) \equiv \sum {r_i^2} + 2 \lambda \sum {r_i s_i} + \lambda^2 \sum {s_i^2} \ge 0$$.

This is a simple quadratic in $$\lambda$$, which we can analyse as a Quadratic Equation, where:

$$a \lambda^2 + b \lambda + c = 0: a = \sum {s_i^2}, b = 2 \sum {r_i s_i}, c = \sum {r_i^2}$$

The discriminant of this equation (i.e. $$b^2 - 4 a c$$) is:

$$4 \left({\sum {r_i s_i}}\right)^2 - 4 \sum {r_i^2} \sum {s_i^2}$$

If this were positive, then (since it forces $$\sum {s_i^2} > 0$$, so that $$f$$ is a genuine quadratic) $$f \left({\lambda}\right) = 0$$ would have two distinct real roots, $$\lambda_1$$ and $$\lambda_2$$, say.

If this were the case, then $$\exists \lambda_n: \lambda_1 < \lambda_n < \lambda_2: f \left({\lambda_n}\right) < 0$$.

But we have $$\forall \lambda \in \mathbb{R}: f \left({\lambda}\right) \ge 0$$.

Thus $$4 \left({\sum {r_i s_i}}\right)^2 - 4 \sum {r_i^2} \sum {s_i^2} \le 0$$,

which is the same thing as saying $$\sum {r_i^2} \sum {s_i^2} \ge \left({\sum {r_i s_i}}\right)^2$$.
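The discriminant argument above lends itself to a numerical spot-check. The Python sketch below (the helper name `cauchy_check` is ours, purely for illustration) computes $$a$$, $$b$$, $$c$$ as defined in the proof and confirms that the discriminant is never positive for random real sequences:

```python
import random

# Illustrative check of the discriminant bound b^2 - 4ac <= 0,
# which is equivalent to Cauchy's Inequality.
random.seed(1)

def cauchy_check(r, s):
    a = sum(x * x for x in s)                     # coefficient of lambda^2
    b = 2 * sum(x * y for x, y in zip(r, s))      # coefficient of lambda
    c = sum(x * x for x in r)                     # constant term
    disc = b * b - 4 * a * c                      # discriminant of f(lambda)
    return disc <= 1e-9                           # tolerance for rounding

for _ in range(1000):
    n = random.randint(1, 8)
    r = [random.uniform(-5, 5) for _ in range(n)]
    s = [random.uniform(-5, 5) for _ in range(n)]
    assert cauchy_check(r, s)
```

The discriminant vanishes exactly when the sequences are proportional, e.g. $$r = \left({1, 2, 3}\right)$$, $$s = \left({2, 4, 6}\right)$$.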

Proof 2
Let $$w_1, w_2, \ldots, w_n$$ and $$z_1, z_2, \ldots, z_n$$ be arbitrary complex numbers.

Take the Binet-Cauchy Identity:
 * $$\left({\sum_{i=1}^n a_i c_i}\right) \left({\sum_{j=1}^n b_j d_j}\right) = \left({\sum_{i=1}^n a_i d_i}\right) \left({\sum_{j=1}^n b_j c_j}\right) + \sum_{1 \le i < j \le n} \left({a_i b_j - a_j b_i}\right) \left({c_i d_j - c_j d_i}\right)$$

and set $$a_i = w_i, b_j = \overline {z_j}, c_i = \overline {w_i}, d_j = z_j $$.

This gives us:

$$\sum_{i=1}^n \left|{w_i}\right|^2 \sum_{j=1}^n \left|{z_j}\right|^2 = \left|{\sum_{i=1}^n w_i z_i}\right|^2 + \sum_{1 \le i < j \le n} \left|{w_i \overline{z_j} - w_j \overline{z_i}}\right|^2 \ge \left|{\sum_{i=1}^n w_i z_i}\right|^2$$

since the final summation is a sum of squared moduli and hence non-negative. Taking $$w_i = r_i$$ and $$z_i = s_i$$ real recovers Cauchy's Inequality.

Hence the result.
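Proof 2 rests entirely on the Binet-Cauchy Identity, which is easy to confirm numerically. The Python sketch below (the helper name `binet_cauchy_residual` is ours) checks that both sides of the identity agree for random real sequences:

```python
import random

# Illustrative check of the Binet-Cauchy Identity:
# (sum a_i c_i)(sum b_j d_j) = (sum a_i d_i)(sum b_j c_j)
#     + sum_{i<j} (a_i b_j - a_j b_i)(c_i d_j - c_j d_i)
random.seed(2)

def binet_cauchy_residual(a, b, c, d):
    n = len(a)
    lhs = sum(a[i] * c[i] for i in range(n)) * sum(b[j] * d[j] for j in range(n))
    rhs = sum(a[i] * d[i] for i in range(n)) * sum(b[j] * c[j] for j in range(n))
    rhs += sum((a[i] * b[j] - a[j] * b[i]) * (c[i] * d[j] - c[j] * d[i])
               for i in range(n) for j in range(i + 1, n))
    return lhs - rhs        # zero (up to rounding) if the identity holds

for _ in range(200):
    n = random.randint(1, 6)
    seqs = [[random.uniform(-3, 3) for _ in range(n)] for _ in range(4)]
    assert abs(binet_cauchy_residual(*seqs)) < 1e-9
```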

= Definite Integrals =

The version for integrals was first stated by Bunyakovsky in 1859, and later rediscovered by Schwarz in 1888.

Theorem
Let $$f$$ and $$g$$ be real functions which are continuous on the closed interval $$\left[{a \,. \, . \, b}\right]$$.

Then $$\left({\int_a^b f \left({t}\right) g \left({t}\right) dt}\right)^2 \le \int_a^b \left({f \left({t}\right)}\right)^2 dt \int_a^b \left({g \left({t}\right)}\right)^2 dt$$.

Proof
$$\forall x \in \R: 0 \le \left({x f \left({t}\right) + g \left({t}\right)}\right)^2$$, so, integrating over $$\left[{a \,. \, . \, b}\right]$$:

$$0 \le \int_a^b \left({x f \left({t}\right) + g \left({t}\right)}\right)^2 dt = A x^2 + 2 B x + C$$

where:
 * $$A = \int_a^b \left({f \left({t}\right)}\right)^2 dt$$;
 * $$B = \int_a^b f \left({t}\right) g \left({t}\right) dt$$;
 * $$C = \int_a^b \left({g \left({t}\right)}\right)^2 dt$$.

As the quadratic $$A x^2 + 2 B x + C$$ is non-negative for all $$x$$, it follows (using the same reasoning as in Cauchy's Inequality) that $$B^2 \le A C$$.

Hence the result.
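As an illustration of the integral form, the Python sketch below checks the inequality for one concrete choice $$f \left({t}\right) = \sin t$$, $$g \left({t}\right) = t$$ on $$\left[{0 \,. \, . \, \pi}\right]$$, using a simple midpoint-rule quadrature (our choice, purely for demonstration):

```python
import math

# Illustrative check of (int f g)^2 <= (int f^2)(int g^2)
# with f(t) = sin t, g(t) = t on [0 .. pi].
def midpoint_integral(h, a, b, n=10000):
    # Midpoint-rule approximation of the integral of h over [a .. b].
    dt = (b - a) / n
    return sum(h(a + (k + 0.5) * dt) for k in range(n)) * dt

f = math.sin
g = lambda t: t
a, b = 0.0, math.pi

A = midpoint_integral(lambda t: f(t) ** 2, a, b)        # int f^2
B = midpoint_integral(lambda t: f(t) * g(t), a, b)      # int f g
C = midpoint_integral(lambda t: g(t) ** 2, a, b)        # int g^2

assert B * B <= A * C
```

For this choice $$\int_0^\pi t \sin t \, dt = \pi$$ exactly, so the quadrature result for $$B$$ can itself be checked against $$\pi$$.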