Gram-Schmidt Orthogonalization

Theorem
Let $\struct {V, \innerprod \cdot \cdot}$ be an inner product space.

Let $S = \set {v_n: n \in \N_{>0} }$ be a linearly independent subset of $V$.

Then there exists an orthonormal subset $E = \set {e_n: n \in \N_{>0} }$ of $V$ such that:


 * $\forall k \in \N_{>0}: \span \set {v_1, \ldots, v_k} = \span \set {e_1, \ldots, e_k}$

where $\span$ denotes linear span.

Corollary
The theorem also holds for finite linearly independent sets $S$.

Proof
For all $k \in \N_{>0}$, define $u_k, e_k \in V$ inductively as:


 * $u_k = v_k - \ds \sum_{i \mathop= 1}^{k-1} \innerprod {v_k}{e_i} e_i$
 * $e_k = \dfrac {1}{\norm {u_k} } u_k$

where $\norm \cdot$ denotes the inner product norm on $V$.
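The construction above translates directly into a procedure: subtract from each $v_k$ its components along the previously built $e_1, \ldots, e_{k-1}$, then normalize. A minimal sketch, assuming vectors in $\R^n$ with the standard dot product (the function name `gram_schmidt` is for illustration only):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors.

    Mirrors the inductive definition in the proof:
    u_k = v_k - sum_{i < k} <v_k, e_i> e_i, and e_k = u_k / ||u_k||.
    """
    basis = []
    for v in vectors:
        # Subtract the projections onto the already-built orthonormal vectors.
        u = v - sum(np.dot(v, e) * e for e in basis)
        # Linear independence of the input guarantees u != 0, so the
        # normalization is well-defined.
        basis.append(u / np.linalg.norm(u))
    return basis
```

Note that the loop never revisits earlier vectors: each $e_k$ depends only on $v_k$ and the previously computed $e_1, \ldots, e_{k-1}$, exactly as in the inductive definition.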

We prove the theorem by induction for $k \in \N_{>0}$.

Basis for the induction
For $k=1$, the sum in the definition of $u_1$ is empty, so $u_1 = v_1$.

Let $\bszero$ denote the zero vector of $V$, that is, the identity of $V$ under vector addition.

From Subset of Module Containing Identity is Linearly Dependent, it follows that $\bszero \notin S$, so $v_1 \ne \bszero$.

By positive definiteness of the inner product norm, it follows that $\norm {u_1} = \norm {v_1} \ne 0$.

It follows that $e_1 = \dfrac {1}{\norm {u_1} } u_1$ is well-defined.

By positive homogeneity of the inner product norm, it follows that:
 * $\norm {e_1} = \dfrac {1}{\norm {u_1} } \norm {u_1} = 1$

Hence, $\set {e_1}$ is an orthonormal subset of $V$.

From Singleton is Linearly Independent, it follows that $\set {e_1}$ is a linearly independent set.

From Sufficient Conditions for Basis of Finite Dimensional Vector Space, it follows that $\set {e_1}$ is a basis for $\span \set {e_1}$.

By definition of dimension of vector space, it follows that:


 * $\dim \span \set {e_1} = \dim \span \set {v_1} = 1$

As $e_1 \in \span \set {v_1}$, it follows that $\span \set {e_1} \subseteq \span \set {v_1}$.

From Sufficient Conditions for Basis of Finite Dimensional Vector Space, it follows that $\set {e_1}$ is also a basis for $\span \set {v_1}$.

By definition of basis, it follows that:


 * $\span \set {e_1} = \span \set {v_1}$

Induction hypothesis
Suppose for $k \in \N_{>1}$ that the set $\set {e_1, \ldots ,e_{k-1} }$ is an orthonormal subset of $V$ such that:


 * $\span \set {v_1, \ldots , v_{k-1} } = \span \set {e_1 , \ldots ,e_{k-1} }$

This is our induction hypothesis.

Assuming the hypothesis, we need to show that $\set {e_1, \ldots ,e_{k} }$ is an orthonormal subset of $V$ such that:


 * $\span \set {v_1, \ldots , v_{k} } = \span \set {e_1 , \ldots ,e_{k} }$

Induction step
By the induction hypothesis, it follows that $\norm {e_i} = 1$ for all $i < k$.

By the same hypothesis, $e_i \in \span \set {v_1, \ldots, v_{k-1} }$.

By definition of linear span, it follows that $\ds \sum_{i \mathop= 1}^{k-1} \innerprod {v_k}{e_i} e_i$ lies in $\span \set {v_1, \ldots, v_{k-1} }$, and so can be written as a linear combination of $v_1, \ldots, v_{k-1}$.

As $\set {v_1, \ldots, v_k}$ is linearly independent, it follows that $v_k \ne \ds \sum_{i \mathop= 1}^{k-1} \innerprod {v_k}{e_i} e_i$.

By definition of $u_k$, it follows that $u_k \ne \bszero$.

It follows that $e_k = \dfrac {1}{\norm {u_k} } u_k$ is well-defined.

By positive homogeneity of the inner product norm, it follows that:
 * $\norm {e_k} = \dfrac {1}{\norm {u_k} } \norm {u_k} = 1$

By the induction hypothesis, it follows that $\innerprod {e_{n_1} }{e_{n_2} } = 0$ for all $n_1, n_2 < k$ with $n_1 \ne n_2$.

For $n < k$, it follows by linearity of the inner product in the first argument that:


 * $\innerprod {e_k}{e_n} = \dfrac {1}{\norm {u_k} } \paren {\innerprod {v_k}{e_n} - \ds \sum_{i \mathop= 1}^{k-1} \innerprod {v_k}{e_i} \innerprod {e_i}{e_n} } = \dfrac {1}{\norm {u_k} } \paren {\innerprod {v_k}{e_n} - \innerprod {v_k}{e_n} } = 0$

where the middle equality holds because $\innerprod {e_i}{e_n} = 0$ for $i \ne n$ and $\innerprod {e_n}{e_n} = 1$.

It follows that $\set {e_1, \ldots, e_k}$ is an orthonormal subset.

From Orthogonal Set is Linearly Independent Set, it follows that $\set {e_1, \ldots, e_k}$ is a linearly independent set.

From Sufficient Conditions for Basis of Finite Dimensional Vector Space, it follows that $\set {e_1, \ldots, e_k}$ is a basis for $\span \set {e_1, \ldots, e_k}$.

By definition of dimension of vector space, it follows that:


 * $\dim \span \set {e_1, \ldots, e_k} = \dim \span \set {v_1, \ldots, v_k} = k$

By the induction hypothesis, it follows that $\set {e_1, \ldots, e_{k-1} } \subseteq \span \set {v_1, \ldots, v_{k-1} } \subseteq \span \set {v_1, \ldots, v_k}$.

As $e_k \in \span \set {v_1, \ldots, v_k}$, it follows that $\span \set {e_1, \ldots, e_k} \subseteq \span \set {v_1, \ldots, v_k}$.

From Sufficient Conditions for Basis of Finite Dimensional Vector Space, it follows that $\set {e_1, \ldots, e_k}$ is also a basis for $\span \set {v_1, \ldots, v_k}$.

By definition of basis, it follows that:


 * $\span \set {e_1, \ldots, e_k} = \span \set {v_1, \ldots, v_k}$

It follows by induction that for all $k \in \N_{>0}$, the set $\set {e_1, \ldots ,e_k}$ is an orthonormal subset of $V$ such that:


 * $\span \set {e_1, \ldots, e_k} = \span \set {v_1, \ldots, v_k}$

For all $k_1, k_2 \in \N_{>0}$, we have shown that $\set {e_1 , \ldots ,e_{\max \set{k_1, k_2} } }$ is an orthonormal subset of $V$.

It follows that $\innerprod {e_{k_1} }{e_{k_2} } = 1$ if $k_1 = k_2$, or $\innerprod {e_{k_1} }{e_{k_2} } = 0$ if $k_1 \ne k_2$.

It follows that $E$ itself is an orthonormal subset of $V$.
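As a concrete illustration of the construction (not part of the proof), take $v_1 = \paren {1, 1}$ and $v_2 = \paren {0, 1}$ in $\R^2$ with the standard dot product:

```latex
% Worked example: Gram-Schmidt on v_1 = (1,1), v_2 = (0,1) in R^2.
\begin{align*}
  u_1 &= v_1 = (1, 1),
  & e_1 &= \frac{u_1}{\lVert u_1 \rVert} = \tfrac{1}{\sqrt{2}} (1, 1), \\
  u_2 &= v_2 - \langle v_2, e_1 \rangle \, e_1
       = (0, 1) - \tfrac{1}{\sqrt{2}} \cdot \tfrac{1}{\sqrt{2}} (1, 1)
       = \bigl( -\tfrac{1}{2}, \tfrac{1}{2} \bigr),
  & e_2 &= \frac{u_2}{\lVert u_2 \rVert} = \tfrac{1}{\sqrt{2}} (-1, 1).
\end{align*}
% Check: <e_1, e_2> = (1/2)(-1) + (1/2)(1) = 0, and ||e_1|| = ||e_2|| = 1.
```

One can verify directly that $\innerprod {e_1}{e_2} = 0$, $\norm {e_1} = \norm {e_2} = 1$, and $\span \set {e_1, e_2} = \span \set {v_1, v_2} = \R^2$, as the theorem asserts.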

Also known as
Some texts call this theorem Gram-Schmidt Orthonormalization.

Some texts refer to this theorem as the Gram-Schmidt Orthogonalization Process.