Sample Matrix Independence Test



Theorem

Let $V$ be a vector space of real or complex-valued functions on a set $J$.

Let $f_1, \ldots, f_n$ be functions in $V$.

Let samples $x_1, \ldots, x_n$ from $J$ be given.

Define sample matrix

$\displaystyle S = \begin{bmatrix} f_1(x_1) & \cdots & f_n(x_1) \\ \vdots & \ddots & \vdots \\ f_1(x_n) & \cdots & f_n(x_n) \\ \end{bmatrix}$

Let $S$ be invertible.

Then $f_1, \ldots, f_n$ are linearly independent in $V$.


Proof

We apply the definition of linear independence.

Assume a linear combination of the functions $f_1,\ldots,f_n$ is the zero function:

$\displaystyle (1): \quad \sum_{i \mathop = 1}^n c_i \, \map {f_i} x = 0$ for all $x$

Let $\vec c$ have components $c_1, \ldots, c_n$.

For $i = 1, \ldots, n$, substitute $x = x_i$ into $(1)$.

This gives a system of $n$ homogeneous linear algebraic equations in $c_1, \ldots, c_n$, written in matrix form as:

$S \vec c = \vec 0$

Because $S$ is invertible, left multiplication by $S^{-1}$ gives $\vec c = S^{-1} \vec 0 = \vec 0$.

Hence every coefficient $c_i$ is zero, and so $f_1, \ldots, f_n$ are linearly independent in $V$.

$\blacksquare$

Examples

Example: Linearly Independent Solutions of $y'' - y = 0$

Prove the linear independence of the solutions $e^x$ and $e^{-x}$ of:

$\displaystyle y'' - y = 0$
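
One way to proceed, sketched here with the sample points $x_1 = 0$ and $x_2 = 1$ (an illustrative choice, not forced by the example), is to form the sample matrix of $e^x$ and $e^{-x}$:

$\displaystyle S = \begin{bmatrix} e^{x_1} & e^{-x_1} \\ e^{x_2} & e^{-x_2} \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ e & e^{-1} \end{bmatrix}$

Then $\det S = e^{-1} - e \ne 0$, so $S$ is invertible and the theorem gives the linear independence of $e^x$ and $e^{-x}$.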


Example: Linear Independence of Powers $1,x,\ldots,x^{n-1}$

Let $V$ be the vector space of all continuous functions on $\R$.

Let $n \ge 1$ be an integer and define:

$\displaystyle S = \set {1, x, \ldots, x^{n-1} }$

Prove that $S$ is a linearly independent subset of $V$.
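
A sketch of how the theorem applies, assuming $n$ distinct sample points $x_1, \ldots, x_n$ are chosen in $\R$ (the choice of distinct points is an assumption of this sketch, and the sample matrix is written $M$ to avoid a clash with the set $S$):

$\displaystyle M = \begin{bmatrix} 1 & x_1 & \cdots & x_1^{n-1} \\ \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & \cdots & x_n^{n-1} \end{bmatrix}$

This is a Vandermonde matrix, with determinant $\displaystyle \prod_{1 \mathop \le i \mathop < j \mathop \le n} \paren {x_j - x_i} \ne 0$ since the sample points are distinct. Hence $M$ is invertible and the theorem gives the linear independence of $1, x, \ldots, x^{n-1}$.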

