Sample Matrix Independence Test

Theorem

Let $V$ be a vector space of real or complex-valued functions on a set $J$.

Let $f_1, \ldots, f_n$ be functions in $V$.

Let samples $x_1, \ldots, x_n$ from $J$ be given.

Define the sample matrix:

$S = \begin{bmatrix} \map {f_1} {x_1} & \cdots & \map {f_n} {x_1} \\ \vdots & \ddots & \vdots \\ \map {f_1} {x_n} & \cdots & \map {f_n} {x_n} \end{bmatrix}$

Let $S$ be invertible.

Then $f_1, \ldots, f_n$ are linearly independent in $V$.
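
As an illustration (the particular functions and samples here are chosen for this page and are not part of the theorem), take $J = \R$, $f_1 = \cos$, $f_2 = \sin$, $x_1 = 0$ and $x_2 = \dfrac \pi 2$. Then:

$S = \begin{bmatrix} \map \cos 0 & \map \sin 0 \\ \map \cos {\dfrac \pi 2} & \map \sin {\dfrac \pi 2} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

which is invertible, so the theorem gives the linear independence of $\cos$ and $\sin$ on $\R$.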


Proof

We apply the definition of linear independence.

Assume a linear combination of the functions $f_1, \ldots, f_n$ is the zero function:

$\text {(1)}: \quad \ds \sum_{i \mathop = 1}^n c_i \map {f_i} x = 0$ for all $x$

Let $\vec c$ have components $c_1, \ldots, c_n$.

For each $i = 1, \ldots, n$, substitute $x = x_i$ into $(1)$.

This yields $n$ homogeneous linear algebraic equations in the unknowns $c_1, \ldots, c_n$, which can be written in matrix form as:

$S \vec c = \vec 0$
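
Written out in full, this matrix equation is (the expansion is included here only to make the substitution step explicit):

$\begin{bmatrix} \map {f_1} {x_1} & \cdots & \map {f_n} {x_1} \\ \vdots & \ddots & \vdots \\ \map {f_1} {x_n} & \cdots & \map {f_n} {x_n} \end{bmatrix} \begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix} = \begin{bmatrix} 0 \\ \vdots \\ 0 \end{bmatrix}$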

Because $S$ is invertible, multiplying both sides on the left by $S^{-1}$ gives:

$\vec c = \vec 0$

Hence $f_1, \ldots, f_n$ are linearly independent in $V$.

$\blacksquare$


Examples

Example: Linearly Independent Solutions of $y'' - y = 0$

Prove independence of the solutions $e^x$, $e^{-x}$ to:

$y'' - y = 0$
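
A sketch of how the theorem applies (the samples $x_1 = 0$, $x_2 = 1$ are an illustrative choice, not prescribed by the example): the sample matrix is

$S = \begin{bmatrix} e^0 & e^{-0} \\ e^1 & e^{-1} \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ e & e^{-1} \end{bmatrix}$

with determinant $e^{-1} - e \ne 0$, so $S$ is invertible and $e^x$, $e^{-x}$ are linearly independent.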


Example: Linear Independence of Powers $1, x, \ldots, x^{n - 1}$

Let $V$ be the vector space of all continuous functions on $\R$.

Let $n \ge 1$ be an integer and define:

$S = \set {1, x, \ldots, x^{n - 1} }$

$S$ is a linearly independent subset of $V$.
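
A sketch of how the theorem applies (the choice of samples is made here for illustration): pick any $n$ distinct real numbers $x_1, \ldots, x_n$. The sample matrix is then the Vandermonde matrix

$\begin{bmatrix} 1 & x_1 & \cdots & x_1^{n - 1} \\ \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & \cdots & x_n^{n - 1} \end{bmatrix}$

whose determinant is $\ds \prod_{1 \mathop \le i \mathop < j \mathop \le n} \paren {x_j - x_i} \ne 0$ for distinct samples, so the matrix is invertible and $1, x, \ldots, x^{n - 1}$ are linearly independent.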

