Sample Matrix Independence Test

Theorem
Let $V$ be a vector space of real or complex-valued functions on a set $J$.

Let $f_1, \ldots, f_n$ be functions in $V$.

Let samples $x_1, \ldots, x_n$ from $J$ be given.

Define the sample matrix:


 * $\displaystyle S = \begin{bmatrix} f_1(x_1) & \cdots & f_n(x_1) \\ \vdots & \ddots & \vdots \\ f_1(x_n) & \cdots & f_n(x_n) \end{bmatrix}$

Let $S$ be invertible.

Then $f_1, \ldots, f_n$ are linearly independent in $V$.
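As a sketch of the test in action (the particular functions and sample points below are our own choice, not part of the theorem): take $f_1(x) = 1$, $f_2(x) = x$, $f_3(x) = x^2$ with samples $x_1 = 0$, $x_2 = 1$, $x_3 = 2$. The resulting sample matrix is a Vandermonde matrix with nonzero determinant, so the three functions are linearly independent.

```python
import numpy as np

# Functions f_1(x) = 1, f_2(x) = x, f_3(x) = x^2 and samples 0, 1, 2
# (illustrative choices, not fixed by the theorem).
funcs = [lambda x: 1.0, lambda x: x, lambda x: x**2]
samples = [0.0, 1.0, 2.0]

# Sample matrix: row i holds f_1(x_i), ..., f_n(x_i).
S = np.array([[f(x) for f in funcs] for x in samples])

det = np.linalg.det(S)
print(det)  # nonzero (close to 2), so S is invertible
```

Since the determinant is nonzero, $S$ is invertible and the theorem applies: $1, x, x^2$ are linearly independent.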

Proof
Let's apply the definition of linear independence.

Assume a linear combination of the functions $f_1, \ldots, f_n$ is the zero function:


 * $c_1 f_1(x) + \cdots + c_n f_n(x) = 0$ for all $x$ in $J$ $\quad (1)$

Let $\vec c$ have components $c_1, \ldots, c_n$.

For $i = 1, \ldots, n$, substitute $x = x_i$ into (1).

This yields $n$ homogeneous linear algebraic equations in $c_1, \ldots, c_n$, written in matrix form as:


 * $S \vec c = \vec 0$

Since $S$ is invertible, it follows that $\vec c = \vec 0$.

Hence $f_1, \ldots, f_n$ are linearly independent in $V$.
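The contrapositive can also be sketched numerically (again with functions and samples of our own choosing): if the functions are linearly dependent, the same dependency holds among the columns of $S$, so $S$ is singular for any choice of samples.

```python
import numpy as np

# Dependent functions: f_3 = f_1 + f_2, so column 3 of S equals
# column 1 plus column 2 for ANY samples, making S singular.
funcs = [lambda x: 1.0, lambda x: x, lambda x: 1.0 + x]
samples = [0.3, 1.7, 4.2]  # arbitrary sample points

S = np.array([[f(x) for f in funcs] for x in samples])

det = np.linalg.det(S)
print(det)  # essentially zero: S is singular
```

Because no choice of samples can make $S$ invertible here, the test correctly never certifies independence for a dependent family.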

Also see

 * Zero Wronskian of Solutions of Homogeneous Linear Second Order ODE iff Linearly Dependent


 * Definition:Linearly Independent Set of Real Vectors


 * Two Linearly Independent Solutions of Homogeneous Linear Second Order ODE generate General Solution


 * Linearly Independent Solutions of y'' - y = 0