Sample Matrix Independence Test

Theorem
Let $p_1, \ldots, p_n$ be polynomials.

Let real number samples $x_1, \ldots, x_n$ be given.

Define sample matrix


 * $S = \paren {\begin{smallmatrix} \map {p_1} {x_1} & \cdots & \map {p_n} {x_1} \\ \vdots & \ddots & \vdots \\ \map {p_1} {x_n} & \cdots & \map {p_n} {x_n} \end{smallmatrix} }$

If $S$ is invertible, then $p_1,\ldots,p_n$ are linearly independent.
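As an illustration (not part of the theorem), the sample matrix can be checked numerically. The sketch below, under the assumption that the polynomials are the monomials $1, x, x^2$ and the samples are $0, 1, 2$, builds $S$ and verifies it has nonzero determinant, so those monomials are linearly independent:

```python
import numpy as np

# Hypothetical example: p_1(x) = 1, p_2(x) = x, p_3(x) = x^2,
# sampled at x_1 = 0, x_2 = 1, x_3 = 2.
polys = [lambda x: 1.0 + 0.0 * x, lambda x: x, lambda x: x**2]
samples = np.array([0.0, 1.0, 2.0])

# Sample matrix S with entry S[i, j] = p_j(x_i).
S = np.array([[p(x) for p in polys] for x in samples])

# S is the Vandermonde matrix of 0, 1, 2; its determinant is 2, hence nonzero.
print(np.linalg.det(S))
```

A nonzero determinant is exactly the invertibility hypothesis of the theorem, so the conclusion of linear independence applies.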

Corollary
Let $V$ be a vector space of real-valued functions on a set $J$.

Let $f_1, \ldots, f_n$ be functions in $V$.

Let samples $x_1, \ldots, x_n$ from $J$ be given.

Define sample matrix


 * $S = \paren {\begin{smallmatrix} \map {f_1} {x_1} & \cdots & \map {f_n} {x_1} \\ \vdots & \ddots & \vdots \\ \map {f_1} {x_n} & \cdots & \map {f_n} {x_n} \end{smallmatrix} }$

If $S$ is invertible, then $f_1, \ldots, f_n$ are linearly independent in $V$.
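The corollary can likewise be illustrated numerically. The sketch below, under the assumption that $J = \R$, $f_1 = \sin$, $f_2 = \cos$, and the samples are $0$ and $\pi/2$, produces the sample matrix $\paren {\begin{smallmatrix} 0 & 1 \\ 1 & 0 \end{smallmatrix} }$, which is invertible, so $\sin$ and $\cos$ are linearly independent:

```python
import numpy as np

# Hypothetical example: f_1 = sin, f_2 = cos on J = R,
# sampled at x_1 = 0 and x_2 = pi/2.
funcs = [np.sin, np.cos]
samples = np.array([0.0, np.pi / 2])

# Sample matrix S with entry S[i, j] = f_j(x_i); here S = [[0, 1], [1, 0]].
S = np.array([[f(x) for f in funcs] for x in samples])

# det(S) = -1, so S is invertible and sin, cos are linearly independent.
print(np.linalg.det(S))
```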

Proof
Apply the definition of linear independence.

Assume a linear combination of the polynomials is the zero polynomial:


 * $\displaystyle \sum_{i \mathop = 1}^n c_i \, \map {p_i} x = 0$ for all $x$

Let $\vec c$ have components $c_1, \ldots, c_n$.

For $i = 1, \ldots, n$, substitute $x = x_i$ into the linear combination equality.

The result is a system of $n$ homogeneous linear equations, written in matrix form as:


 * $S \vec c = \vec 0$

Because $S$ is invertible, $\vec c = S^{-1} \vec 0 = \vec 0$.
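This final step can be sketched numerically: for an invertible $S$, solving $S \vec c = \vec 0$ returns only the zero vector. The matrix below is a hypothetical example (the sample matrix of $1, x, x^2$ at $0, 1, 2$), not part of the proof:

```python
import numpy as np

# Hypothetical invertible sample matrix (1, x, x^2 sampled at 0, 1, 2).
S = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 1.0],
              [1.0, 2.0, 4.0]])

# Invertibility of S forces the unique solution c = S^{-1} 0 = 0.
c = np.linalg.solve(S, np.zeros(3))
print(c)
```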

Hence the polynomials $p_1, \ldots, p_n$ are linearly independent.

The corollary follows by the same argument, with the functions $f_1, \ldots, f_n$ in place of the polynomials and the samples taken from $J$.