Inverse of Vandermonde Matrix/Proof 2


Proof

Definition 1

$V_n = \begin{bmatrix} x_1 & \cdots & x_n \\ x_1^2 & \cdots & x_n^2 \\ \vdots & \ddots & \vdots \\ x_1^{n} & \cdots & x_n^{n} \\ \end{bmatrix} ,\quad V = \begin{bmatrix} 1 & \cdots & 1 \\ x_1 & \cdots & x_n \\ \vdots & \ddots & \vdots \\ x_1^{n-1} & \cdots & x_n^{n-1} \\ \end{bmatrix} \quad $ Vandermonde matrices
$D = \begin{bmatrix} x_1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & x_n \\ \end{bmatrix}, \quad P = \begin{bmatrix} \map {p_1} {x_1} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \map {p_n} {x_n} \\ \end{bmatrix} \quad $ Definition:Diagonal Matrix
$ E = \begin{bmatrix} E_{11} & \cdots & E_{1n} \\ \vdots & \ddots & \vdots \\ E_{n1} & \cdots & E_{nn} \\ \end{bmatrix} \quad $ Matrix of symmetric functions
where for $1 \mathop \le i, j \mathop \le n$:
$E_{ij} = \paren {-1}^{n - j} \, e_{n - j} \paren {\set {x_1, \ldots, x_n} \setminus \set {x_i} } \quad$ Definition:Elementary Symmetric Function $e_m \paren U$
$\map {p_j} x = \displaystyle \prod_{k \mathop = 1, \, k \mathop \neq j}^n \paren {x - x_k}$
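
For example, in the case $n = 2$ these definitions read:

$V_2 = \begin{bmatrix} x_1 & x_2 \\ x_1^2 & x_2^2 \end{bmatrix}, \quad V = \begin{bmatrix} 1 & 1 \\ x_1 & x_2 \end{bmatrix}, \quad D = \begin{bmatrix} x_1 & 0 \\ 0 & x_2 \end{bmatrix}$

$\map {p_1} x = x - x_2, \quad \map {p_2} x = x - x_1, \quad P = \begin{bmatrix} x_1 - x_2 & 0 \\ 0 & x_2 - x_1 \end{bmatrix}$

$E = \begin{bmatrix} -x_2 & 1 \\ -x_1 & 1 \end{bmatrix}$, since, for instance, $E_{11} = \paren {-1}^{2 - 1} \, e_1 \paren {\set {x_2} } = -x_2$ and $E_{12} = \paren {-1}^{2 - 2} \, e_0 \paren {\set {x_2} } = 1$.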

Lemma 1

$V_n = V D$
$V_n^{-1} = D^{-1} V^{-1} \quad$ provided the inverses exist: Inverse of Matrix Product
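
For example, in the case $n = 2$:

$V D = \begin{bmatrix} 1 & 1 \\ x_1 & x_2 \end{bmatrix} \begin{bmatrix} x_1 & 0 \\ 0 & x_2 \end{bmatrix} = \begin{bmatrix} x_1 & x_2 \\ x_1^2 & x_2^2 \end{bmatrix} = V_2$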

$\Box$

Lemma 2

$ EV = P$
$ V^{-1} = P^{-1} E\quad$ provided $\set {x_1,\ldots,x_n}$ is a set of distinct values.
$ V_n^{-1} = D^{-1} V^{-1}\quad$ provided $\set {x_1,\ldots,x_n}$ is a set of distinct values, all nonzero.

Proof of Lemma 2

Since the $\paren {i, j}$ entry of $E V$ is $\displaystyle \sum_{k \mathop = 1}^n E_{ik} \, x_j^{k - 1}$, matrix multiplication establishes $E V = P$ provided that:

$(1)\quad \sum_{k = 1}^n E_{ik}\, x_j^{k-1} = \begin{cases} 0 & i \neq j \\ \map {p_i} {x_i} & i = j \end{cases} $

The polynomial $\map {p_i} x$ is zero for $x \in \set {x_1, \ldots, x_n} \setminus \set {x_i}$, so the right hand side of $(1)$ can be written uniformly as $\map {p_i} {x_j}$.

Then $(1)$ is equivalent to:

$(2)\quad \sum_{k = 1}^n \paren { -1 }^{n-k} e_{n-k} \paren { \set {x_1,\ldots,x_n} \setminus \set {x_i} } \, x_j^{k-1} = \map {p_i} {x_j} $

Apply Viète's Formulas to the monic polynomial $\map {p_i} u$ of degree $n - 1$:

$(3)\quad \sum_{k = 1}^{n} \paren {-1}^{n-k} e_{n-k} \paren { \set {x_1,\ldots,x_n} \setminus \set {x_i} } \, u^{k-1} = \map {p_i} u $
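
For instance, in the case $n = 3$ with $i = 1$:

$\map {p_1} u = \paren {u - x_2} \paren {u - x_3} = u^2 - \paren {x_2 + x_3} u + x_2 x_3$

which is precisely the left hand side of $(3)$, since $e_0 \paren {\set {x_2, x_3} } = 1$, $e_1 \paren {\set {x_2, x_3} } = x_2 + x_3$ and $e_2 \paren {\set {x_2, x_3} } = x_2 x_3$.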

Substitute $u = x_j$ into (3), proving (2) holds.

Thus $(2)$ holds, and hence $(1)$ holds, proving $E V = P$.
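
For example, in the case $n = 2$:

$E V = \begin{bmatrix} -x_2 & 1 \\ -x_1 & 1 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ x_1 & x_2 \end{bmatrix} = \begin{bmatrix} x_1 - x_2 & 0 \\ 0 & x_2 - x_1 \end{bmatrix} = \begin{bmatrix} \map {p_1} {x_1} & 0 \\ 0 & \map {p_2} {x_2} \end{bmatrix} = P$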

Assume $\set {x_1,\ldots,x_n}$ is a set of distinct values.

Then each diagonal entry $\map {p_i} {x_i} = \displaystyle \prod_{k \mathop = 1, \, k \mathop \neq i}^n \paren {x_i - x_k}$ of $P$ is nonzero, so $\det \paren P$ is nonzero.

By Matrix is Invertible iff Determinant has Multiplicative Inverse, $P$ has an inverse $P^{-1}$.

Multiply $E V = P$ on the left by $P^{-1}$, so that $P^{-1} E$ is a left inverse of $V$; then:

$V^{-1} = P^{-1} E\quad$ Left or Right Inverse of Matrix is Inverse
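
For example, in the case $n = 2$:

$P^{-1} E = \begin{bmatrix} \dfrac 1 {x_1 - x_2} & 0 \\ 0 & \dfrac 1 {x_2 - x_1} \end{bmatrix} \begin{bmatrix} -x_2 & 1 \\ -x_1 & 1 \end{bmatrix} = \dfrac 1 {x_1 - x_2} \begin{bmatrix} -x_2 & 1 \\ x_1 & -1 \end{bmatrix}$

and a direct multiplication confirms that $\paren {P^{-1} E} V$ is the $2 \times 2$ identity matrix.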

Similarly, $D^{-1}$ exists provided $\set {x_1,\ldots,x_n}$ is a set of nonzero values.

Then $V_n^{-1} = D^{-1} V^{-1}$ by Lemma 1.


$\Box$

Proof of the Theorem

Assume $\set {x_1,\ldots,x_n}$ is a set of distinct values.

Let $d_{ij}$ denote the entries in $V^{-1}$. Then:

$V^{-1} = P^{-1} E \quad$ Lemma 2
$d_{ij} = \dfrac {E_{ij} } {\map {p_i} {x_i} } \quad$ matrix multiplication
$\phantom {d_{ij} } = \dfrac {\paren {-1}^{n-j} \, e_{n-j} \paren {\set {x_1, \ldots, x_n} \setminus \set {x_i} } } {\map {p_i} {x_i} } \quad$ Definition 1

Then:

$(4)\quad d_{ij} = \paren {-1}^{n-j} \, \dfrac {\displaystyle \sum_{\substack {1 \mathop \le m_1 \mathop \lt \cdots \mathop \lt m_{n-j} \mathop \le n \\ m_1, \ldots, m_{n-j} \mathop \neq i} } x_{m_1} \cdots x_{m_{n-j} } } {\displaystyle \prod_{\substack {1 \mathop \le m \mathop \le n \\ m \mathop \neq i} } \paren {x_i - x_m} } \quad$ Definition:Elementary Symmetric Function $e_m \paren U$ and Definition 1 equation for $\map {p_i} x$
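
For example, in the case $n = 2$, equation $(4)$ gives:

$d_{11} = \paren {-1}^{2 - 1} \, \dfrac {x_2} {x_1 - x_2} = \dfrac {x_2} {x_2 - x_1}, \quad d_{12} = \paren {-1}^{2 - 2} \, \dfrac 1 {x_1 - x_2} = \dfrac 1 {x_1 - x_2}$

which agrees with the first row of $V^{-1}$ obtained by inverting the $2 \times 2$ matrix $V$ directly.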

Assume $\set {x_1,\ldots,x_n}$ is a set of distinct values, all nonzero.

Let $b_{ij}$ denote the entries in $V_n^{-1}$. Then:

$V_n^{-1} = D^{-1} V^{-1} \quad$ Lemma 1
$b_{ij} = \dfrac 1 {x_i} \, d_{ij} \quad$ matrix multiplication
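
For example, in the case $n = 2$:

$b_{11} = \dfrac 1 {x_1} \, d_{11} = \dfrac {x_2} {x_1 \paren {x_2 - x_1} }$

which agrees with the $\paren {1, 1}$ entry of the direct inversion:

$V_2^{-1} = \dfrac 1 {x_1 x_2 \paren {x_2 - x_1} } \begin{bmatrix} x_2^2 & -x_2 \\ -x_1^2 & x_1 \end{bmatrix}$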

Factor $\paren {-1}^{n-1}$ from the denominator of $(4)$, using $\displaystyle \prod_{m \mathop \neq i} \paren {x_i - x_m} = \paren {-1}^{n-1} \prod_{m \mathop \neq i} \paren {x_m - x_i}$, to put the result in the form given by Knuth (1997).

$\blacksquare$


Sources