Talk:Inverse of Vandermonde Matrix

Does anyone know if Knuth gives details of the proof? The line:

Identifying the $k^\text{th}$ order coefficient in these two polynomials yields:
 * $b_{kj} = (-1)^{n-k-1} \left({\dfrac{\displaystyle \sum_{\stackrel{0 \mathop \le m_0 \mathop < \ldots \mathop < m_{n-k} \mathop \le n} {m_0, \ldots, m_{n-k} \mathop \ne j} } x_{m_0} \cdots x_{m_{n-k}} } {\displaystyle \prod_{\stackrel {0 \mathop \le m \mathop \le n} {m \mathop \ne j} } \left({x_j - x_m}\right)}}\right)$

isn't correct (I think); the indices are one out of sync, and I can't think of a tidy way of correcting it without defining the empty sum to be $1$, not $0$. --Linus44 (talk) 23:20, 12 October 2012 (UTC)


 * Actually it's fine. I'm going to stop doing maths at half 1 now. --Linus44 (talk) 23:22, 12 October 2012 (UTC)

The matrix $V_n$ is $n \times n$ and the matrix $W_n$ is $(n+1) \times (n+1)$... You can't write $V_n = \operatorname{diag}(x_i) W_n$ with $i = 1, \ldots, n$. --Ostrogradsky (talk) 14:16, 12 March 2015 (UTC)


 * Good call. I'll put a maintenance tag on it as that needs to be addressed. Thanks. --prime mover (talk) 17:04, 12 March 2015 (UTC)

I would be very grateful if someone could check for mistakes in the changes that I've made, although I've looked really carefully and checked results with MATLAB. One can never be too certain. --Riddler (talk) 21:55, 11 May 2015 (UTC)


 * Okay, thx -- I'll go over it in due course and give it a rewrite so as to put it back into its original form (as found in Knuth). --prime mover (talk) 22:03, 11 May 2015 (UTC)

Proposed new proof
The current proof is hard to read, so I wrote up a simpler one. I haven't contributed to ProofWiki before, so I decided to go easy and just put it here on the talk page. If nobody objects or instructs otherwise, I'll replace the old proof after some time. An admin could naturally do it right away if all's good. -- User:Tbackstr 4.1.2017


 * PLEASE NOTE: We never replace proofs with new ones unless the old proof is defective. We add new proofs, so that a given theorem may have more than one proof assigned to it.


 * Please feel free to browse -- you will, sooner or later, find one or two theorems with multiple proofs. Then you will see how it is arranged. --prime mover (talk) 15:26, 4 January 2017 (EST)

Theorem
Let $V$ be the Vandermonde matrix of order $n$ given by:


 * $V = \begin{bmatrix} 1 & 1 & \cdots & 1 \\ x_1 & x_2 & \cdots & x_n \\ x_1^2 & x_2^2 & \cdots & x_n^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{n-1} & x_2^{n-1} & \cdots & x_n^{n-1} \end{bmatrix}$

Then its inverse $V^{-1} = B$ can be specified as:


 * $\displaystyle B_{k,h} = (-1)^{h-1} \frac{\displaystyle \sum_{\substack{1 \leq m_1 < m_2 < \cdots < m_{n-h} \leq n \\ m_i \neq k}} x_{m_1} x_{m_2} \cdots x_{m_{n-h}}}{\displaystyle \prod_{\substack{i = 1 \\ i \neq k}}^n (x_i - x_k)}$

The inverse exists if $x_k\neq x_h$ for all $k\neq h$.
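In the spirit of the MATLAB checks mentioned earlier in this thread, this closed form can be sanity-checked in exact rational arithmetic. The sketch below (Python standard library only; the sample points are an arbitrary choice) uses 0-based indices, so the $(-1)^{h-1}$ above becomes $(-1)^h$, the symmetric sum has degree $n - 1 - h$, and the denominator runs over $i \neq k$ as in the proof's scaling step:

```python
from fractions import Fraction
from itertools import combinations
from math import prod

def elem_sym(vals, m):
    # e_m(vals): sum of products over all m-element subsets; e_0 = 1
    return sum((prod(c) for c in combinations(vals, m)), Fraction(0))

def vandermonde(xs):
    # V[i][j] = xs[j]**i: column j holds the powers of the point xs[j]
    n = len(xs)
    return [[xs[j] ** i for j in range(n)] for i in range(n)]

def inverse_by_formula(xs):
    # 0-based version of the closed form:
    # B[k][h] = (-1)^h * e_{n-1-h}({x_i : i != k}) / prod_{i != k} (x_i - x_k)
    n = len(xs)
    B = []
    for k in range(n):
        others = [xs[i] for i in range(n) if i != k]
        denom = prod(xs[i] - xs[k] for i in range(n) if i != k)
        B.append([(-1) ** h * elem_sym(others, n - 1 - h) / denom
                  for h in range(n)])
    return B

xs = [Fraction(v) for v in (2, 3, 5, 7)]   # arbitrary distinct points
V = vandermonde(xs)
B = inverse_by_formula(xs)
# B V should be exactly the identity matrix
BV = [[sum(B[k][h] * V[h][j] for h in range(4)) for j in range(4)]
      for k in range(4)]
assert BV == [[Fraction(k == j) for j in range(4)] for k in range(4)]
```

Exact `Fraction` arithmetic avoids the floating-point round-off that makes Vandermonde inverses notoriously ill-conditioned numerically.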

Proof
We proceed in two steps: first, construct a matrix which diagonalizes $V$ when multiplied from the left; second, scale the resulting diagonal matrix to the identity.

Define polynomials $A_k(z)$, with coefficients $a_{k,h}$, having zeros at every $x_h$ except $x_k$:
 * $\displaystyle A_k(z) := \sum_{h=1}^n a_{k,h} z^{h-1} = \prod_{\substack{m=1\\m\neq k}}^n (x_m - z).$

where the coefficients are defined as
 * $\displaystyle a_{k,h} := (-1)^{h-1} \sum_{\substack{1 \leq m_1 < m_2 < \cdots < m_{n-h} \leq n \\ m_i \neq k}} x_{m_1} x_{m_2} \cdots x_{m_{n-h}}.$
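This coefficient formula can be checked by expanding $\prod_{m \neq k} (x_m - z)$ directly and comparing coefficients. A sketch in Python (standard library only, exact rationals, arbitrary sample points; with 0-based $h$ the claimed $z^h$ coefficient is $(-1)^h$ times the elementary symmetric sum of degree $n - 1 - h$):

```python
from fractions import Fraction
from itertools import combinations
from math import prod

xs = [Fraction(v) for v in (1, 2, 4, 8, 16)]  # arbitrary distinct points
n = len(xs)

for k in range(n):
    others = [xs[m] for m in range(n) if m != k]
    # expand prod_{m != k} (x_m - z) one factor at a time;
    # coeffs[h] is the coefficient of z^h
    coeffs = [Fraction(1)]
    for xm in others:
        new = [Fraction(0)] * (len(coeffs) + 1)
        for h, c in enumerate(coeffs):
            new[h] += xm * c     # contribution of xm * (c z^h)
            new[h + 1] -= c      # contribution of -z * (c z^h)
        coeffs = new
    # compare with the claimed closed form for every coefficient
    for h in range(n):
        e = sum((prod(c) for c in combinations(others, n - 1 - h)), Fraction(0))
        assert coeffs[h] == (-1) ** h * e
```

The constant term of each expansion is $\prod_{m \neq k} x_m$ and the leading coefficient is $(-1)^{n-1}$, matching the formula at $h = 1$ and $h = n$ (1-based).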

It follows that $A_k(x_h) = 0$ for all $h \neq k$, and if $A$ is the matrix with coefficients $a_{k,h}$, then
 * $AV = \begin{bmatrix} A_1(x_1) & 0 & \cdots & 0 \\ 0 & A_2(x_2) & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \cdots & 0 & A_n(x_n) \end{bmatrix}.$

By defining a diagonal matrix $D$ as
 * $D = \begin{bmatrix} A_1^{-1}(x_1) & 0 & \cdots & 0 \\ 0 & A_2^{-1}(x_2) & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \cdots & 0 & A_n^{-1}(x_n) \end{bmatrix},$

we can define the inverse as
 * $B:=DA.$

Note that the scalar inverses in $D$ exist if $A_k(x_k) \neq 0$, that is, if $x_k \neq x_h$ for all $k \neq h$. Since the determinant of $V$ is
 * $\det(V) = \prod_{1 \le i < j \le n} (x_j - x_i),$

this is precisely the condition $\det(V) \neq 0$: hence $D$ is well-defined and the inverse exists if and only if the $x_k$ are distinct.
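The two steps of the proof can be mirrored computationally: build $A$ from the coefficients of the $A_k$, check that $AV$ is diagonal, then rescale to get $B = DA$ with $BV = I$. A sketch (Python standard library, exact rationals, arbitrary distinct sample points):

```python
from fractions import Fraction

xs = [Fraction(v) for v in (1, 3, 4, 9)]   # arbitrary distinct points
n = len(xs)
V = [[xs[j] ** i for j in range(n)] for i in range(n)]  # V[i][j] = x_j^i

def a_row(k):
    # coefficients of A_k(z) = prod_{m != k} (x_m - z), lowest degree first
    coeffs = [Fraction(1)]
    for m in range(n):
        if m == k:
            continue
        new = [Fraction(0)] * (len(coeffs) + 1)
        for h, c in enumerate(coeffs):
            new[h] += xs[m] * c   # multiply by x_m
            new[h + 1] -= c       # multiply by -z
        coeffs = new
    return coeffs

A = [a_row(k) for k in range(n)]

# step 1: AV is diagonal, with A_k(x_k) on the diagonal
AV = [[sum(A[k][h] * V[h][j] for h in range(n)) for j in range(n)]
      for k in range(n)]
assert all(AV[k][j] == 0 for k in range(n) for j in range(n) if j != k)

# step 2: scaling row k by A_k(x_k)^{-1} gives B = DA with BV = I
B = [[A[k][h] / AV[k][k] for h in range(n)] for k in range(n)]
BV = [[sum(B[k][h] * V[h][j] for h in range(n)) for j in range(n)]
      for k in range(n)]
assert BV == [[Fraction(k == j) for j in range(n)] for k in range(n)]
```

The entry $AV_{kj} = \sum_h a_{k,h} x_j^{h-1} = A_k(x_j)$, which vanishes for $j \neq k$ by construction; that is exactly what step 1 asserts.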

28 Oct, 1 Nov 2019
The statement of the theorem is missing the hypothesis of distinct values. The proof finds the inverse of $W_n$, then finds the inverse of $V_n$ by matrix multiply. The Theorem could contain the inverse identities for both $W_n$ and $V_n$.

Below is a Corollary statement that collects the results actually proved, expressed with elementary symmetric functions. The original proof applies unchanged, so nothing needs to be edited. The symbol $b_{ij}$ matches its use in the original proof; the overloading of that symbol in the original proof burdens the reader, and the Corollary avoids it by introducing a separate symbol.

My plan: insert the corollary as Inverse of Vandermonde Matrix/Corollary 1 and edit Inverse of Vandermonde Matrix with an include. No proof details to be added. The last two references above by User:Tbackstr 4.1.2017 will be added to Inverse of Vandermonde Matrix.

Corollary 1
Define for variables $\set {y_1, \ldots, y_k}$ the elementary symmetric functions:
 * $\displaystyle \map {e_m} {\set {y_1, \ldots, y_k} } = \sum_{1 \le i_1 < \cdots < i_m \le k} y_{i_1} \cdots y_{i_m}, \quad m = 0, 1, 2, \ldots$

Let $\set {x_1,\ldots,x_n}$ be a set of distinct values.

Let $W_n$ and $V_n$ be Vandermonde matrices of order $n$:



 * $W_n = \begin{bmatrix} 1 & x_1 & \cdots & x_1^{n-1} \\ 1 & x_2 & \cdots & x_2^{n-1} \\ \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & \cdots & x_n^{n-1} \end{bmatrix}, \quad V_n = \begin{bmatrix} x_1 & x_2 & \cdots & x_n \\ x_1^2 & x_2^2 & \cdots & x_n^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^n & x_2^n & \cdots & x_n^n \end{bmatrix}$

Let matrix inverses be written as $W_n^{-1} = \begin{bmatrix} b_{ij} \end{bmatrix}$ and $V_n^{-1} = \begin{bmatrix} c_{ij} \end{bmatrix}$. Then: