Inverse of Vandermonde Matrix

Theorem
Let $V_n$ be the Vandermonde matrix of order $n$ given by:


 * $V_n = \begin{bmatrix} x_1 & x_2 & \cdots & x_n \\ x_1^2 & x_2^2 & \cdots & x_n^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^n & x_2^n & \cdots & x_n^n \end{bmatrix}$

Then its inverse $V_n^{-1} = \left[{b}\right]_n$ can be specified as:


 * $b_{ij} = \dfrac {\displaystyle \sum_{\stackrel{1 \le k_1 < \ldots < k_{n-j} \le n} {k_1, \ldots, k_{n-j} \ne i}} \left({-1}\right)^{j-1} x_{k_1} \ldots x_{k_{n-j}}} {\displaystyle x_i \prod_{\stackrel {1 \le k \le n} {k \ne i}} \left({x_k - x_i}\right)}$
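The closed form above can be checked numerically. The sketch below (with arbitrarily chosen distinct nodes and a hypothetical helper name `vandermonde_inverse`) builds $V_n$ with entries $x_j^i$ as in the statement, evaluates $b_{ij}$ via elementary symmetric polynomials of the nodes other than $x_i$, and confirms $V_n B = I$:

```python
import itertools
import numpy as np

def vandermonde_inverse(xs):
    """Inverse of V with V[i][j] = xs[j]**(i+1) (0-based storage), from
    b_ij = (-1)**(j-1) * e_{n-j}({x_k : k != i}) / (x_i * prod_{k != i}(x_k - x_i)),
    where e_r is the elementary symmetric polynomial of degree r."""
    n = len(xs)
    B = np.empty((n, n))
    for i in range(1, n + 1):
        others = [xs[k] for k in range(n) if k != i - 1]
        denom = xs[i - 1] * np.prod([xk - xs[i - 1] for xk in others])
        for j in range(1, n + 1):
            # e_{n-j}: sum over all (n-j)-element subsets of the remaining nodes
            e = sum(np.prod(c) for c in itertools.combinations(others, n - j))
            B[i - 1, j - 1] = (-1) ** (j - 1) * e / denom
    return B

xs = [2.0, 3.0, 5.0, 7.0]                  # arbitrary distinct nodes
V = np.array([[x ** (i + 1) for x in xs] for i in range(len(xs))])
print(np.allclose(V @ vandermonde_inverse(xs), np.eye(len(xs))))  # True
```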

Proof
The following proof considers the classical form of the Vandermonde matrix; it should be adapted to the form used in the theorem, or the theorem reformulated accordingly:


 * $V_n = \begin{bmatrix} 1 & x_0 & \cdots & x_0^n \\ 1 & x_1 & \cdots & x_1^n \\ \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & \cdots & x_n^n \end{bmatrix}$

Write $B = \left[{b_{ij}}\right]$ for the inverse matrix, whose existence is guaranteed by the non-vanishing of the determinant of $V_n$:
 * $\displaystyle \det \left({V_n}\right) = \prod_{0 \le i < j \le n} \left({x_j - x_i}\right) \ne 0$
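A quick numerical check of this determinant (for arbitrarily chosen distinct nodes, using NumPy's `np.vander` for the classical matrix and the product of pairwise node differences, up to the usual sign convention):

```python
import itertools
import numpy as np

xs = np.array([1.0, 2.0, 4.0, 7.0])        # any distinct nodes x_0, ..., x_n
V = np.vander(xs, increasing=True)          # row i is (1, x_i, ..., x_i^n)
det = np.prod([xs[j] - xs[i] for i, j in itertools.combinations(range(len(xs)), 2)])
print(np.isclose(np.linalg.det(V), det))    # True (nonzero since the nodes are distinct)
```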

$B$ satisfies the following linear system:
 * $\displaystyle \sum_{k=0}^n b_{kj}x_i^k=\delta_{ij}$

Therefore:
 * $\displaystyle \sum_{k=0}^n b_{kj}x^k$ interpolates $(x_0,0), \ldots, (x_{j-1},0), (x_j,1), (x_{j+1},0), \ldots, (x_n,0)$

In other words, the $j^\text{th}$ column of $B$ consists of the coefficients of the $j^\text{th}$ polynomial of the Lagrange interpolation basis:


 * $\forall j: \forall x: \displaystyle\sum_{k=0}^n b_{kj}x^k=\prod_{\stackrel {0 \le m \le n} {m \ne j}}\frac{x-x_m}{x_j-x_m}$
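This identification can be illustrated numerically: invert the classical matrix and compare each column of $B$ with the coefficients of the corresponding Lagrange basis polynomial. A sketch with arbitrarily chosen nodes (`np.poly` returns the coefficients of the monic polynomial with the given roots, highest degree first):

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 4.0])       # arbitrary distinct nodes x_0, ..., x_n
V = np.vander(xs, increasing=True)        # classical Vandermonde, row i = (1, x_i, ..., x_i^n)
B = np.linalg.inv(V)

for j in range(len(xs)):
    others = np.delete(xs, j)
    # coefficients of L_j(x) = prod_{m != j} (x - x_m) / (x_j - x_m),
    # reversed to increasing degree to match the columns of B
    coeffs = np.poly(others)[::-1] / np.prod(xs[j] - others)
    assert np.allclose(B[:, j], coeffs)
print("columns of B match the Lagrange basis")
```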

Identifying the $k^\text{th}$ order coefficient in these two polynomial representations yields:


 * $b_{kj} = \left({-1}\right)^{n-k} \dfrac {\displaystyle \sum_{\stackrel{0 \le m_1 < \ldots < m_{n-k} \le n} {m_1, \ldots, m_{n-k} \ne j}} x_{m_1} \cdots x_{m_{n-k}}} {\displaystyle \prod_{\stackrel {0 \le m \le n} {m \ne j}} \left({x_j - x_m}\right)}$
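Under the same conventions, the coefficient formula $b_{kj} = \left({-1}\right)^{n-k} e_{n-k} \left({\left\{ {x_m : m \ne j}\right\} }\right) / \prod_{m \ne j} \left({x_j - x_m}\right)$, with $e_r$ the elementary symmetric polynomial of degree $r$, can be checked entry by entry against a numerically computed inverse (a sketch with arbitrarily chosen nodes):

```python
import itertools
import numpy as np

xs = [1.0, 2.0, 3.0, 5.0]                  # arbitrary distinct nodes x_0, ..., x_n
n = len(xs) - 1
B = np.linalg.inv(np.vander(np.array(xs), increasing=True))

for j in range(n + 1):
    others = [xs[m] for m in range(n + 1) if m != j]
    denom = np.prod([xs[j] - xm for xm in others])
    for k in range(n + 1):
        # e_{n-k} of the nodes other than x_j
        e = sum(np.prod(c) for c in itertools.combinations(others, n - k))
        assert np.isclose(B[k, j], (-1) ** (n - k) * e / denom)
print("closed-form coefficients match the inverse")
```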