Inverse of Vandermonde Matrix

Theorem
Let $V_n$ be the Vandermonde matrix of order $n$ given by:


 * $V_n = \begin{bmatrix} x_1 & x_2 & \cdots & x_n \\ x_1^2 & x_2^2 & \cdots & x_n^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^n & x_2^n & \cdots & x_n^n \end{bmatrix}$

Then its inverse $V_n^{-1} = \left[{b}\right]_n$ can be specified as:


 * $b_{ij} = \dfrac {\displaystyle \sum_{\stackrel{1 \le k_1 < \ldots < k_{n-j} \le n} {k_1, \ldots, k_{n-j} \ne i}} \left({-1}\right)^{j-1} x_{k_1} \ldots x_{k_{n-j}}} {\displaystyle x_i \prod_{\stackrel {1 \le k \le n} {k \ne i}} \left({x_k - x_i}\right)}$
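
For instance, for $n = 2$ the formula gives:


 * $V_2^{-1} = \begin{bmatrix} \dfrac {x_2} {x_1 \left({x_2 - x_1}\right)} & \dfrac {-1} {x_1 \left({x_2 - x_1}\right)} \\ \dfrac {x_1} {x_2 \left({x_1 - x_2}\right)} & \dfrac {-1} {x_2 \left({x_1 - x_2}\right)} \end{bmatrix}$

which can be verified directly by multiplying against $V_2 = \begin{bmatrix} x_1 & x_2 \\ x_1^2 & x_2^2 \end{bmatrix}$.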

Proof
The following proof should be adapted to the specific form of the matrix given in the theorem, or the theorem reformulated accordingly. In its current state, the proof considers the classical form of the Vandermonde matrix:


 * $V_n = \begin{bmatrix} 1 & x_1 & \cdots & x_1^{n-1} \\ 1 & x_2 & \cdots & x_2^{n-1} \\ \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & \cdots & x_n^{n-1} \end{bmatrix}$

Write $B = \left[{b_{ij}}\right]$ for the inverse matrix, whose existence is guaranteed by the non-vanishing of the determinant $\det \left({V_n}\right) = \displaystyle \prod_{1 \le i < j \le n} \left({x_j - x_i}\right) \ne 0$, the $x_i$ being pairwise distinct.

Then $B$ satisfies the linear system $\displaystyle \sum_{k = 1}^n b_{kj} x_i^{k - 1} = \delta_{ij}$.

Therefore, for each $j$, the polynomial $\displaystyle \sum_{k = 1}^n b_{kj} x^{k - 1}$ interpolates the points $\left({x_1, 0}\right), \ldots, \left({x_{j - 1}, 0}\right), \left({x_j, 1}\right), \left({x_{j + 1}, 0}\right), \ldots, \left({x_n, 0}\right)$.

In other words, the $j^\text{th}$ column of $B$ consists of the coefficients of the $j^\text{th}$ polynomial of the Lagrange interpolation basis:

$\forall j, \forall x: \displaystyle \sum_{k = 1}^n b_{kj} x^{k - 1} = \prod_{\stackrel {1 \le m \le n} {m \ne j}} \frac {x - x_m} {x_j - x_m}$.
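
This claim is easy to check numerically. The following is a minimal sketch (assuming NumPy is available; the nodes are an arbitrary choice of distinct values, and indices are $0$-based in the code while the text is $1$-based):

```python
import numpy as np

# Distinct interpolation nodes (arbitrary choice for the check).
x = np.array([0.5, 1.0, 2.0, 3.5])
n = len(x)

# Classical Vandermonde matrix: W[i, k] = x_i^k for k = 0 .. n-1.
W = np.vander(x, increasing=True)
B = np.linalg.inv(W)

for j in range(n):
    # Lagrange basis polynomial ell_j(x) = prod_{m != j} (x - x_m) / (x_j - x_m),
    # computed independently from its roots.
    others = np.delete(x, j)
    num = np.poly(others)                   # monic polynomial with the other nodes as roots
    ell = num / np.prod(x[j] - others)      # coefficients, highest degree first
    # Column j of B holds the same coefficients, lowest degree first.
    assert np.allclose(B[:, j], ell[::-1])

print("columns of the inverse match the Lagrange basis coefficients")
```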

Identifying the coefficient of $x^{k - 1}$ in these two polynomial representations yields:

$b_{kj} = \left({-1}\right)^{n - k} \dfrac {\displaystyle \sum_{\stackrel {1 \le m_1 < \ldots < m_{n - k} \le n} {m_1, \ldots, m_{n - k} \ne j}} x_{m_1} \dots x_{m_{n - k}}} {\displaystyle \prod_{\stackrel {1 \le m \le n} {m \ne j}} \left({x_j - x_m}\right)}$
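
The same kind of numerical sanity check applies to this closed form. A minimal sketch (again assuming NumPy, with $0$-based indices in the code, so the sign reads $\left({-1}\right)^{n - 1 - k}$):

```python
from itertools import combinations
from math import prod

import numpy as np

x = [0.5, 1.0, 2.0, 3.5]                    # distinct nodes (arbitrary choice)
n = len(x)

def elem_sym(s, vals):
    """Elementary symmetric polynomial e_s over vals, with e_0 = 1."""
    return sum(prod(c) for c in combinations(vals, s))

B = np.linalg.inv(np.vander(x, increasing=True))

for j in range(n):
    others = [x[m] for m in range(n) if m != j]
    denom = prod(x[j] - xm for xm in others)
    for k in range(n):
        b_kj = (-1) ** (n - 1 - k) * elem_sym(n - 1 - k, others) / denom
        assert np.isclose(B[k, j], b_kj)

print("closed-form entries agree with the numerical inverse")
```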