Vandermonde Matrix Identity for Cauchy Matrix

Theorem
Assume values $\set {x_1, \ldots, x_n, y_1, \ldots, y_n}$ are distinct in the Cauchy matrix:


 * $C = \begin {pmatrix} \dfrac 1 {x_1 - y_1} & \cdots & \dfrac 1 {x_1 - y_n} \\ \vdots & \ddots & \vdots \\ \dfrac 1 {x_n - y_1} & \cdots & \dfrac 1 {x_n - y_n} \end {pmatrix}$

Then:


 * $C = -P \, {V_x}^{-1} V_y \, Q^{-1}$

Definitions of Vandermonde matrices $V_x$, $V_y$ and diagonal matrices $P$, $Q$:


 * $V_x = \begin {pmatrix} 1 & 1 & \cdots & 1 \\ x_1 & x_2 & \cdots & x_n \\ \vdots & \vdots & \ddots & \vdots \\ {x_1}^{n - 1} & {x_2}^{n - 1} & \cdots & {x_n}^{n - 1} \end {pmatrix}, \quad V_y = \begin {pmatrix} 1 & 1 & \cdots & 1 \\ y_1 & y_2 & \cdots & y_n \\ \vdots & \vdots & \ddots & \vdots \\ {y_1}^{n - 1} & {y_2}^{n - 1} & \cdots & {y_n}^{n - 1} \end {pmatrix}$ Vandermonde matrices


 * $P = \begin {pmatrix} \map {p_1} {x_1} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \map {p_n} {x_n} \end {pmatrix}, \quad Q = \begin {pmatrix} \map p {y_1} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \map p {y_n} \end {pmatrix}$ Diagonal matrices

Definitions of polynomials $p, p_1, \ldots, p_n$:


 * $\ds \map p x = \prod_{i \mathop = 1}^n \paren {x - x_i}$


 * $\ds \map {p_k} x = \dfrac {\map p x} {x - x_k} = \prod_{i \mathop = 1, i \mathop \ne k}^n \paren {x - x_i}$, $1 \mathop \le k \mathop \le n$
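The definitions above can be sanity-checked numerically. The following NumPy sketch assumes the Cauchy matrix convention $c_{i j} = \dfrac 1 {x_i - y_j}$ and checks the identity $C = -P \, {V_x}^{-1} V_y \, Q^{-1}$, which follows from the definitions of $V_x$, $V_y$, $P$, $Q$, $p$ and $p_k$; the specific node values are arbitrary distinct reals:

```python
import numpy as np

# Arbitrary distinct nodes (any distinct reals work)
n = 4
x = np.array([1.0, 2.0, 3.0, 5.0])
y = np.array([-1.0, 0.5, 4.0, 7.0])

# Vandermonde matrices: (V_x)_{kj} = x_j^{k-1}, (V_y)_{kj} = y_j^{k-1}
Vx = np.vander(x, n, increasing=True).T
Vy = np.vander(y, n, increasing=True).T

# p(t) = prod_i (t - x_i);  p_k(t) = p(t) / (t - x_k)
def p(t):
    return np.prod(t - x)

def p_k(t, k):
    return np.prod(np.delete(t - x, k))

P = np.diag([p_k(x[k], k) for k in range(n)])   # diag(p_k(x_k))
Q = np.diag([p(y[j]) for j in range(n)])        # diag(p(y_j))

# Cauchy matrix c_ij = 1 / (x_i - y_j)  (assumed sign convention)
C = 1.0 / (x[:, None] - y[None, :])

# Identity: C = -P Vx^{-1} Vy Q^{-1}
rhs = -P @ np.linalg.solve(Vx, Vy) @ np.linalg.inv(Q)
print(np.allclose(C, rhs))   # True
```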

Proof
Matrices $P$ and $Q$ are invertible because all their diagonal elements are nonzero: the values $x_1, \ldots, x_n$ are distinct, so $\map {p_k} {x_k} = \prod_{i \mathop \ne k} \paren {x_k - x_i} \ne 0$, and no $y_j$ equals any $x_i$, so $\map p {y_j} \ne 0$.

For $1 \le i \le n$ express polynomial $p_i$ as:


 * $\ds \map {p_i} x = \sum_{k \mathop = 1}^n a_{i k} x^{k - 1}$

Then, with $A = \paren {a_{i k} }$ the matrix of coefficients:


 * $\ds \paren {A V_x}_{i j} = \sum_{k \mathop = 1}^n a_{i k} {x_j}^{k - 1} = \map {p_i} {x_j}$

Because $\map {p_i} {x_j} = 0$ for $j \ne i$ and $\map {p_i} {x_i} \ne 0$, this says:


 * $A V_x = P$

Use second equation $\map {p_i} {y_j} = \dfrac {\map p {y_j} } {y_j - x_i}$:


 * $\ds \paren {A V_y}_{i j} = \map {p_i} {y_j} = \dfrac {\map p {y_j} } {y_j - x_i} = -\dfrac 1 {x_i - y_j} \map p {y_j}$

Hence $A V_y = -C Q$, and therefore:


 * $C = -A V_y Q^{-1} = -P \, {V_x}^{-1} V_y \, Q^{-1}$

$\blacksquare$
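The two intermediate relations used in the proof, $A V_x = P$ and $A V_y = -C Q$, can also be checked numerically. The sketch below assumes the convention $c_{i j} = \dfrac 1 {x_i - y_j}$ and builds row $i$ of $A$ from the coefficients of $p_i$:

```python
import numpy as np

# Arbitrary distinct nodes
n = 3
x = np.array([0.0, 1.0, 3.0])
y = np.array([-2.0, 0.5, 2.0])

# Row i of A holds the coefficients a_{ik} of p_i(t) = prod_{j != i} (t - x_j),
# ordered so that p_i(t) = sum_k a_{ik} t^{k-1}
A = np.array([np.poly(np.delete(x, i))[::-1] for i in range(n)])

Vx = np.vander(x, n, increasing=True).T          # (V_x)_{kj} = x_j^{k-1}
Vy = np.vander(y, n, increasing=True).T
P = np.diag([np.prod(np.delete(x[i] - x, i)) for i in range(n)])   # p_i(x_i)
Q = np.diag([np.prod(y[j] - x) for j in range(n)])                 # p(y_j)
C = 1.0 / (x[:, None] - y[None, :])              # assumed sign convention

print(np.allclose(A @ Vx, P))        # p_i(x_j) vanishes off the diagonal
print(np.allclose(A @ Vy, -C @ Q))   # p_i(y_j) = p(y_j) / (y_j - x_i)
```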

Also see

 * Definition:Vandermonde Determinant


 * Inverse of Vandermonde Matrix


 * Hilbert Matrix is Cauchy Matrix


 * Inverse of Cauchy Matrix


 * Inverse of Hilbert Matrix


 * Value of Cauchy Determinant


 * Sum of Elements in Inverse of Cauchy Matrix


 * Sum of Elements in Inverse of Hilbert Matrix