Inverse of Cauchy Matrix

Theorem
Let $C_n$ be the square Cauchy matrix of order $n$:


 * $C_n = \begin{bmatrix} \dfrac 1 {x_1 + y_1} & \dfrac 1 {x_1 + y_2} & \cdots & \dfrac 1 {x_1 + y_n} \\ \dfrac 1 {x_2 + y_1} & \dfrac 1 {x_2 + y_2} & \cdots & \dfrac 1 {x_2 + y_n} \\ \vdots & \vdots & \ddots & \vdots \\ \dfrac 1 {x_n + y_1} & \dfrac 1 {x_n + y_2} & \cdots & \dfrac 1 {x_n + y_n} \end{bmatrix}$

Then its inverse $C_n^{-1} = \sqbrk b_n$ can be specified as:


 * $\begin{bmatrix} b_{ij} \end{bmatrix} = \begin{bmatrix} \dfrac {\ds \prod_{k \mathop = 1}^n \paren {x_j + y_k} \paren {x_k + y_i} } {\ds \paren {x_j + y_i} \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne j} } \paren {x_j - x_k} } \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {y_i - y_k} } } \end{bmatrix}$
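The formula can be sanity-checked numerically. The following sketch (an illustration of ours, not part of the theorem) builds $C_4$ and the claimed inverse over exact rationals, for arbitrarily chosen test values of $x_k$ and $y_k$, and confirms that their product is the identity matrix:

```python
# Sanity check of the claimed inverse: build the Cauchy matrix C and the
# matrix B with entries b_ij from the formula, then verify C * B = I.
# Exact rational arithmetic avoids any floating-point doubt.
from fractions import Fraction

n = 4
x = [Fraction(v) for v in (1, 3, 6, 10)]   # distinct x_k (arbitrary test data)
y = [Fraction(v) for v in (2, 5, 9, 14)]   # distinct y_k, all x_i + y_j != 0

C = [[1 / (x[i] + y[j]) for j in range(n)] for i in range(n)]

def b(i, j):
    """Entry b_ij of the claimed inverse (0-indexed)."""
    num = Fraction(1)
    for k in range(n):
        num *= (x[j] + y[k]) * (x[k] + y[i])
    den = x[j] + y[i]
    for k in range(n):
        if k != j:
            den *= x[j] - x[k]
    for k in range(n):
        if k != i:
            den *= y[i] - y[k]
    return num / den

B = [[b(i, j) for j in range(n)] for i in range(n)]

# C * B must equal the identity matrix, exactly.
prod = [[sum(C[i][k] * B[k][j] for k in range(n)) for j in range(n)]
        for i in range(n)]
assert prod == [[1 if i == j else 0 for j in range(n)] for i in range(n)]
```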

Proof
Preliminaries:

Vandermonde Matrix Identity for Cauchy Matrix supplies matrix equation


 * $(1): \quad -C = PV_x^{-1} V_y Q^{-1}$


 * Definitions of symbols:


 * $V_x = \paren {\begin{smallmatrix} 1 & 1 & \cdots & 1 \\ x_1 & x_2 & \cdots & x_n \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{n-1} & x_2^{n-1} & \cdots & x_n^{n-1} \end{smallmatrix} }, \quad V_y = \paren {\begin{smallmatrix} 1 & 1 & \cdots & 1 \\ y_1 & y_2 & \cdots & y_n \\ \vdots & \vdots & \ddots & \vdots \\ y_1^{n-1} & y_2^{n-1} & \cdots & y_n^{n-1} \end{smallmatrix} }$


 * $P = \paren {\begin{smallmatrix} p_1(x_1) & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & p_n(x_n) \end{smallmatrix} }, \quad Q = \paren {\begin{smallmatrix} p(y_1) & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & p(y_n) \end{smallmatrix} }$


 * $\ds \map p x = \prod_{i \mathop = 1}^n \paren {x - x_i}, \quad \map {p_k} x = \prod_{\substack {1 \mathop \le i \mathop \le n \\ i \mathop \ne k} } \paren {x - x_i}, \quad 1 \mathop \le k \mathop \le n$

where $\map p x$ and the $\map {p_k} x$ are monic polynomials of degree $n$ and $n - 1$ respectively.
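The two polynomial families can be sketched in a few lines of Python (the helper names `p_full` and `p_k` are ours, chosen for illustration):

```python
# p(t)   = prod over all i of (t - x_i)
# p_k(t) = the same product with the k-th factor omitted
import math

def p_full(t, xs):
    """p(t): vanishes at every node x_i."""
    return math.prod(t - xi for xi in xs)

def p_k(t, xs, k):
    """p_k(t): omits the k-th node (0-indexed), so p_k(x_k) != 0."""
    return math.prod(t - xi for i, xi in enumerate(xs) if i != k)

xs = [1.0, 3.0, 6.0]
assert p_full(3.0, xs) == 0.0        # (3-1)(3-3)(3-6) = 0
assert p_k(3.0, xs, 1) == -6.0       # (3-1)(3-6) = -6
```

The diagonal entries of $P$ and $Q$ above are exactly `p_k(xs[k], xs, k)` and `p_full(y, xs)` in this notation.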

Compute the inverse $C^{-1}$ for the difference form of the Cauchy matrix:


 * $C = \begin{bmatrix} \dfrac 1 {x_i - y_j} \end{bmatrix}_{1 \mathop \le i, \, j \mathop \le n}$

Replacement $y_k \to -y_k$ then gives the inverse $C_n^{-1}$ in the theorem.
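Equation $(1)$ can be spot-checked numerically. The sketch below (illustrative only, with arbitrary real test nodes) assembles every factor and compares the two sides in floating point:

```python
# Numeric spot check of (1): -C = P Vx^{-1} Vy Q^{-1}
# for the difference-form Cauchy matrix C[i, j] = 1 / (x_i - y_j).
import numpy as np

n = 4
rng = np.random.default_rng(0)
x = rng.uniform(1.0, 2.0, n)     # nodes x_k
y = rng.uniform(3.0, 4.0, n)     # nodes y_k, so x_i - y_j is never 0

C = 1.0 / (x[:, None] - y[None, :])

# Vandermonde matrices: row m holds the m-th powers of the nodes.
Vx = np.vander(x, n, increasing=True).T
Vy = np.vander(y, n, increasing=True).T

def p(t):
    """p(t) = product over i of (t - x_i)."""
    return np.prod(t - x)

def pk(k):
    """p_k(x_k) = product over i != k of (x_k - x_i)."""
    return np.prod(np.delete(x[k] - x, k))

P = np.diag([pk(k) for k in range(n)])
Q = np.diag([p(t) for t in y])

lhs = -C
rhs = P @ np.linalg.inv(Vx) @ Vy @ np.linalg.inv(Q)
assert np.allclose(lhs, rhs)
```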

Inverse of Matrix Product applied to equation (1) gives:


 * $ C^{-1} = -Q V_y^{-1} V_x P^{-1}$

Let ${\vec K}_1,\ldots,{\vec K}_n$ denote the columns of the $n\times n$ identity matrix.

Then the $n\times n$ matrix $B = C^{-1}$ has entries $b_{ij} = {\vec K}_i^T C^{-1} {\vec K}_j$.

Fix the row index $i$ and column index $j$ for the remainder of the proof.

Define column vectors $\vec A$, $\vec B$ so that $b_{ij} = {\vec A}^T \vec B$:


 * $\vec A = \paren { {\vec K}_i^T \, Q \, V_y^{-1} }^T, \quad \vec B = -V_x \, P^{-1} \, {\vec K}_j$

Because $P$ and $Q$ are diagonal, ${\vec K}_i^T Q = \map p {y_i} \, {\vec K}_i^T$ and $P^{-1} {\vec K}_j = \dfrac 1 {\map {p_j} {x_j} } {\vec K}_j$, while $V_x {\vec K}_j$ is column $j$ of $V_x$.

Define $u = x_j$ and simplify:


 * $\vec A^T = \map p {y_i} \, {\vec K}_i^T V_y^{-1}, \quad \vec B = -\dfrac 1 {\map {p_j} {x_j} } \paren {1, u, u^2, \ldots, u^{n-1} }^T$

Then, because the entries in row $i$ of $V_y^{-1}$ are the coefficients of the Lagrange basis polynomial for the nodes $y_1, \ldots, y_n$:


 * $\ds b_{ij} = {\vec A}^T \vec B = -\dfrac {\map p {y_i} } {\map {p_j} {x_j} } \prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \dfrac {u - y_k} {y_i - y_k}$

Simplify the fraction on the right by combining the products:


 * $(2): \quad b_{ij} = -\dfrac {\ds \map p {y_i} \prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {u - y_k} } {\ds \map {p_j} {x_j} \prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {y_i - y_k} }$

Define sets $D, A, B, C$ with disjoint decomposition $D = A \cup B \cup C$:


 * $D = \set {k: 1 \mathop \le k \mathop \le n}, \quad A = \set {k: 1 \mathop \le k \mathop < i}, \quad B = \set i, \quad C = \set {k: i \mathop < k \mathop \le n}$

Use $\prod_D = \prod_A \prod_B \prod_C$ to convert the numerator and denominator on the right hand side of $(2)$: the product over $k \ne i$ in the numerator extends to a product over all of $D$ at the cost of the factor $u - y_i$.

Then:


 * $\ds b_{ij} = -\dfrac {\ds \map p {y_i} \prod_{k \mathop = 1}^n \paren {u - y_k} } {\ds \map {p_j} {x_j} \paren {u - y_i} \prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {y_i - y_k} }$

Replacement of ${ \map p {y_i} }$ and ${ \map {p_j} {x_j} }$ by products, with $u = x_j$, gives:


 * $(3): \quad b_{ij} = -\dfrac {\ds \paren {\prod_{k \mathop = 1}^n \paren {y_i - x_k} } \paren {\prod_{k \mathop = 1}^n \paren {x_j - y_k} } } {\ds \paren {x_j - y_i} \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne j} } \paren {x_j - x_k} } \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {y_i - y_k} } }$

After substituting $y_k \to -y_k$ and simplifying, Knuth's original identity (1997), as given in the theorem statement, becomes:


 * $(4): \quad b_{ij} = \dfrac {\ds \prod_{k \mathop = 1}^n \paren {x_j - y_k} \paren {x_k - y_i} } {\ds \paren {x_j - y_i} \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne j} } \paren {x_j - x_k} } \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {y_k - y_i} } }$

In $(3)$, factor $\paren {-1}^{n + 1}$ from the numerator and $\paren {-1}^{n - 1}$ from the denominator.

Then $\paren {-1}^{n + 1} = \paren {-1}^{n - 1}$ verifies that $(3)$ matches $(4)$.
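As a closing sanity check (ours, not part of the proof), the formula obtained from the theorem statement by the substitution $y_k \to -y_k$ can be verified directly against the difference-form Cauchy matrix over exact rationals:

```python
# Verify that the y_k -> -y_k form of the theorem's formula inverts
# C[i, j] = 1 / (x_i - y_j), using exact rational arithmetic.
from fractions import Fraction

n = 3
x = [Fraction(v) for v in (0, 1, 4)]    # distinct x_k (arbitrary test data)
y = [Fraction(v) for v in (2, 5, 7)]    # distinct y_k, x_i - y_j never 0

C = [[1 / (x[i] - y[j]) for j in range(n)] for i in range(n)]

def b(i, j):
    """Theorem formula after substituting y_k -> -y_k (0-indexed)."""
    num = Fraction(1)
    for k in range(n):
        num *= (x[j] - y[k]) * (x[k] - y[i])
    den = x[j] - y[i]
    for k in range(n):
        if k != j:
            den *= x[j] - x[k]
    for k in range(n):
        if k != i:
            den *= y[k] - y[i]
    return num / den

prod = [[sum(C[i][k] * b(k, j) for k in range(n)) for j in range(n)]
        for i in range(n)]
assert prod == [[1 if i == j else 0 for j in range(n)] for i in range(n)]
```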