Inverse of Cauchy Matrix

Theorem
Let $C_n$ be the square Cauchy matrix of order $n$:


 * $C_n = \begin{bmatrix} \dfrac 1 {x_1 + y_1} & \dfrac 1 {x_1 + y_2} & \cdots & \dfrac 1 {x_1 + y_n} \\ \dfrac 1 {x_2 + y_1} & \dfrac 1 {x_2 + y_2} & \cdots & \dfrac 1 {x_2 + y_n} \\ \vdots & \vdots & \ddots & \vdots \\ \dfrac 1 {x_n + y_1} & \dfrac 1 {x_n + y_2} & \cdots & \dfrac 1 {x_n + y_n} \end{bmatrix}$

Then its inverse $C_n^{-1} = \sqbrk b_n$ can be specified as:


 * $\begin{bmatrix} b_{ij} \end{bmatrix} = \begin{bmatrix} \dfrac {\displaystyle \prod_{k \mathop = 1}^n \paren {x_j + y_k} \paren {x_k + y_i} } {\displaystyle \paren {x_j + y_i} \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne j} } \paren {x_j - x_k} } \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {y_i - y_k} } } \end{bmatrix}$
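The claimed inverse can be sanity-checked in exact rational arithmetic. The sketch below is not part of the proof: the sample points are arbitrary, subject only to the $x_i$ being distinct, the $y_i$ being distinct, and every sum $x_j + y_i$ being nonzero. It builds $C_4$, evaluates the claimed $b_{ij}$, and confirms $B \, C_4 = I_4$ exactly:

```python
# Exact check of the claimed Cauchy matrix inverse using rational arithmetic.
# Sample points are arbitrary: x's pairwise distinct, y's pairwise distinct,
# and all sums x_j + y_i nonzero.
from fractions import Fraction
from math import prod

x = [Fraction(v) for v in (1, 2, 4, 8)]
y = [Fraction(v) for v in (3, 5, 7, 11)]
n = len(x)

# Cauchy matrix C with entries 1 / (x_i + y_j)
C = [[1 / (x[i] + y[j]) for j in range(n)] for i in range(n)]

def b(i, j):
    # b_ij = prod_k (x_j + y_k)(x_k + y_i)
    #        / [ (x_j + y_i) * prod_{k != j} (x_j - x_k) * prod_{k != i} (y_i - y_k) ]
    num = prod((x[j] + y[k]) * (x[k] + y[i]) for k in range(n))
    den = x[j] + y[i]
    den *= prod(x[j] - x[k] for k in range(n) if k != j)
    den *= prod(y[i] - y[k] for k in range(n) if k != i)
    return num / den

B = [[b(i, j) for j in range(n)] for i in range(n)]

# B * C should be the identity matrix, exactly
BC = [[sum(B[i][k] * C[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]
assert BC == [[Fraction(i == j) for j in range(n)] for i in range(n)]
print("B * C == I (exact)")
```

Using `fractions.Fraction` rather than floating point makes the check exact, so the assertion tests the identity itself rather than a numerical tolerance.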

Proof
Preliminaries:

The Vandermonde Matrix Identity for Cauchy Matrix supplies the matrix equation:


 * $\displaystyle (1)\quad - C = PV_x^{-1} V_y Q^{-1}$


 * Definitions of symbols:


 * $\displaystyle V_x = \paren {\begin{smallmatrix} 1 & 1 & \cdots & 1 \\ x_1 & x_2 & \cdots & x_n \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{n-1} & x_2^{n-1} & \cdots & x_n^{n-1} \end{smallmatrix} }, \quad V_y = \paren {\begin{smallmatrix} 1 & 1 & \cdots & 1 \\ y_1 & y_2 & \cdots & y_n \\ \vdots & \vdots & \ddots & \vdots \\ y_1^{n-1} & y_2^{n-1} & \cdots & y_n^{n-1} \end{smallmatrix} }$ Vandermonde matrices


 * $\displaystyle P = \paren {\begin{smallmatrix} p_1(x_1) & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & p_n(x_n) \end{smallmatrix} }, \quad Q = \paren {\begin{smallmatrix} p(y_1) & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & p(y_n) \end{smallmatrix} }$ Diagonal matrices


 * $\displaystyle p(x) = \prod_{i \mathop = 1}^n \paren {x - x_i}, \quad p_k(x) = \prod_{i \mathop = 1, i \mathop \ne k}^n \paren {x - x_i}, \quad 1 \mathop \le k \mathop \le n$ Polynomials
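Equation $(1)$ can be sanity-checked in exact rational arithmetic. The sketch below (not part of the proof) verifies the equivalent form $V_x \, P^{-1} \paren {-C} Q = V_y$, which avoids matrix inversion; here $C$ has entries $\dfrac 1 {x_i - y_j}$, and the sample points are arbitrary distinct rationals with $x_i \ne y_j$ for all $i, j$:

```python
# Exact check of equation (1):  -C = P V_x^{-1} V_y Q^{-1},
# verified in the equivalent form  V_x P^{-1} (-C) Q = V_y.
# Sample points are arbitrary distinct rationals with x_i != y_j.
from fractions import Fraction
from math import prod

x = [Fraction(v) for v in (1, 2, 4, 8)]
y = [Fraction(v) for v in (3, 5, 7, 11)]
n = len(x)

# C has entries 1 / (x_i - y_j)
C = [[1 / (x[i] - y[j]) for j in range(n)] for i in range(n)]

def p(t):
    # p(t) = prod_i (t - x_i)
    return prod(t - xi for xi in x)

def p_k(k, t):
    # p_k(t) = prod_{i != k} (t - x_i)
    return prod(t - x[i] for i in range(n) if i != k)

# Vandermonde matrices: row r holds the r-th powers of the points
Vx = [[x[j] ** r for j in range(n)] for r in range(n)]
Vy = [[y[j] ** r for j in range(n)] for r in range(n)]

# M = P^{-1} (-C) Q: scale row k of -C by 1/p_k(x_k), column j by p(y_j)
M = [[-C[k][j] * p(y[j]) / p_k(k, x[k]) for j in range(n)] for k in range(n)]

# Equation (1) is equivalent to V_x M = V_y
VxM = [[sum(Vx[r][k] * M[k][j] for k in range(n)) for j in range(n)]
       for r in range(n)]
assert VxM == Vy
print("equation (1) verified exactly")
```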

Compute the inverse $C^{-1}$ for the matrix $C$ of equation $(1)$, whose entries are $\dfrac 1 {x_i - y_j}$.

The replacement $y_k \to -y_k$ then gives the inverse $C_n^{-1}$ stated in the theorem.

Inverse of Matrix Product applied to equation (1) gives:


 * $ C^{-1} = -Q V_y^{-1} V_x P^{-1}$

Let ${\vec K}_1,\ldots,{\vec K}_n$ denote the columns of the $n\times n$ identity matrix.

Then the $n \times n$ matrix $B = C^{-1}$ has entries $b_{ij} = {\vec K}_i^T C^{-1} {\vec K}_j$.

Fix the row index $i$ and the column index $j$ for the remainder of the proof.

Define column vectors $\vec A$, $\vec B$ so that $b_{ij} = {\vec A}^T \vec B$:


 * $\displaystyle \vec A = \paren { {\vec K}_i^T \, Q \, V_y^{-1} }^T, \quad \vec B = -V_x \, P^{-1} \, {\vec K}_j$

Define $u = x_j$ and simplify.

Since $Q$ and $P$ are diagonal:


 * $\displaystyle {\vec K}_i^T \, Q = \map p {y_i} \, {\vec K}_i^T, \quad P^{-1} \, {\vec K}_j = \frac 1 {\map {p_j} {x_j} } \, {\vec K}_j$

Since $V_x \, {\vec K}_j$ is column $j$ of $V_x$:


 * $\displaystyle V_x \, {\vec K}_j = \paren {1, u, \ldots, u^{n-1} }^T$

The product ${\vec K}_i^T \, V_y^{-1} \paren {1, u, \ldots, u^{n-1} }^T$ is a polynomial of degree $n - 1$ in $u$ which takes the value $1$ at $u = y_i$ and $0$ at $u = y_k$ for $k \ne i$: it is the Lagrange basis polynomial $\displaystyle \prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \frac {u - y_k} {y_i - y_k}$.

Then:


 * $\displaystyle (2)\quad b_{ij} = {\vec A}^T \vec B = -\frac {\map p {y_i} } {\map {p_j} {x_j} } \prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \frac {x_j - y_k} {y_i - y_k}$
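At this point $b_{ij} = -\dfrac {\map p {y_i} } {\map {p_j} {x_j} } \displaystyle \prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \frac {x_j - y_k} {y_i - y_k}$. As a sanity check outside the proof, this entrywise expression can be verified in exact rational arithmetic against $B \, C = I$; the sample points below are arbitrary distinct values with $x_i \ne y_j$:

```python
# Exact check that b_ij = -(p(y_i) / p_j(x_j)) * prod_{k != i} (x_j - y_k)/(y_i - y_k)
# inverts C with entries 1 / (x_i - y_j).  Sample points are arbitrary
# distinct rationals with x_i != y_j.
from fractions import Fraction
from math import prod

x = [Fraction(v) for v in (1, 2, 4, 8)]
y = [Fraction(v) for v in (3, 5, 7, 11)]
n = len(x)

C = [[1 / (x[i] - y[j]) for j in range(n)] for i in range(n)]

def p(t):
    # p(t) = prod_i (t - x_i)
    return prod(t - xi for xi in x)

def p_j(j, t):
    # p_j(t) = prod_{k != j} (t - x_k)
    return prod(t - x[k] for k in range(n) if k != j)

def b(i, j):
    # Lagrange basis polynomial for the y-nodes, evaluated at u = x_j
    lagrange = prod((x[j] - y[k]) / (y[i] - y[k]) for k in range(n) if k != i)
    return -p(y[i]) / p_j(j, x[j]) * lagrange

B = [[b(i, j) for j in range(n)] for i in range(n)]
BC = [[sum(B[i][k] * C[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]
assert BC == [[Fraction(i == j) for j in range(n)] for i in range(n)]
print("entrywise formula inverts C exactly")
```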

Simplify the fraction on the right:

Define sets $D,A,B,C$ with disjoint decomposition $D = A \cup B \cup C$:

Use $\prod_{D} = \prod_{A} \prod_{B} \prod_{C}$ to convert the numerator and denominator in the right side of (2).

Then:


 * $\displaystyle b_{ij} = -\frac {\map p {y_i} } {\map {p_j} {x_j} } \cdot \frac {\displaystyle \prod_{k \mathop = 1}^n \paren {x_j - y_k} } {\displaystyle \paren {x_j - y_i} \prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {y_i - y_k} }$

Replacement of $\map p {y_i}$ and $\map {p_j} {x_j}$ by their defining products gives:


 * $\displaystyle (3)\quad b_{ij} = \frac {-\displaystyle \prod_{k \mathop = 1}^n \paren {y_i - x_k} \paren {x_j - y_k} } {\displaystyle \paren {x_j - y_i} \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne j} } \paren {x_j - x_k} } \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {y_i - y_k} } }$

After the replacement $y_k \to -y_k$ and canceling a common sign factor, $(3)$ becomes Knuth's original identity (1997), which is the formula claimed in the theorem:


 * $\displaystyle (4)\quad b_{ij} = \frac {\displaystyle \prod_{k \mathop = 1}^n \paren {x_j + y_k} \paren {x_k + y_i} } {\displaystyle \paren {x_j + y_i} \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne j} } \paren {x_j - x_k} } \paren {\prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {y_i - y_k} } }$

In $(3)$, factor $(-1)^{n+1}$ from the numerator, writing $-\displaystyle \prod_{k \mathop = 1}^n \paren {y_i - x_k} = \paren {-1}^{n+1} \prod_{k \mathop = 1}^n \paren {x_k - y_i}$, and factor $(-1)^{n-1}$ from the denominator, writing $\displaystyle \prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {y_i - y_k} = \paren {-1}^{n-1} \prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne i} } \paren {y_k - y_i}$.

Then $(-1)^{n+1} = (-1)^{n-1}$ cancels these factors, and the replacement $y_k \to -y_k$ converts the remaining factors $x_k - y_i$, $x_j - y_k$, $y_k - y_i$ into $x_k + y_i$, $x_j + y_k$, $y_i - y_k$ respectively, verifying that $(3)$ matches $(4)$.