Vandermonde Matrix Identity for Hilbert Matrix
Theorem
Define the polynomial root sets $\set {1, 2, \ldots, n}$ and $\set {0, -1, \ldots, -n + 1}$ of Definition:Cauchy Matrix.
Let $H$ be the Hilbert matrix of order $n$:
- $H = \begin {pmatrix} 1 & \dfrac 1 2 & \cdots & \dfrac 1 n \\ \dfrac 1 2 & \dfrac 1 3 & \cdots & \dfrac 1 {n + 1} \\ \vdots & \vdots & \ddots & \vdots \\ \dfrac 1 n & \dfrac 1 {n + 1} & \cdots & \dfrac 1 {2 n - 1} \end {pmatrix}$
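These root sets realise $H$ as a Cauchy matrix: writing $x_i = i$ and $y_j = -\paren {j - 1}$ for the elements of the two root sets (notation chosen here to match $V_x$ and $V_y$ below), each entry of $H$ satisfies:
- $\dfrac 1 {x_i - y_j} = \dfrac 1 {i + \paren {j - 1} } = \dfrac 1 {i + j - 1}$
which is the entry of $H$ in row $i$ and column $j$, in accordance with Hilbert Matrix is Cauchy Matrix.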
Then from Vandermonde Matrix Identity for Cauchy Matrix and Hilbert Matrix is Cauchy Matrix:
- $H = -P V_x^{-1} V_y Q^{-1}$
where $V_x$, $V_y$ are Vandermonde matrices:
- $V_x = \begin {pmatrix} 1 & 1 & \cdots & 1 \\ 1 & 2 & \cdots & n \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 2^{n - 1} & \cdots & n^{n - 1} \end {pmatrix}, \quad V_y = \begin {pmatrix} 1 & 1 & \cdots & 1 \\ 0 & -1 & \cdots & -n + 1 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & \paren {-1}^{n - 1} & \cdots & \paren {-n + 1}^{n - 1} \end {pmatrix}$
and $P$, $Q$ are diagonal matrices:
- $P = \begin {pmatrix} \map {p_1} 1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \map {p_n} n \\ \end {pmatrix}, \quad Q = \begin {pmatrix} \map p 0 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \map p {-n + 1} \\ \end {pmatrix}$
where the polynomials $p$, $p_1$, $\ldots$, $p_n$ are defined by:
- $\ds \map p x = \prod_{i \mathop = 1}^n \paren {x - i}$
- $\ds \map {p_k} x = \dfrac {\map p x} {x - k} = \prod_{i \mathop = 1, \, i \mathop \ne k}^n \paren {x - i}$ for $1 \le k \le n$
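For instance, when $n = 3$ these polynomials are:
- $\map p x = \paren {x - 1} \paren {x - 2} \paren {x - 3}$
- $\map {p_1} x = \paren {x - 2} \paren {x - 3}, \quad \map {p_2} x = \paren {x - 1} \paren {x - 3}, \quad \map {p_3} x = \paren {x - 1} \paren {x - 2}$
so the diagonal entries of $P$ and $Q$ evaluate to:
- $\map {p_1} 1 = 2, \quad \map {p_2} 2 = -1, \quad \map {p_3} 3 = 2$
- $\map p 0 = -6, \quad \map p {-1} = -24, \quad \map p {-2} = -60$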
Proof
Apply Vandermonde Matrix Identity for Cauchy Matrix and Hilbert Matrix is Cauchy Matrix.
Matrices $V_x$ and $V_y$ are invertible by Inverse of Vandermonde Matrix.
Matrices $P$ and $Q$ are invertible because all their diagonal elements are nonzero:
- $\ds \map {p_k} k = \prod_{i \mathop = 1, \, i \mathop \ne k}^n \paren {k - i} \ne 0$ for $1 \le k \le n$
- $\map p {-j} \ne 0$ for $0 \le j \le n - 1$, since the roots of $\map p x$ are $1, 2, \ldots, n$.
$\blacksquare$
Examples
$3 \times 3$ Matrix
By Hilbert Matrix is Cauchy Matrix, take the polynomial root sets $\set {1, 2, 3}$ and $\set {0, -1, -2}$ of Definition:Cauchy Matrix.
The $3 \times 3$ case illustrates Vandermonde Matrix Identity for Hilbert Matrix and the value of the Hilbert matrix determinant:
- $H = \begin {pmatrix} \dfrac 1 1 & \dfrac 1 2 & \dfrac 1 3 \\ \dfrac 1 2 & \dfrac 1 3 & \dfrac 1 4 \\ \dfrac 1 3 & \dfrac 1 4 & \dfrac 1 5 \end {pmatrix}$ is the Hilbert matrix of order $3$
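With these root sets, the factors appearing in the identity follow directly from the definitions in the theorem statement:
- $V_x = \begin {pmatrix} 1 & 1 & 1 \\ 1 & 2 & 3 \\ 1 & 4 & 9 \end {pmatrix}, \quad V_y = \begin {pmatrix} 1 & 1 & 1 \\ 0 & -1 & -2 \\ 0 & 1 & 4 \end {pmatrix}, \quad P = \begin {pmatrix} 2 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 2 \end {pmatrix}, \quad Q = \begin {pmatrix} -6 & 0 & 0 \\ 0 & -24 & 0 \\ 0 & 0 & -60 \end {pmatrix}$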
Then:
- $H = -P V_x^{-1} V_y Q^{-1}$ by Vandermonde Matrix Identity for Hilbert Matrix
- $\map \det H = \dfrac 1 {2160}$ by Determinant of Matrix Product
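The determinant can be checked directly from the factorisation: the Vandermonde determinants are $\map \det {V_x} = \paren {2 - 1} \paren {3 - 1} \paren {3 - 2} = 2$ and $\map \det {V_y} = \paren {-1 - 0} \paren {-2 - 0} \paren {-2 + 1} = -2$, while $\map \det {-P} = \paren {-1}^3 \times 2 \times \paren {-1} \times 2 = 4$ and $\map \det Q = \paren {-6} \times \paren {-24} \times \paren {-60} = -8640$, so:
- $\map \det H = \dfrac {\map \det {-P} \, \map \det {V_y} } {\map \det {V_x} \, \map \det Q} = \dfrac {4 \times \paren {-2} } {2 \times \paren {-8640} } = \dfrac 1 {2160}$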
Also see
Sources
- 1944: A.C. Aitken: Determinants and Matrices (3rd ed.): Chapter $\text{VI}$. $47$: Alternant Matrices and Determinants
- March 1992: Roderick Gow: Cauchy's matrix, the Vandermonde matrix and polynomial interpolation (Bull. Irish Math. Soc. Vol. 28: pp. 45 – 52)