Summation of Powers over Product of Differences/Proof 3


Theorem

$\ds \sum_{j \mathop = 1}^n \paren {\dfrac { {x_j}^r} {\ds \prod_{\substack {1 \mathop \le k \mathop \le n \\ k \mathop \ne j} } \paren {x_j - x_k} } } = \begin{cases} 0 & : 0 \le r < n - 1 \\ 1 & : r = n - 1 \\ \ds \sum_{j \mathop = 1}^n x_j & : r = n \end{cases}$
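
For illustration, the case $n = 2$ can be verified directly:

$\ds \dfrac 1 {x_1 - x_2} + \dfrac 1 {x_2 - x_1} = 0, \qquad \dfrac {x_1} {x_1 - x_2} + \dfrac {x_2} {x_2 - x_1} = \dfrac {x_1 - x_2} {x_1 - x_2} = 1, \qquad \dfrac { {x_1}^2} {x_1 - x_2} + \dfrac { {x_2}^2} {x_2 - x_1} = \dfrac { {x_1}^2 - {x_2}^2} {x_1 - x_2} = x_1 + x_2$

corresponding to $r = 0$, $r = n - 1 = 1$ and $r = n = 2$ respectively.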


Proof

Definition

\(\ds \map p x\) \(=\) \(\ds \prod_{k \mathop = 1}^n \paren {x - x_k}\)
\(\ds \map {p_m} x\) \(=\) \(\ds \prod_{k \mathop = 1, \, k \mathop \ne m}^n \paren {x - x_k},\quad 1 \le m \le n\)
\(\ds A\) \(=\) \(\ds \begin{pmatrix} 1 & x_1 & \cdots & x_1^{n - 2} & x_1^{n - 1} \\
 \vdots & \vdots & \ddots &   \vdots    & \vdots \\
 1      & x_n    & \cdots & x_n^{n - 2} & x_n^{n - 1} \\
 \end{pmatrix}\)
where $x_1, \ldots, x_n$ are assumed to be pairwise distinct
\(\ds A_{\vec Y}\) \(=\) \(\ds \begin{pmatrix}
 1      & x_1    & \cdots & x_1^{n - 2} & y_1 \\
 \vdots & \vdots & \ddots &   \vdots    & \vdots \\
 1      & x_n    & \cdots & x_n^{n - 2} & y_n \\
 \end{pmatrix}\)

Vector $\vec Y$ has entries $y_1,\ldots,y_n$.

The symbol $\map {\mathbf {Cof} } {M, i, j}$ denotes the cofactor of the element in row $i$, column $j$ of matrix $M$.
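
For example, with $n = 3$ and $A$ as above:

$\ds \map {\mathbf {Cof} } {A, 2, 3} = \paren {-1}^{2 + 3} \det \begin{pmatrix} 1 & x_1 \\ 1 & x_3 \end{pmatrix} = -\paren {x_3 - x_1}$

where the minor is obtained by deleting row $2$ and column $3$ of $A$.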


Lemma 1

\(\ds \map \det A\) \(=\) \(\ds \map {p_m} {x_m} \map {\mathbf {Cof} } {A, m, n}\)

Proof of Lemma 1

By Effect of Elementary Row Operations on Determinant, the determinant of $A$ is unchanged by adding a linear combination of the first $n - 1$ columns to the last column.

Let $\map f x$ be the monic polynomial $\map {p_m} x$, which has the form $x^{n - 1}$ plus terms of lower degree.

Then:

\(\ds \det \begin {pmatrix}
 1      & x_1    & \cdots & x_1^{n - 2} & x_1^{n - 1} \\
 \vdots & \vdots & \ddots &   \vdots    & \vdots \\
 1      & x_n    & \cdots & x_n^{n - 2} & x_n^{n - 1} \\
 \end {pmatrix}\)
\(=\) \(\ds \det \begin {pmatrix}
 1      & x_1    & \cdots & x_1^{n - 2} & \map f {x_1} \\
 \vdots & \vdots & \ddots &   \vdots    & \vdots \\
 1      & x_n    & \cdots & x_n^{n - 2} & \map f {x_n} \\
\end {pmatrix}\)

Since $\map f {x_i} = \map {p_m} {x_i} = 0$ for $i \ne m$, the last column on the right has all entries zero except the $m$th, which equals $\map {p_m} {x_m}$.

Applying cofactor expansion along column $n$ then yields the stated identity.

$\Box$
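
For example, with $n = 2$ and $m = 1$: adding $-x_2$ times the first column to the last column gives

$\ds \map \det A = \det \begin{pmatrix} 1 & x_1 \\ 1 & x_2 \end{pmatrix} = \det \begin{pmatrix} 1 & x_1 - x_2 \\ 1 & 0 \end{pmatrix} = \paren {x_1 - x_2} \paren {-1}^{1 + 2} = \map {p_1} {x_1} \map {\mathbf {Cof} } {A, 1, 2}$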


Lemma 2

\(\text {(1)}: \quad\) \(\ds \map \det A\) \(=\) \(\ds \prod_{1 \mathop \le i \mathop < j \mathop \le n} \paren {x_j - x_i}\)
\(\text {(2)}: \quad\) \(\ds \map {\mathbf {Cof} } {A, m, n}\) \(=\) \(\ds \paren {-1}^{m + n} \prod_{1 \mathop \le i \mathop < j \mathop \le n, \, i \mathop \ne m, \, j \mathop \ne m} \paren {x_j - x_i}\)


Proof of Lemma 2

Value of Vandermonde Determinant establishes $(1)$.

To prove $(2)$, let:

$\set {y_1, \ldots, y_{n - 1} } = \set {x_1, \ldots, x_n} \setminus \set {x_m}$

The minor of $A$ obtained by deleting row $m$ and column $n$ is the Vandermonde determinant in $y_1, \ldots, y_{n - 1}$, so $(2)$ follows by applying $(1)$ with $\set {x_1, \ldots, x_n}$ replaced by $\set {y_1, \ldots, y_{n - 1} }$.

$\Box$
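
For example, with $n = 3$:

$\ds \map \det A = \paren {x_2 - x_1} \paren {x_3 - x_1} \paren {x_3 - x_2}$

and, taking $m = 2$ in $(2)$, the only factor surviving the restriction $i \ne 2$, $j \ne 2$ is $\paren {x_3 - x_1}$, so $\map {\mathbf {Cof} } {A, 2, 3} = \paren {-1}^{2 + 3} \paren {x_3 - x_1}$, agreeing with the direct computation above.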


Theorem Details

By Lemma 2, $\map \det A \ne 0$, and by Lemma 1, $\dfrac {\map {\mathbf {Cof} } {A, m, n} } {\map \det A} = \dfrac 1 {\map {p_m} {x_m} }$, so the theorem's equation is equivalent to:

\(\text {(3)}: \quad\) \(\ds \sum_{m \mathop = 1}^n x_m^r \, \dfrac {\map {\mathbf {Cof} } {A, m, n} } {\map \det A}\) \(=\) \(\ds \begin{cases} 0 & : 0 \le r < n - 1 \\ 1 & : r = n - 1 \\ \ds \sum_{j \mathop = 1}^n x_j & : r = n \end{cases}\)

The sum $\ds \sum_{m \mathop = 1}^n x_m^r \map {\mathbf {Cof} } {A, m, n}$ is the cofactor expansion along the last column of the matrix obtained from $A$ by replacing that column with $\paren {x_1^r, \ldots, x_n^r}$, so identity $(3)$ is equivalent to:

\(\text {(4)}: \quad\) \(\ds \det \begin{pmatrix} 1 & x_1 & \cdots & x_1^{n - 2} & x_1^r \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 1 & x_n & \cdots & x_n^{n - 2} & x_n^r \end{pmatrix}\) \(=\) \(\ds \map \det A \begin{cases} 0 & : 0 \le r < n - 1 \\ 1 & : r = n - 1 \\ \ds \sum_{j \mathop = 1}^n x_j & : r = n \end{cases}\)
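
For example, with $n = 2$ the left side of $(4)$ is $\ds \det \begin{pmatrix} 1 & {x_1}^r \\ 1 & {x_2}^r \end{pmatrix} = {x_2}^r - {x_1}^r$, which equals $0$ for $r = 0$, $\map \det A$ for $r = 1$ and $\paren {x_1 + x_2} \map \det A$ for $r = 2$, in agreement with the right side.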

The proof is divided into three cases.


Case 1: $0 \mathop \le r \mathop \le n - 2$:

For $0 \le r \le n - 2$, the last column of the matrix on the left in $(4)$ equals column $r + 1$, so its determinant is zero by Square Matrix with Duplicate Rows has Zero Determinant applied to its transpose.
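
For example, with $n = 3$ and $r = 1$ the matrix on the left in $(4)$ is $\ds \begin{pmatrix} 1 & x_1 & x_1 \\ 1 & x_2 & x_2 \\ 1 & x_3 & x_3 \end{pmatrix}$, whose last two columns coincide, so its determinant is zero.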


Case 2: $r = n - 1$:

For $r = n - 1$ the matrix on the left in $(4)$ is exactly $A$, so its determinant is $\map \det A$.


Case 3: $r = n$:

Expand $\map p x = \prod_{k \mathop = 1}^n \paren {x - x_k}$ by Taylor's Theorem, where the remainder $\map R x$ is a polynomial of degree at most $n - 2$:

$\map p x = x^n - \paren {x_1 + \cdots + x_n} x^{n - 1} + \map R x$
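
For example, with $n = 3$:

$\ds \map p x = \paren {x - x_1} \paren {x - x_2} \paren {x - x_3} = x^3 - \paren {x_1 + x_2 + x_3} x^2 + \paren {x_1 x_2 + x_1 x_3 + x_2 x_3} x - x_1 x_2 x_3$

so that $\map R x = \paren {x_1 x_2 + x_1 x_3 + x_2 x_3} x - x_1 x_2 x_3$, a polynomial of degree at most $n - 2 = 1$.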

Let $x = x_k$ for $1 \mathop \le k \mathop \le n$.

Then $x_k$ is a root of $\map p x = \prod_{k \mathop = 1}^n \paren {x - x_k}$:

\(\ds 0\) \(=\) \(\ds \map p {x_k}\)
\(\ds \) \(=\) \(\ds x_k^n - \paren {x_1 + \cdots + x_n} x_k^{n - 1} + \map R {x_k}\)

Let:

\(\ds c\) \(=\) \(\ds x_1 + \cdots + x_n\)
\(\ds \vec u\) \(=\) \(\ds \begin {pmatrix} x_1^n \\ \vdots \\ x_n^n \end{pmatrix}\) Column $n$ of the left side of $(4)$
\(\ds \vec z\) \(=\) \(\ds c \begin {pmatrix} x_1^{n - 1} \\ \vdots \\ x_n^{n - 1} \end{pmatrix}\) Constant $c$ times column $n$ of $A$
\(\ds \vec w\) \(=\) \(\ds \begin{pmatrix} -\map R {x_1} \\ \vdots \\ -\map R {x_n} \end{pmatrix}\)


Then:

\(\text {(5)}: \quad\) \(\ds \vec u\) \(=\) \(\ds \vec z + \vec w\) by equation $x_k^n = c x_k^{n - 1} - \map R {x_k}$
\(\text {(6)}: \quad\) \(\ds \map \det {A_{\vec z} }\) \(=\) \(\ds c \map \det A\) Effect of Elementary Row Operations on Determinant: factoring $c$ from last column of $A_{\vec z}$
\(\text {(7)}: \quad\) \(\ds \det \paren { A_{\vec w} }\) \(=\) \(\ds 0\) Effect of Elementary Row Operations on Determinant: $\vec w$ is a linear combination of columns $1$ to $n - 1$
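
For example, with $n = 3$ the vector $\vec w$ has entries $-\map R {x_i} = x_1 x_2 x_3 - \paren {x_1 x_2 + x_1 x_3 + x_2 x_3} x_i$, so $\vec w$ is a linear combination of columns $1$ and $2$ of $A$ with coefficients $x_1 x_2 x_3$ and $-\paren {x_1 x_2 + x_1 x_3 + x_2 x_3}$, which do not depend on $i$.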

Hence the left side of $(4)$ evaluates as:

\(\ds \map \det {A_{\vec u} }\) \(=\) \(\ds \map \det {A_{\vec z} } + \map \det {A_{\vec w} }\) by $(5)$ and Determinant as Sum of Determinants
\(\ds \) \(=\) \(\ds c \map \det A + 0\) by $(6)$ and $(7)$
\(\ds \) \(=\) \(\ds \map \det A \sum_{i \mathop = 1}^n x_i\) Definition of $c$

$\blacksquare$


Also see

Sum of Elements in Inverse of Cauchy Matrix

Vandermonde Matrix Identity for Cauchy Matrix

