# P-Product Metrics on Real Vector Space are Topologically Equivalent

## Theorem

For $n \in \N$, let $\R^n$ be a Euclidean space.

Let $p \in \R_{\ge 1}$.

Let $d_p$ be the $p$-product metric on $\R^n$.

Let $d_\infty$ be the Chebyshev distance on $\R^n$.

Then $d_p$ and $d_\infty$ are topologically equivalent.

## Proof

Let $r, t \in \R_{\ge 1}$.

Without loss of generality, assume that $r \le t$.

Let $d_r$ denote the metric defined as $\displaystyle d_r \left({x, y}\right) = \left({\sum_{i \mathop = 1}^n \left|{x_i - y_i}\right|^r}\right)^{1/r}$.

For all $x, y \in \R^n$, we are going to show that:

$\displaystyle d_r \left({x, y}\right) \ge d_\infty \left({x, y}\right) \ge n^{-1} d_r \left({x, y}\right)$

This establishes Lipschitz equivalence between all of these metrics, from which topological equivalence follows.
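For concreteness, the two metrics can be sketched in code; this is a minimal illustration, and the helper names `d_p` and `d_chebyshev` are mine, not part of the proof:

```python
def d_p(x, y, p):
    """p-product metric on R^n: (sum_i |x_i - y_i|^p)^(1/p)."""
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1.0 / p)

def d_chebyshev(x, y):
    """Chebyshev distance on R^n: max_i |x_i - y_i|."""
    return max(abs(a - b) for a, b in zip(x, y))

x, y = (1.0, -2.0, 3.0), (4.0, 0.0, 3.5)
print(d_p(x, y, 1))        # taxicab: 3 + 2 + 0.5 = 5.5
print(d_p(x, y, 2))        # Euclidean: sqrt(13.25)
print(d_chebyshev(x, y))   # largest coordinate difference: 3.0
```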

### Inequalities for Chebyshev Distance

By definition of the Chebyshev distance on $\R^n$, we have:

$\displaystyle d_\infty \left({x, y}\right) = \max_{i \mathop = 1}^n {\left\vert{x_i - y_i}\right\vert}$

where $x = \left({x_1, x_2, \ldots, x_n}\right)$ and $y = \left({y_1, y_2, \ldots, y_n}\right)$.

Let $j$ be chosen so that:

$\displaystyle \left\vert{x_j - y_j}\right\vert = \max_{i \mathop = 1}^n {\left\vert{x_i - y_i}\right\vert}$

Then:

$\displaystyle d_\infty \left({x, y}\right) = \left({\left\vert{x_j - y_j}\right\vert^p}\right)^{1/p} \le \left({\sum_{i \mathop = 1}^n \left\vert{x_i - y_i}\right\vert^p}\right)^{1/p} = d_p \left({x, y}\right) \le \left({n \left\vert{x_j - y_j}\right\vert^p}\right)^{1/p} = n^{1/p} d_\infty \left({x, y}\right)$
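This chain of inequalities is easy to spot-check numerically; a small sketch with an arbitrarily chosen pair of points (helper name `d_p` is mine):

```python
x = [0.3, -1.7, 2.2, 5.0]
y = [1.1, 0.4, -0.6, 4.2]
n = len(x)

def d_p(p):
    # (sum_i |x_i - y_i|^p)^(1/p) for the fixed pair x, y above
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1.0 / p)

d_inf = max(abs(a - b) for a, b in zip(x, y))

for p in (1, 1.5, 2, 3, 7):
    # d_inf <= d_p <= n^(1/p) * d_inf
    assert d_inf <= d_p(p) <= n ** (1.0 / p) * d_inf
```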

$\Box$

### Inequality for General Case

We show that $\displaystyle d_r \left({x, y}\right) \ge d_t \left({x, y}\right)$, which is equivalent to proving that:

$\displaystyle \left({\sum_{i \mathop = 1}^n \left|{x_i - y_i}\right|^r}\right)^{1/r} \ge \left({\sum_{i \mathop = 1}^n \left|{x_i - y_i}\right|^t}\right)^{1/t}$

For all $i \in \left[{1 \,.\,.\, n}\right]$, let $s_i = \left|{x_i - y_i}\right|$.

Suppose $s_k = 0$ for some $k \in \left[{1 \,.\,.\, n}\right]$.

Then the term for $i = k$ contributes $0$ to both sums, and (after relabelling the remaining coordinates) the problem reduces to the equivalent one of showing that:

$\displaystyle \left({\sum_{i \mathop = 1}^{n - 1} \left|{x_i - y_i}\right|^r}\right)^{1/r} \ge \left({\sum_{i \mathop = 1}^{n - 1} \left|{x_i - y_i}\right|^t}\right)^{1/t}$

that is, of reducing $n$ by $1$.

Note that when $n = 1$, both sides reduce to $\left|{x_1 - y_1}\right|$, so $d_r \left({x, y}\right) = d_t \left({x, y}\right)$.

So, let us start with the assumption that $\forall i \in \left[{1 \,.\,.\, n}\right]: s_i > 0$.

Let $\displaystyle u = \sum_{i \mathop = 1}^n \left|{x_i - y_i}\right|^r = \sum_{i \mathop = 1}^n s_i^r$, and $v = \dfrac 1 r$.

From Derivative of Function to Power of Function, $D_r \left({u^v}\right) = v u^{v - 1} D_r \left({u}\right) + u^v \ln u \, D_r \left({v}\right)$.

Here:

$\displaystyle D_r \left({u}\right) = \sum_{i \mathop = 1}^n s_i^r \ln s_i$ from Derivative of Exponential Function and Sum Rule for Derivatives
$D_r \left({v}\right) = - \dfrac 1 {r^2}$ from Power Rule for Derivatives

In the case where $n = 1$, we have $u^v = \left({s_1^r}\right)^{1/r} = s_1$, which is constant in $r$, so:

$D_r \left({u^v}\right) = 0$

When $n > 1$, we have:

$\displaystyle D_r \left({\left({\sum_{i \mathop = 1}^n s_i^r}\right)^{1/r}}\right) = \dfrac 1 r \left({\sum_{i \mathop = 1}^n s_i^r}\right)^{1/r - 1} \left({\sum_{i \mathop = 1}^n s_i^r \ln s_i}\right) - \dfrac {\left({\sum_{i \mathop = 1}^n s_i^r}\right)^{1/r} \ln \left({\sum_{i \mathop = 1}^n s_i^r}\right)} {r^2}$

$\displaystyle = \dfrac {\left({\sum_{i \mathop = 1}^n s_i^r}\right)^{1/r}} r \left({\dfrac {\sum_{i \mathop = 1}^n s_i^r \ln s_i} {\sum_{i \mathop = 1}^n s_i^r} - \dfrac {\ln \left({\sum_{i \mathop = 1}^n s_i^r}\right)} r}\right)$

$\displaystyle = \dfrac {\left({\sum_{i \mathop = 1}^n s_i^r}\right)^{1/r}} r \cdot \dfrac {r \left({\sum_{i \mathop = 1}^n s_i^r \ln s_i}\right) - \left({\sum_{i \mathop = 1}^n s_i^r}\right) \ln \left({\sum_{i \mathop = 1}^n s_i^r}\right)} {r \left({\sum_{i \mathop = 1}^n s_i^r}\right)}$

$\displaystyle = K \left({r \left({\sum_{i \mathop = 1}^n s_i^r \ln s_i}\right) - \left({\sum_{i \mathop = 1}^n s_i^r}\right) \ln \left({\sum_{i \mathop = 1}^n s_i^r}\right)}\right)$ where $\displaystyle K = \dfrac {\left({\sum_{i \mathop = 1}^n s_i^r}\right)^{1/r}} {r^2 \left({\sum_{i \mathop = 1}^n s_i^r}\right)} > 0$

$\displaystyle = K \left({\sum_{i \mathop = 1}^n s_i^r \ln \left({s_i^r}\right) - \left({\sum_{i \mathop = 1}^n s_i^r}\right) \ln \left({\sum_{i \mathop = 1}^n s_i^r}\right)}\right)$ by Logarithms of Powers

$\displaystyle = K \sum_{j \mathop = 1}^n s_j^r \left({\ln \left({s_j^r}\right) - \ln \left({\sum_{i \mathop = 1}^n s_i^r}\right)}\right)$

$\displaystyle = K \sum_{j \mathop = 1}^n s_j^r \ln \left({\frac {s_j^r} {\sum_{i \mathop = 1}^n s_i^r}}\right)$

where $K > 0$ because every $s_i > 0$ and $r > 0$.

Since $n > 1$ and every $s_i > 0$, each $s_j^r$ is strictly less than $\displaystyle \sum_{i \mathop = 1}^n s_i^r$, so $\displaystyle \dfrac {s_j^r} {\sum_{i \mathop = 1}^n s_i^r} < 1$ for all $j \in \left\{{1, \ldots, n}\right\}$.

From Logarithm of 1 is 0 and Logarithm is Strictly Increasing, their logarithms are therefore negative.

So for $n > 1$:

$\displaystyle D_r \left({\left({\sum_{i \mathop = 1}^n s_i^r}\right)^{1/r}}\right) < 0$
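As a sanity check on the closed form for the derivative, it can be compared against a central finite difference and confirmed negative; the $s_i$ below are arbitrary, with $n > 1$, and the helper names are mine:

```python
import math

s = [0.5, 1.3, 2.0, 0.9]   # arbitrary s_i > 0 with n > 1

def f(r):
    # f(r) = (sum_i s_i^r)^(1/r)
    return sum(si ** r for si in s) ** (1.0 / r)

def df(r):
    # closed form from the proof: K * sum_j s_j^r * ln(s_j^r / sum_i s_i^r)
    total = sum(si ** r for si in s)
    K = total ** (1.0 / r) / (r ** 2 * total)
    return K * sum(si ** r * math.log(si ** r / total) for si in s)

r, h = 1.7, 1e-6
numeric = (f(r + h) - f(r - h)) / (2 * h)
assert abs(numeric - df(r)) < 1e-6   # closed form matches finite difference
assert df(r) < 0                     # strictly negative, as derived
```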

So, from Derivative of Monotone Function, it follows that (given that every $s_i > 0$) $\displaystyle \left({\sum_{i \mathop = 1}^n s_i^r}\right)^{1/r}$ is strictly decreasing as a function of $r$ when $n > 1$, and constant when $n = 1$.

As we assumed $r \le t$, we have $d_r \left({x, y}\right) \ge d_t \left({x, y}\right)$.
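This monotonicity in $r$ can be spot-checked directly; a short sketch with arbitrarily chosen points, including one equal coordinate pair to exercise the $s_k = 0$ reduction (helper name `d_r` is mine):

```python
x = [2.0, -1.0, 0.5, 3.0]
y = [0.0, 1.5, 0.5, 1.0]   # third coordinates agree, so s_3 = 0

def d_r(r):
    # (sum_i |x_i - y_i|^r)^(1/r) for the fixed pair x, y above
    return sum(abs(a - b) ** r for a, b in zip(x, y)) ** (1.0 / r)

values = [d_r(r) for r in (1, 1.5, 2, 4, 8, 16)]
# d_r(x, y) is non-increasing as r grows
assert all(u >= v for u, v in zip(values, values[1:]))
```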

$\Box$

Combining the inequalities (the Chebyshev inequalities with $p = r$ and with $p = 1$, and the general case applied to $1 \le r$), we have:

$d_r \left({x, y}\right) \ge d_\infty \left({x, y}\right) \ge n^{-1} d_1 \left({x, y}\right) \ge n^{-1} d_r \left({x, y}\right)$

Therefore, $d_r$ and $d_\infty$ are Lipschitz equivalent for all $r \in \R_{\ge 1}$, and since Lipschitz equivalent metrics are topologically equivalent, the result follows.
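The full Lipschitz chain can be spot-checked numerically on random points; a minimal sketch (helper name `d_p` is mine):

```python
import random

random.seed(7)
n = 6
x = [random.uniform(-5, 5) for _ in range(n)]
y = [random.uniform(-5, 5) for _ in range(n)]

def d_p(p):
    # (sum_i |x_i - y_i|^p)^(1/p) for the fixed pair x, y above
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1.0 / p)

d_inf = max(abs(a - b) for a, b in zip(x, y))

for r in (1, 2, 3.5, 10):
    # d_r >= d_inf >= n^(-1) d_1 >= n^(-1) d_r
    assert d_p(r) >= d_inf >= d_p(1) / n >= d_p(r) / n
```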

$\blacksquare$