User:Julius


Current focus

  • So I just noticed that vector notation is being used in Gelfand's for higher dimensional functionals. This implies rewriting all multivariable functionals. Implement this gradually.

Lemmas and theorems for Bernstein's Theorem on Unique Extrema (1978)

Raw material

Weierstrass Approximation Theorem

Let $x \in C \closedint 0 1$ be a continuous real function on $\closedint 0 1$.

Let $n \in \N$.

For $t \in \closedint 0 1$ define

$\displaystyle \map {\paren {B_n x} } t = \sum_{k \mathop = 0}^n \map x {\frac k n} \binom n k t^k \paren {1 - t}^{n - k}$

For $t \in \closedint 0 1$, $0 \le k \le n$, let:

$\displaystyle \map {p_{n,k}} t := \binom n k t^k \paren {1 - t}^{n - k}$

By the Binomial Theorem:

$\displaystyle \sum_{k \mathop = 0}^n \map {p_{n,k} } t = 1$

Lemma

$\sum_{k \mathop = 0}^n k \map {p_{n,k} } t = nt$

From the Binomial Theorem:

\(\displaystyle \paren {x + y}^n\) \(=\) \(\displaystyle \sum_{k = 0}^n \binom n k y^k x^{n - k}\)
\(\displaystyle \leadsto \ \ \) \(\displaystyle n \paren {x + y}^{n - 1} \paren {\dfrac {\d x}{\d y} + 1}\) \(=\) \(\displaystyle \sum_{k = 0}^n \binom n k \paren {k y^{k - 1} x^{n - k} + y^k \paren {n - k} x^{n - k - 1} \dfrac {\d x}{\d y} }\)
\(\displaystyle \) \(=\) \(\displaystyle \sum_{k = 0}^n k p_{n,k} \paren{\frac 1 y - \frac 1 x \dfrac {\d x} {\d y} } + n \frac 1 x \dfrac {\d x} {\d y}\sum_{k = 0}^n p_{n,k}\)
Substituting $y = t$, $x = 1 - t$ (so that $\dfrac {\d x} {\d y} = -1$ and $x + y = 1$):

\(\displaystyle \leadsto \ \ \) \(\displaystyle 0\) \(=\) \(\displaystyle \sum_{k = 0}^n k p_{n,k} \paren {\frac 1 t + \frac 1 {1 - t} } - \frac n {1 - t}\)
\(\displaystyle \leadsto \ \ \) \(\displaystyle n t\) \(=\) \(\displaystyle \sum_{k = 0}^n k p_{n,k}\)
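
An alternative one-line check of this lemma (a sketch, using the identity $k \dbinom n k = n \dbinom {n - 1} {k - 1}$):

$\displaystyle \sum_{k \mathop = 0}^n k \map {p_{n,k} } t = \sum_{k \mathop = 1}^n n \binom {n - 1} {k - 1} t^k \paren {1 - t}^{n - k} = n t \sum_{j \mathop = 0}^{n - 1} \binom {n - 1} j t^j \paren {1 - t}^{n - 1 - j} = n t$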

Lemma

$\sum_{k \mathop = 0}^n \paren {k - nt}^2 \map {p_{n,k} } t = nt \paren {1 - t}$
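
A short sketch of a proof, using the preceding lemma together with the second moment of the binomial distribution, $\sum_{k \mathop = 0}^n k^2 \map {p_{n,k} } t = n t \paren {1 - t} + n^2 t^2$ (a standard fact, stated here without proof):

\(\displaystyle \sum_{k \mathop = 0}^n \paren {k - n t}^2 \map {p_{n,k} } t\) \(=\) \(\displaystyle \sum_{k \mathop = 0}^n k^2 \map {p_{n,k} } t - 2 n t \sum_{k \mathop = 0}^n k \map {p_{n,k} } t + n^2 t^2 \sum_{k \mathop = 0}^n \map {p_{n,k} } t\)
\(\displaystyle \) \(=\) \(\displaystyle \paren {n t \paren {1 - t} + n^2 t^2} - 2 n t \cdot n t + n^2 t^2\)
\(\displaystyle \) \(=\) \(\displaystyle n t \paren {1 - t}\)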

Let $\delta > 0$.

Then, since $\dfrac {\paren {k - n t}^2} {\delta^2 n^2} \ge 1$ whenever $\size {\frac k n - t} \ge \delta$:

\(\displaystyle \sum_{k : \size {\frac k n - t} \ge \delta} \map {p_{n,k} } t\) \(\le\) \(\displaystyle \sum_{k : \size {\frac k n - t} \ge \delta} \map {p_{n,k} } t \frac {\paren {k - nt}^2} {\delta^2 n^2}\)
\(\displaystyle \) \(\le\) \(\displaystyle \frac 1 {n^2 \delta^2} \sum_{k = 0}^n \paren{k - nt}^2 \map {p_{n,k} } t\)
\(\displaystyle \) \(=\) \(\displaystyle \frac {t \paren {1 - t} } {n \delta^2}\)
\(\displaystyle \) \(\le\) \(\displaystyle \frac 1 {4 n \delta^2}\)

$0 \le \paren {\sqrt t - \sqrt {1 - t} }^2 = 1 - 2 \sqrt {t \paren {1 - t} }$ for all $t \in \closedint 0 1$, so that $t \paren {1 - t} \le \dfrac 1 4$, justifying the last step above.

$\map {\omega_\delta} x := \sup_{\size {t - s} < \delta} \size {\map x s - \map x t}$ (the modulus of continuity of $x$)

\(\displaystyle \size {\map {B_n x} t - \map x t}\) \(=\) \(\displaystyle \size {\map {B_n x} t - \map x t \sum_{k = 0}^n \map {p_{n,k} } t}\)
\(\displaystyle \) \(=\) \(\displaystyle \size {\sum_{k = 0}^n \map x {\frac k n} \map {p_{n,k} } t - \map x t \sum_{k = 0}^n \map {p_{n,k} } t}\)
\(\displaystyle \) \(\le\) \(\displaystyle \sum_{k = 0}^n \size {\map x {\frac k n} - \map x t} \map {p_{n,k} } t\)
\(\displaystyle \) \(=\) \(\displaystyle \sum_{k : \size {\frac k n - t} < \delta} \size {\map x {\frac k n} - \map x t} \map {p_{n,k} } t + \sum_{k : \size {\frac k n - t} \ge \delta} \size {\map x {\frac k n} - \map x t} \map {p_{n,k} } t\)
\(\displaystyle \) \(\le\) \(\displaystyle \map {\omega_\delta} x \sum_{k : \size {\frac k n - t} < \delta} \map {p_{n,k} } t + 2 \norm {x}_\infty \frac 1 {4 n \delta^2}\)
\(\displaystyle \) \(\le\) \(\displaystyle \map {\omega_\delta} x + \frac {\norm {x}_\infty} {2 n \delta^2}\)

Let $\epsilon > 0$.

$x$ is uniformly continuous, so we choose $\delta >0$ such that $\map {\omega_\delta} x < \frac \epsilon 2$.

Choose $n > \frac {\norm {x}_\infty} {\epsilon \delta^2}$.

Then:

$\norm {B_n x - x}_\infty < \epsilon$

$\blacksquare$
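
As a quick numerical illustration (a minimal sketch, not part of the proof; the test function $\map x t = \size {t - \frac 1 2}$, the evaluation grid, and the use of scipy.stats.binom.pmf for $\map {p_{n,k} } t$ are illustrative choices):

```python
import numpy as np
from scipy.stats import binom

def bernstein(x, n, t):
    """Evaluate the n-th Bernstein polynomial (B_n x)(t) for x defined on [0, 1]."""
    k = np.arange(n + 1)
    # binom.pmf(k, n, t) = C(n, k) t^k (1 - t)^(n - k) = p_{n,k}(t)
    return np.sum(x(k / n) * binom.pmf(k, n, t))

# A continuous but non-smooth test function; uniform convergence still holds.
x = lambda s: np.abs(s - 0.5)
ts = np.linspace(0, 1, 201)
for n in (10, 100, 1000):
    err = max(abs(bernstein(x, n, t) - x(t)) for t in ts)
    print(f"n = {n:4d}   sup-norm error on the grid ~ {err:.4f}")
```

The printed errors should decrease as $n$ grows, illustrating the uniform convergence $\norm {B_n x - x}_\infty \to 0$.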

Digestion of the following topics is in progress

Example 1

Suppose that:

$J \sqbrk y = \int_1^2 \frac {\sqrt {1+y'^2} } {x} \rd x$

with the following boundary conditions:

$\map y 1 = 0$
$\map y 2 = 1$


Then the smooth minimizer of $J$ is a circle of the following form:

$\paren {y - 2}^2 + x^2 = 5$

Proof

$J$ is of the form

$J \sqbrk y = \int_a^b \map F {x, y'} \rd x$

Since $F$ does not depend on $y$, we can use the "no $y$" special case of Euler's equation:

$F_{y'} = C$

i.e.

$\frac {y'} {x \sqrt {1 + y'^2} } = C$

or

$y' = \frac {C x} {\sqrt {1 - C^2 x^2} }$

Integrating (for instance with the substitution $u = 1 - C^2 x^2$):

$y = C_1 - \frac {\sqrt {1 - C^2 x^2} } C$

or

$\paren {y - C_1}^2 + x^2 = C^{-2}$

From the conditions $\map y 1 = 0$, $\map y 2 = 1$:

$\paren {0 - C_1}^2 + 1 = C^{-2}$
$\paren {1 - C_1}^2 + 4 = C^{-2}$

Subtracting the second equation from the first yields $2 C_1 - 4 = 0$, hence:

$C_1 = 2$
$C = \dfrac 1 {\sqrt 5}$


$\blacksquare$

Example 3

$J \sqbrk y = \int_a^b \paren {x - y}^2 \rd x$

is minimized by

$\map y x = x$

Proof

Since $F$ does not depend on $y'$, Euler's equation reduces to:

$F_y = 0$

i.e.

$-2 \paren {x - y} = 0$

whence $\map y x = x$.


$\blacksquare$

Example p31

Suppose:

$J \sqbrk r = \int_{\phi_0}^{\phi_1} \sqrt{r^2 + r'^2} \rd \phi$

Euler's Equation:

$\displaystyle \frac r {\sqrt{r^2 + r'^2} } - \dfrac \d {\d \phi} \frac {r'} {\sqrt{r^2 + r'^2} } = 0$

Apply change of variables:

$x = r \cos \phi, y = r \sin \phi$

The integral becomes:

$\displaystyle \int_{x_0}^{x_1} \sqrt{1 + y'^2} \rd x$

Euler's equation:

$y'' = 0$

Its solution:

$y = \alpha x + \beta$

or

$r \sin \phi = \alpha r \cos \phi + \beta$


$\blacksquare$

Example

$J \sqbrk y = \int_{x_0}^{x_1} \map f {x,y} \sqrt {1+y'^2}\rd x$
$F_{y'} = \map f {x,y} \frac {y'} {\sqrt{1 + y'^2} }=\frac {y' F} {1 + y'^2}$
$F + \paren {\phi' - y'}F_{y'} = \frac {\paren{1+y'\phi'}F} {1+y'^2} = 0$
$F + \paren {\psi' - y'}F_{y'} = \frac {\paren{1+y'\psi'}F} {1+y'^2} = 0$

i.e.

$y' = -\frac 1 {\phi'}$
$y' = - \frac 1 {\psi'}$

i.e. the extremal meets the boundary curves $y = \map \phi x$ and $y = \map \psi x$ at right angles (assuming $\map f {x, y} \ne 0$ at the endpoints): transversality reduces to orthogonality.


$\blacksquare$

Example: points on surfaces

$J \sqbrk {y,z} = \int_{x_0}^{x_1} \map F {x,y,z,y',z'} \rd x$

Transversality conditions:

$\sqbrk {F_{y'} + \dfrac {\partial \phi} {\partial y} \paren {F - y'F_{y'} - z'F_{z'} } }|_{x = x_0} = 0$
$\sqbrk {F_{z'} + \dfrac {\partial \phi} {\partial z} \paren {F - y'F_{y'} - z'F_{z'} } }|_{x = x_0} = 0$
$\sqbrk {F_{y'} + \dfrac {\partial \phi} {\partial y} \paren {F - y'F_{y'} - z'F_{z'} } }|_{x = x_1} = 0$
$\sqbrk {F_{z'} + \dfrac {\partial \phi} {\partial z} \paren {F - y'F_{y'} - z'F_{z'} } }|_{x = x_1} = 0$


$\blacksquare$

Example: Legendre transformation

$\map f \xi = \frac {\xi^a} a, a>1$
$\map {f'} \xi = p = \xi^{a-1}$

i.e.

$\xi = p^{\frac {1} {a-1} }$
$H = - \frac {\xi^a} a + p \xi = - \frac {p^{\frac a {a - 1} } } a + p \, p^{\frac 1 {a - 1} } = p^{\frac a {a - 1} } \paren {1 - \frac 1 a}$

Hence:

$\map H p = \frac {p^b} b$

where:

$\frac 1 a + \frac 1 b = 1$
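
A short check of the last step: with $b = \dfrac a {a - 1}$ we have $\dfrac 1 a + \dfrac 1 b = 1$ and $1 - \dfrac 1 a = \dfrac {a - 1} a = \dfrac 1 b$, hence:

$p^{\frac a {a - 1} } \paren {1 - \frac 1 a} = \frac {p^b} b$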


$\blacksquare$

Example

$J \sqbrk y = \int_a^b \paren {Py'^2 + Q y^2} \rd x$
$p = 2 P y', H = P y'^2 - Q y^2$

Hence:

$H = \frac {p^2} {4 P} - Q y^2$

Canonical equations:

$\dfrac {\d p} {\d x} = 2 Q y$
$\dfrac {\d y} {\d x} = \frac p {2 P}$

Euler's Equation:

$2 y Q - \dfrac \d {\d x} \paren {2 P y'} = 0$
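
As a short consistency check, eliminating $p$ from the canonical equations recovers Euler's equation: the second equation gives $p = 2 P y'$, and substituting into the first:

$\dfrac \d {\d x} \paren {2 P y'} = 2 Q y$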


$\blacksquare$

Example: Noether's theorem 1

$J \sqbrk y = \int_{x_0}^{x_1} y'^2 \rd x$

is invariant under the transformation:

$x^* = x + \epsilon, y^* = y$
$y^* = \map y {x^* - \epsilon} = \map {y^*} {x^*}$

Then:

$J \sqbrk {\gamma^*} = \int_{x_0^*}^{x_1^*} \sqbrk {\dfrac {\d \map {y^*} {x^*} } {\d x^*} }^2 \rd x^* = \int_{x_0 + \epsilon}^{x_1 + \epsilon} \sqbrk {\dfrac {\d \map y {x^* - \epsilon} } {\d x^*} }^2 \rd x^* = \int_{x_0}^{x_1} \sqbrk {\dfrac {\d \map y x} {\d x} }^2 \rd x = J \sqbrk \gamma$
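
For this functional the corresponding conserved quantity along extremals (a minimal sketch, using Gelfand and Fomin's convention $H = -F + y' F_{y'}$ for invariance under translations in $x$) is:

$H = -y'^2 + y' \cdot 2 y' = y'^2 = \const$

so extremals have constant slope, in agreement with Euler's equation $y'' = 0$.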

Example: Noether's theorem 2

$J \sqbrk y = \int_{x_0}^{x_1} x y'^2 \rd x$
\(\displaystyle J \sqbrk {y^*}\) \(=\) \(\displaystyle \int_{x_0^*}^{x_1^*} x^* \sqbrk {\dfrac {\d \map {y^*} {x^*} } {\d x^*} }^2 \rd x^*\)
\(\displaystyle \) \(=\) \(\displaystyle \int_{x_0 + \epsilon}^{x_1 + \epsilon} x^* \sqbrk {\dfrac {\d \map y {x^* - \epsilon} } {\d x^*} }^2 \rd x^*\)
\(\displaystyle \) \(=\) \(\displaystyle \int_{x_0}^{x_1} \paren {x + \epsilon} \sqbrk {\dfrac {\d \map y x} {\d x} }^2 \rd x\)
\(\displaystyle \) \(=\) \(\displaystyle J \sqbrk \gamma + \epsilon \int_{x_0}^{x_1} \sqbrk {\dfrac {\d \map y x} {\d x} }^2 \rd x\)
\(\displaystyle \) \(\ne\) \(\displaystyle J \sqbrk \gamma\)


$\blacksquare$

Example: Noether's theorem 3

$J \sqbrk y = \int_{x_0}^{x_1} \map F {y, y'} \rd x$

Invariant under $x^* = x + \epsilon, y_i^* = y_i$

I.e. $\phi = 1, \psi_i = 0$

The corresponding conservation law reduces to $H = \const$.
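
A minimal sketch of the reduction, assuming the form of Noether's conserved quantity given in Gelfand and Fomin:

$\displaystyle \sum_{i \mathop = 1}^n F_{y_i'} \psi_i + \paren {F - \sum_{i \mathop = 1}^n y_i' F_{y_i'} } \phi = \const$

With $\phi = 1$ and $\psi_i = 0$ this becomes $F - \sum_i y_i' F_{y_i'} = -H = \const$.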


$\blacksquare$

Momentum of the system:

$P_x = \sum_{i \mathop = 1}^n p_{ix}, P_y = \sum_{i \mathop = 1}^n p_{iy}, P_z = \sum_{i \mathop = 1}^n p_{iz}$

(Examples: attraction to a fixed point, attraction to a homogeneous distribution on an axis)

Geodetic distance: Examples

If $J$ is arclength, $S$ is distance.

If $J$ is the time needed to pass through a segment of an optical medium, then $S$ is the time needed to pass through the whole optical body.

If $J$ is action, then $S$ is the minimal action.

Examples of quadratic functionals

1) $B \sqbrk {x, y} = \int_{t_0}^{t_1} \map x t \map y t \rd t$

Corresponding quadratic functional

$A \sqbrk x = \int_{t_0}^{t_1} \map {x^2} t \rd t$

2) $B \sqbrk {x, y} = \int_{t_0}^{t_1} \map \alpha t \map x t \map y t \rd t$

Corresponding quadratic functional

$A \sqbrk x = \int_{t_0}^{t_1} \map \alpha t \map {x^2} t \rd t$

3)

$A \sqbrk x = \int_{t_0}^{t_1} \paren {\map \alpha t \map {x^2} t + \map \beta t \map x t \map {x'} t+ \map \gamma t \map {x'^2} t} \rd t$

4)

$B \sqbrk {x, y} = \int_a^b \int_a^b \map K {s, t} \map x s \map y t \rd s \rd t$
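
The corresponding quadratic functional, obtained as in the previous cases by setting $y = x$, is:

$A \sqbrk x = \int_a^b \int_a^b \map K {s, t} \map x s \map x t \rd s \rd t$

For 3), one possible symmetric bilinear functional with $B \sqbrk {x, x} = A \sqbrk x$ (an illustrative choice, not stated above) is:

$B \sqbrk {x, y} = \int_{t_0}^{t_1} \paren {\map \alpha t \map x t \map y t + \frac {\map \beta t} 2 \paren {\map x t \map {y'} t + \map {x'} t \map y t} + \map \gamma t \map {x'} t \map {y'} t} \rd t$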


Second variations of simple functionals

Strong minimizers' examples

Vibrating string

Vibrating membrane

Vibrating plate

Klein-Gordon field

Multidimensional laws of conservation

Angular momentum tensor

Examples of conservation laws for the Klein-Gordon and Maxwell EM fields

Introduction to optimal control

Functional Analysis

$\paren{C \closedint a b,\norm{\cdot}_\infty }$ is a Banach space.

Let $\sequence{x_n}_{n \in \N}$ be a Cauchy sequence.

$\forall \epsilon \in \R_{> 0} : \exists N \in \N : \forall n, m > N : \norm {x_n - x_m}_\infty < \epsilon$

Each element of $\sequence{x_n}_{n \in \N}$ is a function of $t$:

$\sequence{x_n}_{n \in \N} = \sequence{\map {x_n} t }_{n \in \N}$

Let $t \in \closedint a b$.

Then:

$\displaystyle \forall n, m > N : \size {\map {x_n} t - \map {x_m} t} \le \max_{\tau \in \closedint a b} \size {\map {x_n} \tau - \map {x_m} \tau} = \norm {x_n - x_m}_\infty < \epsilon$

Hence, $\sequence{\map {x_n} t}_{n \in \N}$ is a Cauchy sequence in $\R$.

$\R$ is complete.

Therefore, $\sequence{\map {x_n} t}_{n \in \N}$ converges; denote its limit by $\map x t$.

Let $\epsilon > 0$.

Choose $N$ such that $\forall n,m > N : \norm{x_n - x_m}_\infty \le \frac \epsilon 3$.

Let $\tau \in \closedint a b$.

Then $\forall n > N : \size {\map {x_n} \tau - \map {x_{N + 1}} \tau } \le \norm {x_n - x_{N + 1} }_\infty \le \frac \epsilon 3$

Take the limit $n \to \infty$:

$\lim_{n \to \infty} \size {\map {x_n} \tau - \map {x_{N + 1}} \tau } = \size {\map x \tau - \map {x_{N + 1}} \tau } \le \frac \epsilon 3$

which holds for all $\tau \in \closedint a b$.

Now $x_{N + 1} \in C \closedint a b$, so $x_{N + 1}$ is continuous at $t$:

$\exists \delta > 0: \size {\tau - t} < \delta \implies \size {\map {x_{N+1} } t - \map {x_{N+1} } \tau} \le \frac \epsilon 3$

Thus, for $\size {\tau - t} < \delta$:

$\size {\map x \tau - \map x t} = \size {\map x \tau - \map {x_{N+1}} \tau + \map {x_{N+1}} \tau - \map {x_{N+1}} t + \map {x_{N+1}} t - \map x t} \le$
$\size {\map x \tau - \map {x_{N+1}} \tau} + \size {\map {x_{N+1}} \tau - \map {x_{N+1}} t} + \size {\map {x_{N+1}} t - \map x t} \le \frac \epsilon 3 + \frac \epsilon 3 + \frac \epsilon 3 = \epsilon$

Hence $x$ is continuous at $t$.

Since $t \in \closedint a b$ was arbitrary, $x$ is continuous on the whole interval.

Finally, show that $\sequence {x_n}_{n \in \N}$ converges to $x$.

Let $\epsilon > 0$.

Choose $N$ such that $\forall n,m > N : \norm{x_n - x_m}_\infty < \epsilon$

Fix $n > N$.

Let $t \in \closedint a b$.

Then $\forall m > N: \size {\map {x_n} t - \map {x_m} t} \le \norm {x_n - x_m}_\infty < \epsilon$

Thus $\size{\map {x_n} t - \map x t} = \lim_{m \to \infty} \size {\map {x_n} t - \map {x_m} t} \le \epsilon$

Since $t$ was arbitrary: $\norm {x_n - x}_\infty = \max_{t \in \closedint a b } \size{\map {x_n} t - \map x t} \le \epsilon$

Since $n > N$ was arbitrary:

$\forall n > N : \norm {x_n - x}_\infty \le \epsilon$

Therefore $\lim_{n \to \infty} x_n = x$ in $C \closedint a b$.

$\blacksquare$