# User:Julius

## Current focus

• Build up the bulk of the knowledge on the calculus of variations following Gelfand's Calculus of Variations, then cross-check against a couple of other books and gradually improve the proofs.

## Theorem

Let $V$ be a complex vector space.

Let $\map L {V, \C}$ be the space of all linear transformations from $V$ to $\C$.

Let $\ell, L \in \map L {V, \C}$ be such that $\ker \ell \subseteq \ker L$ where $\ker$ is the kernel.

Then:

$\exists c \in \C : L = c \ell$

## Proof

Suppose $\ell = \mathbf 0$.

Then:

$\paren {V = \ker \ell \subseteq \ker L} \implies \ker L = V$

Therefore $L = 0$, and we can set $c = 0$ to have:

$L = \mathbf 0 = 0 \cdot \ell$

Suppose $\ell \ne \mathbf 0$.

Then:

$\exists v_0 \in V : v_0 \ne \mathbf 0 : \map \ell {v_0} \ne 0$

Let $v \in V$.

Let:

$w = v - c_v v_0$

where $c_v \in \C$ is a constant to be chosen.

Then:

$\map \ell w = \map \ell v - c_v \map \ell {v_0}$

Choose:

$c_v = \dfrac {\map \ell v} {\map \ell {v_0} }$

Then:

$\map \ell w = \map \ell v - \dfrac {\map \ell v} {\map \ell {v_0} } \map \ell {v_0} = 0$

Hence:

$w \in \ker \ell$

Since $w \in \ker \ell \subseteq \ker L$, we have that $\map L w = 0$.

Furthermore:

$\ds \map L v = \map L {c_v v_0 + w} = c_v \map L {v_0} + \map L w = c_v \map L {v_0}$

Since $c_v = \dfrac {\map \ell v} {\map \ell {v_0} }$, it follows that:

$\map L v = \dfrac {\map L {v_0} } {\map \ell {v_0} } \map \ell v$

As $v \in V$ was arbitrary, the result holds with $c = \dfrac {\map L {v_0} } {\map \ell {v_0} }$.

$\blacksquare$

## Theorem

Let $f$ be a real function whose associated distribution $T_f \in \map {\DD'} \R$ is a distributional solution to the following differential equation:

$\paren {\dfrac \d {\d x} - \lambda} T_f = \mathbf 0$

Then $f$ is a classical solution to

$\paren {\dfrac \d {\d x} - \lambda} f = 0$

## Proof

$\ds \paren {\map \exp {- \lambda x} T}' = - \lambda \map \exp {- \lambda x} T + \map \exp {-\lambda x} T' = \map \exp {-\lambda x} \paren {T' - \lambda T} = \map \exp {-\lambda x} \paren {\dfrac \d {\d x} - \lambda} T$

Finish with the theorem that a distribution whose distributional derivative vanishes is a constant distribution, so that $\map \exp {-\lambda x} T = c$ and hence $T = c \map \exp {\lambda x}$.

## Theorem

Let $f \in \map {\CC^\infty} \R$ be a smooth real function.

Let $T \in \map {\DD'} \R$ be a distribution.

Suppose:

$\paren {\dfrac \d {\d x} - \lambda} T = f \qquad (*)$

Then $T$ is a classical solution of $(*)$, and:

$T = F + c \map \exp {\lambda x}$

where $F \in \map {\CC^\infty} \R$ is a classical solution of (*).

## Proof

By assumption $f \in \map {\CC^\infty} \R$.

Therefore:

$\exists F \in \map {\CC^\infty} \R : \paren {\dfrac \d {\d x} - \lambda} F = f$

Hence:

$\paren {\dfrac \d {\d x} - \lambda} \paren {T - F} = f - f = 0$

From part (1):

$T - F = c \map \exp {\lambda x}$

Rearranging terms yields the desired result.

$\blacksquare$

## Theorem

Let $D$ be an ordinary differential operator with constant coefficients:

$\ds D = \sum_{k \mathop = 0}^n a_k \paren {\dfrac \d {\d x}}^k$

Let $f \in \map {\CC^\infty} \R$ be a smooth real function.

Let $T \in \map {\DD'} \R$ be a distribution.

Suppose $T$ is a distributional solution to $D T = f$.

Then $T$ is a classical solution, that is, $T = T_F$ for some $F \in \map {\CC^\infty} \R$.

## Proof

Let $\map P \xi$ be a polynomial such that:

$\ds \map P \xi = \sum_{k \mathop = 0}^n a_k \xi^k = a_n \prod_{k \mathop = 1}^n \paren {\xi - \lambda_k}$

Then there exists a polynomial $\map Q \xi$ such that:

$\map P \xi = \paren {\xi - \lambda_n} \map Q \xi$

Let:

$\ds D = \sum_{k \mathop = 0}^n a_k \paren {\dfrac \d {\d x}}^k$

Then:

$D = \map P {\dfrac \d {\d x} }$

Furthermore:

$D = \paren {\dfrac \d {\d x} - \lambda_n} D_1$

where:

$D_1 := \map Q {\dfrac \d {\d x} }$

Now we will show that:

$\paren {DT = f, f \in \map {\CC^\infty} \R} \implies \paren {T = T_F, F \in \map {\CC^\infty} \R}$

We proceed by induction on the order $n$ of $D$.

Let $n = 1$.

By part (2), if for some $f \in \map {\CC^\infty} \R$ we have that:

$D_1 T = f$

Then there exists $F \in \map {\CC^\infty} \R$ such that:

$T = T_F$

## Theorem

Let $E_* \in \map {\DD'} \R$ be a distribution.

Let $D$ be an ordinary differential operator with constant coefficients.

Let $\delta$ be the Dirac delta distribution.

Let $E_*$ be the fundamental solution to $D E_* = \delta$

Then the set of all fundamental solutions is $\set {E_* + T : T \in \map {\DD'} \R, \, D T = \mathbf 0}$, that is, $E_*$ plus an arbitrary distributional solution of the homogeneous equation.

## Theorem (Derivative Operator on Continuously Differentiable Function Space with Supremum Norm is not Continuous)

Let $I = \closedint 0 1$ be a closed real interval.

Let $\map \CC I$ be the space of real-valued functions continuous on $I$.

Let $\map {\CC^1} I$ be the space of continuously differentiable real-valued functions on $I$.

Let $x \in \map {\CC^1} I$ be a continuously differentiable real-valued function.

Let $D : \map {\CC^1} I \to \map \CC I$ be the derivative operator such that:

$\forall t \in \closedint 0 1 : \map {Dx} t := \map {x'} t$

Suppose $\map \CC I$ and $\map {\CC^1} I$ are equipped with the supremum norm.

Then $D$ is not continuous.

## Proof

Aiming for a contradiction, suppose $D$ is continuous.

By definition:

$\exists M \in \R_{> 0} : \forall x \in \map {\CC^1} I : \norm {\map D x}_\infty \le M \norm x_\infty$

Let $\map x t = t^n$ with $n \in \N$.

Then:

$\norm {x}_\infty = \norm {t^n}_\infty = 1$
$\norm {x'}_\infty = \norm {n t^{n-1}}_\infty = n$

Hence:

$\ds \norm {Dx}_\infty = \norm {x'}_\infty = n \le M \norm {x}_\infty = M \cdot 1 = M$

In other words:

$\forall n \in \N : n \le M$

But $M$ is finite, and $\N$ is unbounded above, which is a contradiction.

Hence, $D$ is not continuous.

$\blacksquare$

## Remark

On the other hand, equip $\map {\CC^1} I$ with the norm $\norm {\, \cdot \,}_{1, \infty}$ defined by $\norm x_{1, \infty} := \norm x_\infty + \norm {x'}_\infty$.

Then $D$ is continuous, since:

$\norm {Dx}_\infty = \norm {x'}_\infty \le \norm {x}_\infty + \norm {x'}_\infty = \norm {x}_{1, \infty}$

## Example 1

Suppose that:

$J \sqbrk y = \int_1^2 \frac {\sqrt {1+y'^2} } {x} \rd x$

with the following boundary conditions:

$\map y 1 = 0$
$\map y 2 = 1$

Then the smooth minimizer of $J$ is a circle of the following form:

$\paren {y - 2}^2 + x^2 = 5$

### Proof

$J$ is of the form

$J \sqbrk y = \int_a^b \map F {x, y'} \rd x$

Then we can use the "no y theorem":

$F_{y'} = C$

i.e.

$\frac {y'} {x \sqrt {1 + y'^2} } = C$

or

$y' = \frac {C x} {\sqrt {1 - C^2 x^2} }$

Integrating yields:

$y = -\frac {\sqrt {1 - C^2 x^2} } C + C_1$

or

$\paren {y - C_1}^2 + x^2 = C^{-2}$

From the conditions $\map y 1 = 0$, $\map y 2 = 1$ we find that

$C = \frac 1 {\sqrt 5}$
$C_1 = 2$

$\blacksquare$

## Example 3

$J \sqbrk y = \int_a^b \paren {x - y}^2 \rd x$

is minimized by

$\map y x = x$

### Proof

Since $F$ does not depend on $y'$, Euler's equation reduces to:

$F_y = 0$

i.e.

$2 \paren {x - y} = 0$.

$\blacksquare$

## Example p31

Suppose:

$J \sqbrk r = \int_{\phi_0}^{\phi_1} \sqrt{r^2 + r'^2} \rd \phi$

Euler's Equation:

$\displaystyle \frac r {\sqrt{r^2 + r'^2} } - \dfrac \d {\d \phi} \frac {r'} {\sqrt{r^2 + r'^2} } = 0$

Apply change of variables:

$x = r \cos \phi, y = r \sin \phi$

The integral becomes:

$\displaystyle \int_{x_0}^{x_1} \sqrt{1 + y'^2} \rd x$

Euler's equation:

$y'' = 0$

Its solution:

$y = \alpha x + \beta$

or

$r \sin \phi = \alpha r \cos \phi + \beta$

$\blacksquare$

## Example

$J \sqbrk y = \int_{x_0}^{x_1} \map f {x,y} \sqrt {1+y'^2}\rd x$
$F_{y'} = \map f {x,y} \frac {y'} {\sqrt{1 + y'^2} }=\frac {y' F} {1 + y'^2}$
$F + \paren {\phi' - y'}F_{y'} = \frac {\paren{1+y'\phi'}F} {1+y'^2} = 0$
$F + \paren {\psi' - y'}F_{y'} = \frac {\paren{1+y'\psi'}F} {1+y'^2} = 0$

i.e.

$y' = -\frac 1 {\phi'}$
$y' = - \frac 1 {\psi'}$

That is, transversality reduces to orthogonality.

$\blacksquare$

## Example: points on surfaces

$J \sqbrk {y,z} = \int_{x_0}^{x_1} \map F {x,y,z,y',z'} \rd x$

Transversality conditions:

$\sqbrk {F_{y'} + \dfrac {\partial \phi} {\partial y} \paren {F - y'F_{y'} - z'F_{z'} } }|_{x = x_0} = 0$
$\sqbrk {F_{z'} + \dfrac {\partial \phi} {\partial z} \paren {F - y'F_{y'} - z'F_{z'} } }|_{x = x_0} = 0$
$\sqbrk {F_{y'} + \dfrac {\partial \psi} {\partial y} \paren {F - y'F_{y'} - z'F_{z'} } }|_{x = x_1} = 0$
$\sqbrk {F_{z'} + \dfrac {\partial \psi} {\partial z} \paren {F - y'F_{y'} - z'F_{z'} } }|_{x = x_1} = 0$

$\blacksquare$

## Example: Legendre transformation

$\map f \xi = \frac {\xi^a} a, \quad a > 1$
$p = \map {f'} \xi = \xi^{a-1}$

i.e.

$\xi = p^{\frac {1} {a-1} }$
$H = - \frac {\xi^a} {a} + p\xi = - \frac {p^{\frac {a} {a-1} } } a + p \, p^{\frac {1} {a-1} } = p^{\frac {a} {a-1} } \paren{1 - \frac 1 a}$

Hence:

$\map H p = \frac {p^b} b$

where:

$\frac 1 a + \frac 1 b = 1$

$\blacksquare$

## Example

$J \sqbrk y = \int_a^b \paren {Py'^2 + Q y^2} \rd x$
$p = 2 P y', H = P y'^2 - Q y^2$

Hence:

$H = \frac {p^2} {4 P} - Q y^2$

Canonical equations:

$\dfrac {\d p} {\d x} = 2 Q y$
$\dfrac {\d y} {\d x} = \frac p {2 P}$

Euler's Equation:

$2 y Q - \dfrac \d {\d x} \paren {2 P y'} = 0$

$\blacksquare$

## Example: Noether's theorem 1

$J \sqbrk y = \int_{x_0}^{x_1} y'^2 \rd x$

is invariant under the transformation:

$x^* = x + \epsilon, \quad y^* = y$

so that:

$\map {y^*} {x^*} = \map y {x^* - \epsilon}$

Then:

$\ds J \sqbrk {\gamma^*} = \int_{x_0^*}^{x_1^*} \sqbrk {\dfrac {\d \map {y^*} {x^*} } {\d x^*} }^2 \rd x^* = \int_{x_0 + \epsilon}^{x_1 + \epsilon} \sqbrk {\dfrac {\d \map y {x^* - \epsilon} } {\d x^*} }^2 \rd x^* = \int_{x_0}^{x_1} \sqbrk {\dfrac {\d \map y x} {\d x} }^2 \rd x = J \sqbrk \gamma$

$\blacksquare$

## Example: Noether's theorem 2

$J \sqbrk y = \int_{x_0}^{x_1} x y'^2 \rd x$
$\ds J \sqbrk {y^*} = \int_{x_0^*}^{x_1^*} x^* \sqbrk {\dfrac {\d \map {y^*} {x^*} } {\d x^*} }^2 \rd x^* = \int_{x_0 + \epsilon}^{x_1 + \epsilon} x^* \sqbrk {\dfrac {\d \map y {x^* - \epsilon} } {\d x^*} }^2 \rd x^* = \int_{x_0}^{x_1} \paren {x + \epsilon} \sqbrk {\dfrac {\d \map y x} {\d x} }^2 \rd x = J \sqbrk \gamma + \epsilon \int_{x_0}^{x_1} \sqbrk {\dfrac {\d \map y x} {\d x} }^2 \rd x \ne J \sqbrk \gamma$

$\blacksquare$

## Example: Noether's theorem 3

$J \sqbrk y = \int_{x_0}^{x_1} \map F {y, y'} \rd x$

Invariant under $x^* = x + \epsilon, y_i^* = y_i$

That is, $\phi = 1, \psi_i = 0$.

Noether's theorem then reduces to the conservation law $H = \const$.

$\blacksquare$

### Momentum of the system:

$P_x = \sum_{i \mathop = 1}^n p_{ix}, \quad P_y = \sum_{i \mathop = 1}^n p_{iy}, \quad P_z = \sum_{i \mathop = 1}^n p_{iz}$

(Examples: attraction to a fixed point, attraction to a homogeneous distribution on an axis)

## Geodetic distance: Examples

If $J$ is arclength, $S$ is distance.

If $J$ is the time needed to traverse a segment of an optical medium, then $S$ is the time needed to traverse the whole medium.

If $J$ is action, then $S$ is the minimal action.

1) $B \sqbrk {x, y} = \int_{t_0}^{t_1} \map x t \map y t \rd t$

$A \sqbrk x = \int_{t_0}^{t_1} \map {x^2} t \rd t$

2) $B \sqbrk {x, y} = \int_{t_0}^{t_1} \map \alpha t \map x t \map y t \rd t$

$A \sqbrk x = \int_{t_0}^{t_1} \map \alpha t \map {x^2} t \rd t$

3)

$A \sqbrk x = \int_{t_0}^{t_1} \paren {\map \alpha t \map {x^2} t + \map \beta t \map x t \map {x'} t+ \map \gamma t \map {x'^2} t} \rd t$

4)

$B \sqbrk {x, y} = \int_a^b \int_a^b \map K {s, t} \map x s \map y t \rd s \rd t$