User:Julius

Current focus

 * Build up the bulk of the knowledge of calculus of variations from Gelfand's Calculus of Variations, then recheck it against a couple of other books and gradually improve the proofs.

Theorem (Functions in Vector Space of Real-Valued Functions Continuously Differentiable on Closed Interval vanish at Endpoints)
Let $I := \closedint a b$ be a closed real interval.

Let $\struct {\map {\CC^1} I, +, \, \cdot \,}_\R$ be the continuously differentiable on closed interval real function vector space.

Let $S := \set {x \in \map {\CC^1} I : \map x a = y_a, \map x b = y_b}$.

Then $S$ is a vector subspace of $\struct {\map {\CC^1} I, +, \, \cdot \,}_\R$ iff $y_a = y_b = 0$.

Sufficient Condition
Suppose $y_a = y_b = 0$.

Closure under Vector Addition
Let $x_1, x_2 \in S$.

By sum rule for derivatives, $x_1 + x_2 \in \map {\CC^1} I$.

Evaluate the sum at both endpoints:


 * $\map {\paren {x_1 + x_2} } a = \map {x_1} a + \map {x_2} a = 0 + 0 = 0$


 * $\map {\paren {x_1 + x_2} } b = \map {x_1} b + \map {x_2} b = 0 + 0 = 0$

Hence, if $x_1, x_2 \in S$ then $x_1 + x_2 \in S$.

Closure under Scalar Multiplication
Let $x \in S$ and $\alpha \in \R$.

By derivative of constant multiple, $\alpha \cdot x \in \map {\CC^1} I$.

Evaluation at both endpoints yields:


 * $\map {\paren {\alpha \cdot x} } a = \alpha \map x a = \alpha \cdot 0 = 0$


 * $\map {\paren {\alpha \cdot x} } b = \alpha \map x b = \alpha \cdot 0 = 0$

Hence, if $x \in S$ and $\alpha \in \R$, then $\alpha \cdot x \in S$.

Nonemptiness
Let $\mathbf 0 \in \map {\CC^1} I$ be the zero function:


 * $\forall t \in I : \map {\mathbf 0} t = 0$

Then:


 * $\map {\mathbf 0} a = 0 = y_a$


 * $\map {\mathbf 0} b = 0 = y_b$

Thus $\mathbf 0 \in S$, so $S \ne \O$.

Hence, $S$ is a vector subspace of $\struct {\map {\CC^1} I, +, \, \cdot \,}_\R$.

Necessary Condition
Suppose $S$ is a subspace of $\struct {\map {\CC^1} I, +, \, \cdot \,}_\R$.

Let $x \in S$.

Then $2 \cdot x \in S$ and $\map {\paren {2 \cdot x} } a = y_a$.

However, by Pointwise Scalar Multiplication of Real-Valued Functions we have that:


 * $\map {\paren {2 \cdot x} } a = 2 \map x a = 2 y_a$

Hence, $2 y_a = y_a$, or $y_a = 0$.

Analogously, $y_b = 0$.
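Both directions can be sanity-checked by direct computation. A minimal sketch in Python, assuming the concrete interval $I = \closedint 0 1$ and modelling functions as callables (all function choices below are illustrative):

```python
# Sanity check of the subspace criterion on I = [0, 1] (an assumed
# concrete interval; functions are modelled as Python callables).
a, b = 0.0, 1.0

def in_S(x, y_a, y_b):
    """Check the boundary conditions x(a) = y_a, x(b) = y_b."""
    return x(a) == y_a and x(b) == y_b

# Two C^1 functions vanishing at both endpoints.
x1 = lambda t: t * (1 - t)
x2 = lambda t: t**2 * (1 - t)

# Closure under addition and scalar multiplication when y_a = y_b = 0:
assert in_S(lambda t: x1(t) + x2(t), 0, 0)
assert in_S(lambda t: 3.0 * x1(t), 0, 0)

# For y_a = y_b = 1 (the constant function 1 lies in S), doubling escapes S:
x3 = lambda t: 1.0
assert in_S(x3, 1, 1)
assert not in_S(lambda t: 2.0 * x3(t), 1, 1)
```

The last assertion mirrors the argument that $y_a = y_b = 0$ is forced: with $y_a = y_b = 1$, the function $2 \cdot x_3$ takes the value $2$ at both endpoints, so $S$ is not closed under scalar multiplication.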

Monomials are linearly independent
Let $d \in \N_{\mathop > 0}$.

Consider the set of real monomials of the following form:


 * $\map {x_n} t = t^n$

where $n \in \N_{\mathop > 0}$ and $n \le d$.

Aiming for a contradiction, suppose the set of $x_n$ is not linearly independent.

Then there exist $\alpha_1, \ldots, \alpha_d \in \R$, not all zero, such that:


 * $\ds \forall t \in \closedint 0 1 : \sum_{k \mathop = 1}^d \alpha_k t^k = 0$

Let $m \in \N_{\mathop > 0} : m \le d$ be the smallest index such that $\alpha_m \ne 0$.

Then:


 * $\ds \forall t \in \closedint 0 1 : \sum_{k \mathop = m}^d \alpha_k t^k = 0$

or, dividing by $t^m$:


 * $\ds \forall t \in \hointl 0 1 : \sum_{k \mathop = m}^d \alpha_k t^{k - m} = 0$

Note that:


 * $\ds \forall n \in \N_{\mathop > 0} : \frac 1 n \in \hointl 0 1$

Thus:


 * $\ds \forall n \in \N_{\mathop > 0} : \sum_{k \mathop = m}^d \frac {\alpha_k} {n^{k - m} } = 0$

Passing to the limit $n \to \infty$, every term with $k > m$ vanishes, which gives us $\alpha_m = 0$.

This contradicts the choice of $m$ as the smallest index with $\alpha_m \ne 0$.

Hence, the set of $x_n$ is linearly independent.
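The independence can also be checked by exact linear algebra: evaluating the monomials $t, t^2, \ldots, t^d$ at $d$ distinct points of $\hointl 0 1$ gives a Vandermonde-type matrix of full rank, so only the trivial combination vanishes at all those points. A minimal sketch over exact rationals ($d = 4$ and the evaluation points are arbitrary choices):

```python
from fractions import Fraction

def rank(mat):
    """Rank of a rational matrix by Gaussian elimination (exact arithmetic)."""
    mat = [row[:] for row in mat]
    rows, cols = len(mat), len(mat[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if mat[i][c] != 0), None)
        if pivot is None:
            continue
        mat[r], mat[pivot] = mat[pivot], mat[r]
        for i in range(rows):
            if i != r and mat[i][c] != 0:
                f = mat[i][c] / mat[r][c]
                mat[i] = [x - f * y for x, y in zip(mat[i], mat[r])]
        r += 1
    return r

d = 4
points = [Fraction(1, n) for n in range(1, d + 1)]   # distinct points in (0, 1]
vandermonde = [[t**k for k in range(1, d + 1)] for t in points]

# Full rank: the only alpha with sum alpha_k t^k = 0 at all points is alpha = 0.
assert rank(vandermonde) == d
```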

$\struct {\map \CC I, +, \, \cdot \,}_\R$ is not finite dimensional
Aiming for a contradiction, suppose $\struct {\map \CC I, +, \, \cdot \,}_\R$ is finite dimensional with dimension $d$.

Any independent set of cardinality $d$ in a $d$-dimensional vector space is a basis for this vector space.

Then the set of monomials $x_n$ with $n \in \N_{\mathop > 0}$ and $n \le d$ is a basis for $\struct {\map \CC I, +, \, \cdot \,}_\R$.

The constant function $\map x t = 1$ belongs to $\map \CC I$.

Then:


 * $\ds \exists \beta_1, \ldots, \beta_d \in \R : \forall t \in I : 1 = \sum_{k \mathop = 1}^d \beta_k \map {x_k} t$

Let $t = 0$.

Then $1 = 0$.

This is a contradiction.

Hence, $\struct {\map \CC I, +, \, \cdot \,}_\R$ is not finite dimensional.

Lemmas and theorems for Bernstein's Theorem on Unique Extrema (1978)
Raw material

Example 1
Suppose that:


 * $J \sqbrk y = \int_1^2 \frac {\sqrt {1+y'^2} } {x} \rd x$

with the following boundary conditions:


 * $\map y 1 = 0$


 * $\map y 2 = 1$

Then the smooth minimizer of $J$ is a circle of the following form:


 * $\paren {y - 2}^2 + x^2 = 5$

Proof
$J$ is of the form


 * $J \sqbrk y = \int_a^b \map F {x, y'} \rd x$

Then we can use the "no $y$" theorem:


 * $F_{y'} = C$

i.e.


 * $\frac {y'} {x \sqrt {1 + y'^2} } = C$

or


 * $y' = \frac {C x} {\sqrt {1 - C^2 x^2} }$

The integral is equal to


 * $y = -\frac {\sqrt {1 - C^2 x^2} } C + C_1$

or


 * $\paren {y - C_1}^2 + x^2 = C^{-2}$

From the conditions $\map y 1 = 0$, $\map y 2 = 1$ we find that


 * $C = \frac 1 {\sqrt 5}$


 * $C_1 = 2$
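The extremal can be checked numerically: the lower circular arc $y = 2 - \sqrt {5 - x^2}$ through $\tuple {1, 0}$ and $\tuple {2, 1}$ should give a smaller value of $J$ than a comparison curve through the same points, say the straight line $y = x - 1$. A sketch using composite Simpson quadrature (the step count is an arbitrary choice):

```python
import math

def J(y, yp, a=1.0, b=2.0, n=1000):
    """Composite Simpson approximation of J[y] = int_a^b sqrt(1 + y'^2) / x dx."""
    h = (b - a) / n
    def f(x):
        return math.sqrt(1.0 + yp(x)**2) / x
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3.0

# Lower arc of the circle (y - 2)^2 + x^2 = 5 through (1, 0) and (2, 1).
y_circle = lambda x: 2.0 - math.sqrt(5.0 - x**2)
yp_circle = lambda x: x / math.sqrt(5.0 - x**2)

# Straight line through the same boundary points.
y_line = lambda x: x - 1.0
yp_line = lambda x: 1.0

assert abs(y_circle(1.0) - 0.0) < 1e-12 and abs(y_circle(2.0) - 1.0) < 1e-12
assert J(y_circle, yp_circle) < J(y_line, yp_line)   # the circle wins
```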

Example 3

 * $J \sqbrk y = \int_a^b \paren {x - y}^2 \rd x$

is minimized by


 * $\map y x = x$

Proof
Euler's equation:


 * $F_y = 0$

i.e.


 * $-2 \paren {x - y} = 0$

whence $\map y x = x$.
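Since the integrand $\paren {x - y}^2$ is non-negative and vanishes identically along $y = x$, the extremal is indeed a minimizer with $J = 0$; any other curve gives a strictly positive value. A quick numeric sketch (the interval $\closedint 0 1$ and the comparison curve are arbitrary choices):

```python
def J(y, a=0.0, b=1.0, n=1000):
    """Riemann midpoint approximation of int_a^b (x - y(x))^2 dx."""
    h = (b - a) / n
    return sum((a + (i + 0.5) * h - y(a + (i + 0.5) * h))**2 for i in range(n)) * h

assert J(lambda x: x) == 0.0          # the minimizer gives zero
assert J(lambda x: x + 0.1) > 0.0     # any deviation costs
```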

Example p31
Suppose:


 * $J \sqbrk r = \int_{\phi_0}^{\phi_1} \sqrt{r^2 + r'^2} \rd \phi$

Euler's Equation:


 * $\displaystyle \frac r {\sqrt{r^2 + r'^2} } - \dfrac \d {\d \phi} \frac {r'} {\sqrt{r^2 + r'^2} } = 0$

Apply change of variables:


 * $x = r \cos \phi, y = r \sin \phi$

The integral becomes:


 * $\displaystyle \int_{x_0}^{x_1} \sqrt{1 + y'^2} \rd x$

Euler's equation:


 * $y'' = 0$

Its solution:


 * $y = \alpha x + \beta$

or


 * $r \sin \phi = \alpha r \cos \phi + \beta$

Example

 * $J \sqbrk y = \int_{x_0}^{x_1} \map f {x,y} \sqrt {1+y'^2} \rd x$


 * $F_{y'} = \map f {x,y} \frac {y'} {\sqrt{1 + y'^2} }=\frac {y' F} {1 + y'^2}$


 * $F + \paren {\phi' - y'}F_{y'} = \frac {\paren{1+y'\phi'}F} {1+y'^2} = 0$


 * $F + \paren {\psi' - y'}F_{y'} = \frac {\paren{1+y'\psi'}F} {1+y'^2} = 0$

i.e.


 * $y' = -\frac 1 {\phi'}$


 * $y' = - \frac 1 {\psi'}$

Transversality reduces to orthogonality

Example: points on surfaces

 * $J \sqbrk {y,z} = \int_{x_0}^{x_1} \map F {x,y,z,y',z'} \rd x$

Transversality conditions:


 * $\sqbrk {F_{y'} + \dfrac {\partial \phi} {\partial y} \paren {F - y'F_{y'} - z'F_{z'} } }_{x \mathop = x_0} = 0$


 * $\sqbrk {F_{z'} + \dfrac {\partial \phi} {\partial z} \paren {F - y'F_{y'} - z'F_{z'} } }_{x \mathop = x_0} = 0$


 * $\sqbrk {F_{y'} + \dfrac {\partial \phi} {\partial y} \paren {F - y'F_{y'} - z'F_{z'} } }_{x \mathop = x_1} = 0$


 * $\sqbrk {F_{z'} + \dfrac {\partial \phi} {\partial z} \paren {F - y'F_{y'} - z'F_{z'} } }_{x \mathop = x_1} = 0$

Example: Legendre transformation

 * $\map f \xi = \frac {\xi^a} a, a>1$


 * $\map {f'} \xi = p = \xi^{a-1}$

i.e.


 * $\xi = p^{\frac {1} {a-1} }$


 * $H = - \frac {\xi^a} a + p \xi = - \frac {p^{\frac a {a - 1} } } a + p \cdot p^{\frac 1 {a - 1} } = p^{\frac a {a - 1} } \paren {1 - \frac 1 a}$

Hence:


 * $\map H p = \frac {p^b} b$

where:


 * $\frac 1 a + \frac 1 b = 1$
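The conjugacy $\map H p = p^b / b$ can be verified by computing the Legendre transform numerically as $\sup_\xi \paren {p \xi - \xi^a / a}$. A sketch with a simple grid search (the values $a = 3$, $p = 2$ and the grid are arbitrary choices):

```python
a = 3.0
b = a / (a - 1.0)            # conjugate exponent: 1/a + 1/b = 1
p = 2.0

# Legendre transform H(p) = sup_xi (p*xi - xi^a / a), grid search on [0, 10].
H_numeric = max(p * (i * 1e-4) - (i * 1e-4)**a / a for i in range(100001))

H_closed_form = p**b / b
assert abs(H_numeric - H_closed_form) < 1e-6
```

The maximizing $\xi$ on the grid sits near $p^{1 / \paren {a - 1} } = \sqrt 2$, matching the inversion step above.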

Example

 * $J \sqbrk y = \int_a^b \paren {Py'^2 + Q y^2} \rd x$


 * $p = 2 P y', H = P y'^2 - Q y^2$

Hence:


 * $H = \frac {p^2} {4 P} - Q y^2$

Canonical equations:


 * $\dfrac {\d p} {\d x} = 2 Q y$


 * $\dfrac {\d y} {\d x} = \frac p {2 P}$

Euler's Equation:


 * $2 y Q - \dfrac \d {\d x} \paren {2 P y'} = 0$
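For constant $P = Q = 1$ (an illustrative choice) Euler's equation becomes $y'' = y$, and the canonical system reads $\d y / \d x = p / 2$, $\d p / \d x = 2 y$. Integrating the canonical system from $y(0) = 1$, $p(0) = 2 \map {y'} 0 = 2$ should reproduce $y = e^x$. A sketch with classical RK4:

```python
import math

def rk4_step(f, t, u, h):
    """One classical Runge-Kutta 4 step for u' = f(t, u), u a tuple."""
    k1 = f(t, u)
    k2 = f(t + h / 2, tuple(x + h / 2 * k for x, k in zip(u, k1)))
    k3 = f(t + h / 2, tuple(x + h / 2 * k for x, k in zip(u, k2)))
    k4 = f(t + h, tuple(x + h * k for x, k in zip(u, k3)))
    return tuple(x + h / 6 * (a + 2 * b + 2 * c + d)
                 for x, a, b, c, d in zip(u, k1, k2, k3, k4))

# Canonical system for P = Q = 1: dy/dx = p / (2P), dp/dx = 2Qy.
def canonical(x, u):
    y, p = u
    return (p / 2.0, 2.0 * y)

n, h = 1000, 1e-3
u = (1.0, 2.0)                      # y(0) = 1, p(0) = 2 y'(0) = 2
for i in range(n):
    u = rk4_step(canonical, i * h, u, h)

assert abs(u[0] - math.e) < 1e-8    # y(1) should be e
```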

Example: Noether's theorem 1

 * $J \sqbrk y = \int_{x_0}^{x_1} y'^2 \rd x$

is invariant under the transformation:


 * $x^* = x + \epsilon, \quad y^* = y$

That is, the transformed function satisfies:


 * $\map {y^*} {x^*} = \map y {x^* - \epsilon}$

Then:


 * $J \sqbrk {\gamma^*} = \int_{x_0^*}^{x_1^*} \sqbrk {\dfrac {\d \map {y^*} {x^*} } {\d x^*} }^2 \rd x^* = \int_{x_0 + \epsilon}^{x_1 + \epsilon} \sqbrk {\dfrac {\d \map y {x^* - \epsilon} } {\d x^*} }^2 \rd x^* = \int_{x_0}^{x_1} \sqbrk {\dfrac {\d \map y x} {\d x} }^2 \rd x = J \sqbrk \gamma$
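The chain of equalities can be checked numerically: evaluating the functional for $\map {y^*} {x^*} = \map y {x^* - \epsilon}$ over the shifted interval returns the original value. A sketch with $y = \sin$, $\closedint {x_0} {x_1} = \closedint 0 1$ and $\epsilon = 0.3$ (all illustrative choices):

```python
import math

def J(yp, x0, x1, n=2000):
    """Composite Simpson approximation of int y'(x)^2 dx."""
    h = (x1 - x0) / n
    s = yp(x0)**2 + yp(x1)**2
    for i in range(1, n):
        s += (4 if i % 2 else 2) * yp(x0 + i * h)**2
    return s * h / 3.0

eps = 0.3
yp = math.cos                              # derivative of y = sin
yp_star = lambda x: math.cos(x - eps)      # derivative of y*(x*) = y(x* - eps)

J_orig = J(yp, 0.0, 1.0)
J_shifted = J(yp_star, 0.0 + eps, 1.0 + eps)
assert abs(J_orig - J_shifted) < 1e-9      # invariance under the shift
```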

Example: Noether's theorem 2

 * $J \sqbrk y = \int_{x_0}^{x_1} x y'^2 \rd x$

Example: Noether's theorem 3

 * $J \sqbrk y = \int_{x_0}^{x_1} \map F {y, y'} \rd x$

Invariant under $x^* = x + \epsilon, y_i^* = y_i$

I.e. $\phi = 1, \psi_i = 0$.

Then Noether's theorem reduces to the conservation law $H = \const$.

Momentum of the system:

 * $\ds P_x = \sum_{i \mathop = 1}^n p_{ix}, \quad P_y = \sum_{i \mathop = 1}^n p_{iy}, \quad P_z = \sum_{i \mathop = 1}^n p_{iz}$

(Examples: attraction to a fixed point, attraction to a homogeneous distribution on an axis)

Geodetic distance: Examples
If $J$ is arclength, $S$ is distance.

If $J$ is the time taken to traverse a segment of an optical medium, then $S$ is the time needed to traverse the whole medium.

If $J$ is action, then $S$ is the minimal action.

Examples of quadratic functionals
1) $B \sqbrk {x, y} = \int_{t_0}^{t_1} \map x t \map y t \rd t$

Corresponding quadratic functional

$A \sqbrk x = \int_{t_0}^{t_1} \map {x^2} t \rd t$

2) $B \sqbrk {x, y} = \int_{t_0}^{t_1} \map \alpha t \map x t \map y t \rd t$

Corresponding quadratic functional

$A \sqbrk x = \int_{t_0}^{t_1} \map \alpha t \map {x^2} t \rd t$

3)

$A \sqbrk x = \int_{t_0}^{t_1} \paren {\map \alpha t \map {x^2} t + \map \beta t \map x t \map {x'} t+ \map \gamma t \map {x'^2} t} \rd t$

4)

$B \sqbrk {x, y} = \int_a^b \int_a^b \map K {s, t} \map x s \map y t \rd s \rd t$

Corresponding quadratic functional

$A \sqbrk x = \int_a^b \int_a^b \map K {s, t} \map x s \map x t \rd s \rd t$
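As in the other cases, the quadratic functional associated with the kernel bilinear form is $A \sqbrk x = B \sqbrk {x, x}$, and a symmetric kernel makes $B$ symmetric. A sketch with midpoint double sums ($\map K {s, t} = e^{-\size {s - t} }$ on $\closedint 0 1$ and the test functions are arbitrary choices):

```python
import math

def B(K, x, y, a=0.0, b=1.0, n=200):
    """Midpoint-rule approximation of the double integral of K(s,t) x(s) y(t)."""
    h = (b - a) / n
    pts = [a + (i + 0.5) * h for i in range(n)]
    return sum(K(s, t) * x(s) * y(t) for s in pts for t in pts) * h * h

K = lambda s, t: math.exp(-abs(s - t))     # symmetric kernel
x = lambda t: t
y = lambda t: 1.0 - t

# Symmetry of B for a symmetric kernel, and positivity of A[x] = B[x, x] here:
assert abs(B(K, x, y) - B(K, y, x)) < 1e-12
A = B(K, x, x)
assert A > 0.0
```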