User:Julius

Current focus

 * Build up the bulk of the knowledge on calculus of variations from Gelfand's Calculus of Variations, then recheck it against a couple of other books and slowly improve the proofs.

Theorem
Let $x, a, b \in \R$ with $b \ne 0$.

Then $e^{a x} \map \cos {b x}$ and $e^{a x} \map \sin {b x}$ are linearly independent.

Proof

 * $\map D {f_1} = a f_1 - b f_2$


 * $\map D {f_2} = b f_1 + a f_2$

Let $B = \tuple {f_1, f_2}$.

Then:


 * $\ds \mathbf D \begin{bmatrix} \alpha_1 \\ \alpha_2 \end{bmatrix} = \map D {\alpha_1 f_1 + \alpha_2 f_2} = \alpha_1 \paren {a \map \exp {a x} \map \cos {b x} - b \map \exp {a x} \map \sin {b x} } + \alpha_2 \paren {a \map \exp {a x} \map \sin {b x} + b \map \exp {a x} \map \cos {b x} } = \alpha_1 \paren {a f_1 - b f_2} + \alpha_2 \paren {a f_2 + b f_1} = \begin{bmatrix} a \alpha_1 + b \alpha_2 \\ -b \alpha_1 + a \alpha_2 \end{bmatrix} = \begin{bmatrix} a & b \\ -b & a \end{bmatrix} \begin{bmatrix} \alpha_1 \\ \alpha_2 \end{bmatrix}$

Proof

 * $\map \det {\mathbf D_B} = a^2 + b^2$

Since $a^2 + b^2 \ne 0$, $\mathbf D_B$ is invertible.


 * $\ds \mathbf D_B^{-1} = \frac 1 {a^2 + b^2} \begin{bmatrix} a & -b \\ b & a \end{bmatrix}$
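A quick numerical sanity check (not a proof) of this inverse; $a = 2$, $b = 3$ are arbitrary sample values:

```python
# Check with sample values a = 2, b = 3 that the stated inverse of
# D_B = [[a, b], [-b, a]] satisfies D_B * D_B^{-1} = I.
a, b = 2.0, 3.0
det = a*a + b*b
D    = [[a, b], [-b, a]]
Dinv = [[a/det, -b/det],
        [b/det,  a/det]]

# 2x2 matrix product
prod = [[sum(D[i][k] * Dinv[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]

for i in range(2):
    for j in range(2):
        assert abs(prod[i][j] - (1.0 if i == j else 0.0)) < 1e-12
```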

Proof (Primitive of Exponential of a x by Sine of b x)

 * $\ds \mathbf D_B^{-1} \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \frac 1 {a^2 + b^2} \begin{bmatrix} a & -b \\ b & a \end{bmatrix} \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \frac 1 {a^2 + b^2} \begin{bmatrix} -b \\ a \end{bmatrix}$

Hence:


 * $\ds \map {D^{-1}} {f_2} = \map {D^{-1}} {\map \exp {a x} \map \sin {b x} } = \frac {a f_2 - b f_1} {a^2 + b^2} = \frac {a \map \exp {a x} \map \sin {b x} - b \map \exp {a x} \map \cos {b x} } {a^2 + b^2}$

By definition of $D$:


 * $\ds \dfrac \d {\d x} \paren {\frac {a \map \exp {a x} \map \sin {b x} - b \map \exp {a x} \map \cos {b x} } {a^2 + b^2} } = \map \exp {a x} \map \sin {b x}$

Hence:


 * $\ds \int \map \exp {a x} \map \sin {b x} \rd x = \frac {a \map \exp {a x} \map \sin {b x} - b \map \exp {a x} \map \cos {b x} } {a^2 + b^2} + C$

Proof (Primitive of Exponential of a x by Cosine of b x)

 * $\ds \mathbf D_B^{-1} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \frac 1 {a^2 + b^2} \begin{bmatrix} a & -b \\ b & a \end{bmatrix} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \frac 1 {a^2 + b^2} \begin{bmatrix} a \\ b \end{bmatrix}$

Hence:


 * $\ds \map {D^{-1}} {f_1} = \map {D^{-1}} {\map \exp {a x} \map \cos {b x}} = \frac {a f_1 + b f_2} {a^2 + b^2} = \frac {a \map \exp {a x} \map \cos {b x} + b \map \exp {a x} \map \sin {b x}} {a^2 + b^2}$

By definition of $D$:


 * $\ds \dfrac \d {\d x} \paren {\frac {a \map \exp {a x} \map \cos {b x} + b \map \exp {a x} \map \sin {b x}} {a^2 + b^2}} = \map \exp {a x} \map \cos {b x}$

Hence:


 * $\ds \int \map \exp {a x} \map \cos {b x} \rd x = \frac {a \map \exp {a x} \map \cos {b x} + b \map \exp {a x} \map \sin {b x}} {a^2 + b^2} + C$
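The two primitives can be spot-checked numerically by differentiating each claimed antiderivative with a central difference; $a = 1.3$, $b = 0.7$ are arbitrary sample values:

```python
import math

# Numerical check (not a proof) of the primitives of exp(a x) sin(b x)
# and exp(a x) cos(b x); a, b are arbitrary sample values.
a, b = 1.3, 0.7

def F_sin(x):
    # claimed primitive of exp(a x) sin(b x)
    return (a * math.exp(a*x) * math.sin(b*x)
            - b * math.exp(a*x) * math.cos(b*x)) / (a*a + b*b)

def F_cos(x):
    # claimed primitive of exp(a x) cos(b x)
    return (a * math.exp(a*x) * math.cos(b*x)
            + b * math.exp(a*x) * math.sin(b*x)) / (a*a + b*b)

def deriv(f, x, h=1e-6):
    # central difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2*h)

for x in (0.0, 0.5, 1.0):
    assert abs(deriv(F_sin, x) - math.exp(a*x) * math.sin(b*x)) < 1e-6
    assert abs(deriv(F_cos, x) - math.exp(a*x) * math.cos(b*x)) < 1e-6
```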

Proof
Let $a, b, x \in \R$.

Suppose $a \ne 0 \ne b$.

Denote $\ds f_1 = \map \exp {a x} \map \cos {b x}$, $f_2 = \map \exp {a x} \map \sin {b x}$.

Let $\map \CC \R$ be the space of continuous real-valued functions.

Let $\struct {\map {\CC^1} \R, +, \, \cdot \,}_\R$ be the vector space of continuously differentiable real-valued functions.

Let $S = \span \set {f_1, f_2} \subset \map {\CC^1} \R$ be a vector space.

Let $D : S \to S$ be differentiation with respect to $x$.

From Differentiation of Exponential of a x by Cosine of b x and Exponential of a x by Sine of b x wrt x as Invertible Matrix, $D$ is expressible as:


 * $\mathbf D = \begin{bmatrix} a & b \\ -b & a \end{bmatrix}$

and is invertible.

By Inverse of Matrix is Scalar Product of Adjugate by Reciprocal of Determinant:


 * $\ds \mathbf D^{-1} = \frac 1 {a^2 + b^2} \begin{bmatrix} a & -b \\ b & a \end{bmatrix}$

Then:


 * $\ds \mathbf D^{-1} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \frac 1 {a^2 + b^2} \begin{bmatrix} a & -b \\ b & a \end{bmatrix} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \frac 1 {a^2 + b^2} \begin{bmatrix} a \\ b \end{bmatrix}$

Applying $\mathbf D$ to both sides on the left and writing the result out explicitly in terms of $f_1$ and $f_2$ yields:


 * $f_1 = \ds \dfrac \d {\d x} \frac {a f_1 + b f_2} {a^2 + b^2}$

Integrating with respect to $x$:


 * $\ds \int f_1 \rd x = \frac {a f_1 + b f_2} {a^2 + b^2} + C$

where $C$ is an arbitrary constant.

Substitute definitions of $f_1$ and $f_2$ to get the desired result.

Example 1
Suppose that:


 * $J \sqbrk y = \int_1^2 \frac {\sqrt {1+y'^2} } {x} \rd x$

with the following boundary conditions:


 * $\map y 1 = 0$


 * $\map y 2 = 1$

Then the smooth minimizer of $J$ is an arc of the circle:


 * $\paren {y - 2}^2 + x^2 = 5$

Proof
$J$ is of the form


 * $J \sqbrk y = \int_a^b \map F {x, y'} \rd x$

Since $F$ does not contain $y$, we can use the "no $y$" case of Euler's equation:


 * $F_{y'} = C$

i.e.


 * $\frac {y'} {x \sqrt {1 + y'^2} } = C$

or


 * $y' = \frac {C x} {\sqrt {1 - C^2 x^2} }$

Integration yields


 * $y = -\frac {\sqrt {1 - C^2 x^2} } C + C_1$

or


 * $\paren {y - C_1}^2 + x^2 = C^{-2}$

From the conditions $\map y 1 = 0$, $\map y 2 = 1$ we find that


 * $C = \frac 1 {\sqrt 5}$


 * $C_1 = 2$
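With these constants the lower arc of the circle is $\map y x = 2 - \sqrt {5 - x^2}$; a numerical sketch (not a proof) checking the boundary conditions and the constancy of the first integral:

```python
import math

# Check that y(x) = 2 - sqrt(5 - x^2), the lower arc of (y - 2)^2 + x^2 = 5,
# meets y(1) = 0, y(2) = 1 and keeps y' / (x sqrt(1 + y'^2)) = 1/sqrt(5).
def y(x):
    return 2 - math.sqrt(5 - x*x)

def dy(x, h=1e-6):
    # central difference approximation of y'(x)
    return (y(x + h) - y(x - h)) / (2*h)

assert abs(y(1.0) - 0.0) < 1e-12    # boundary condition y(1) = 0
assert abs(y(2.0) - 1.0) < 1e-12    # boundary condition y(2) = 1

C = 1 / math.sqrt(5)
for x in (1.1, 1.5, 1.9):
    yp = dy(x)
    assert abs(yp / (x * math.sqrt(1 + yp*yp)) - C) < 1e-6
```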

Example 3

 * $J \sqbrk y = \int_a^b \paren {x - y}^2 \rd x$

is minimized by


 * $\map y x = x$

Proof
Euler's equation:


 * $F_y = 0$

i.e.


 * $-2 \paren {x - y} = 0$.

Example p31
Suppose:


 * $J \sqbrk r = \int_{\phi_0}^{\phi_1} \sqrt{r^2 + r'^2} \rd \phi$

Euler's Equation:


 * $\ds \frac r {\sqrt {r^2 + r'^2} } - \dfrac \d {\d \phi} \frac {r'} {\sqrt {r^2 + r'^2} } = 0$

Apply change of variables:


 * $x = r \cos \phi, y = r \sin \phi$

The integral becomes:


 * $\displaystyle \int_{x_0}^{x_1} \sqrt{1 + y'^2} \rd x$

Euler's equation:


 * $y'' = 0$

Its solution:


 * $y = \alpha x + \beta$

or


 * $r \sin \phi = \alpha r \cos \phi + \beta$
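A numerical sketch (not a proof) that a straight line, written in polar form, does satisfy the polar Euler equation above; $\alpha = 0.5$, $\beta = 1$ are arbitrary sample values:

```python
import math

# A line y = alpha x + beta has polar form r(phi) = beta / (sin phi - alpha cos phi).
# Check that r / sqrt(r^2 + r'^2) - d/dphi [ r' / sqrt(r^2 + r'^2) ] = 0.
alpha, beta = 0.5, 1.0

def g(phi):
    return math.sin(phi) - alpha * math.cos(phi)

def r(phi):
    return beta / g(phi)

def rp(phi):
    # exact derivative: r'(phi) = -beta g'(phi) / g(phi)^2
    return -beta * (math.cos(phi) + alpha * math.sin(phi)) / g(phi)**2

def p(phi):
    # the term that gets differentiated in the Euler equation
    return rp(phi) / math.sqrt(r(phi)**2 + rp(phi)**2)

def d(f, t, h=1e-6):
    # central difference approximation of f'(t)
    return (f(t + h) - f(t - h)) / (2*h)

euler = 0.0
for phi in (1.0, 1.3, 1.6):
    euler = r(phi) / math.sqrt(r(phi)**2 + rp(phi)**2) - d(p, phi)
    assert abs(euler) < 1e-8
```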

Example

 * $J \sqbrk y = \int_{x_0}^{x_1} \map f {x, y} \sqrt {1 + y'^2} \rd x$


 * $F_{y'} = \map f {x,y} \frac {y'} {\sqrt{1 + y'^2} }=\frac {y' F} {1 + y'^2}$


 * $F + \paren {\phi' - y'}F_{y'} = \frac {\paren{1+y'\phi'}F} {1+y'^2} = 0$


 * $F + \paren {\psi' - y'}F_{y'} = \frac {\paren{1+y'\psi'}F} {1+y'^2} = 0$

i.e.


 * $y' = -\frac 1 {\phi'}$


 * $y' = - \frac 1 {\psi'}$
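The algebraic identity behind the two transversality conditions can be spot-checked with sample numbers; below `yp` stands for $y'$, `pp` for $\phi'$, and $f = 2$, $y' = 0.7$ are arbitrary choices:

```python
import math

# Spot-check (not a proof) of the identity
#   F + (pp - yp) F_{y'} = (1 + yp pp) F / (1 + yp^2)
# for F = f sqrt(1 + yp^2) and F_{y'} = f yp / sqrt(1 + yp^2).
f, yp = 2.0, 0.7
F   = f * math.sqrt(1 + yp*yp)
Fyp = f * yp / math.sqrt(1 + yp*yp)

lhs = rhs = 0.0
for pp in (0.3, -1/yp, 2.5):
    lhs = F + (pp - yp) * Fyp
    rhs = (1 + yp*pp) * F / (1 + yp*yp)
    assert abs(lhs - rhs) < 1e-12

# at phi' = -1/y' the condition F + (phi' - y') F_{y'} = 0 holds: orthogonality
assert abs(F + (-1/yp - yp) * Fyp) < 1e-12
```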

Transversality reduces to orthogonality

Example: points on surfaces

 * $J \sqbrk {y,z} = \int_{x_0}^{x_1} \map F {x,y,z,y',z'} \rd x$

Transversality conditions:


 * $\sqbrk {F_{y'} + \dfrac {\partial \phi} {\partial y} \paren {F - y' F_{y'} - z' F_{z'} } }|_{x = x_0} = 0$


 * $\sqbrk {F_{z'} + \dfrac {\partial \phi} {\partial z} \paren {F - y' F_{y'} - z' F_{z'} } }|_{x = x_0} = 0$


 * $\sqbrk {F_{y'} + \dfrac {\partial \psi} {\partial y} \paren {F - y' F_{y'} - z' F_{z'} } }|_{x = x_1} = 0$


 * $\sqbrk {F_{z'} + \dfrac {\partial \psi} {\partial z} \paren {F - y' F_{y'} - z' F_{z'} } }|_{x = x_1} = 0$

Example: Legendre transformation

 * $\map f \xi = \frac {\xi^a} a, a>1$


 * $\map {f'} \xi = p = \xi^{a-1}$

i.e.


 * $\xi = p^{\frac {1} {a-1} }$


 * $H = -\frac {\xi^a} a + p \xi = -\frac {p^{\frac a {a - 1} } } a + p \, p^{\frac 1 {a - 1} } = p^{\frac a {a - 1} } \paren {1 - \frac 1 a}$

Hence:


 * $\map H p = \frac {p^b} b$

where:


 * $\frac 1 a + \frac 1 b = 1$
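A numerical sketch (not a proof) of this Legendre transform; $a = 3$ is an arbitrary sample value, giving $b = 3/2$:

```python
# Check that f(xi) = xi^a / a transforms to H(p) = p^b / b, 1/a + 1/b = 1.
a = 3.0
b = a / (a - 1)                      # conjugate exponent: 1/a + 1/b = 1
assert abs(1/a + 1/b - 1) < 1e-12

H = p = 0.0
for p in (0.5, 1.0, 2.0):
    xi = p**(1 / (a - 1))            # invert p = f'(xi) = xi^(a-1)
    H = -xi**a / a + p * xi          # H = -f(xi) + p xi
    assert abs(H - p**b / b) < 1e-12
```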

Example

 * $J \sqbrk y = \int_a^b \paren {Py'^2 + Q y^2} \rd x$


 * $p = 2 P y', H = P y'^2 - Q y^2$

Hence:


 * $H = \frac {p^2} {4 P} - Q y^2$

Canonical equations:


 * $\dfrac {\d p} {\d x} = 2 Q y$


 * $\dfrac {\d y} {\d x} = \frac p {2 P}$

Euler's Equation:


 * $2 y Q - \dfrac \d {\d x} \paren {2 P y'} = 0$
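For constant coefficients the canonical equations can be checked against an explicit solution; $P = Q = 1$ and the solution $y = e^x$ of the resulting equation $y'' = y$ are sample choices:

```python
import math

# For P = Q = 1, Euler's equation 2Qy - d/dx(2Py') = 0 becomes y'' = y,
# solved by y = exp(x); check dp/dx = 2Qy and dy/dx = p/(2P) for p = 2Py'.
P, Q = 1.0, 1.0

def y(x):
    return math.exp(x)

def p(x):
    return 2 * P * math.exp(x)       # p = 2 P y', with y' = exp(x)

def d(f, x, h=1e-6):
    # central difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2*h)

for x in (0.0, 0.5, 1.0):
    assert abs(d(p, x) - 2*Q*y(x)) < 1e-6
    assert abs(d(y, x) - p(x) / (2*P)) < 1e-6
```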

Example: Noether's theorem 1

 * $J \sqbrk y = \int_{x_0}^{x_1} y'^2 \rd x$

is invariant under the transformation:


 * $x^* = x + \epsilon, y^* = y$


 * $\map {y^*} {x^*} = \map y {x^* - \epsilon}$

Then:


 * $\ds J \sqbrk {\gamma^*} = \int_{x_0^*}^{x_1^*} \sqbrk {\dfrac {\d \map {y^*} {x^*} } {\d x^*} }^2 \rd x^* = \int_{x_0 + \epsilon}^{x_1 + \epsilon} \sqbrk {\dfrac {\d \map y {x^* - \epsilon} } {\d x^*} }^2 \rd x^* = \int_{x_0}^{x_1} \sqbrk {\dfrac {\d \map y x} {\d x} }^2 \rd x = J \sqbrk \gamma$
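The invariance can also be checked by quadrature for a concrete curve; $y = \sin x$ on $\closedint 0 1$ and $\epsilon = 0.3$ are arbitrary sample choices:

```python
import math

# Check J[gamma*] = J[gamma] for y(x) = sin(x) on [0, 1], shift eps = 0.3.
eps = 0.3

def simpson(f, lo, hi, n=1000):
    # composite Simpson rule (n even)
    h = (hi - lo) / n
    s = f(lo) + f(hi)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(lo + i*h)
    return s * h / 3

# y'(x)^2 = cos(x)^2 for y = sin
J  = simpson(lambda x: math.cos(x)**2, 0.0, 1.0)
# y*(x*) = y(x* - eps), so dy*/dx* = cos(x* - eps)
Js = simpson(lambda t: math.cos(t - eps)**2, eps, 1.0 + eps)
assert abs(J - Js) < 1e-10
```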

Example: Noether's theorem 2

 * $J \sqbrk y = \int_{x_0}^{x_1} x y'^2 \rd x$

Example: Noether's theorem 3

 * $J \sqbrk y = \int_{x_0}^{x_1} \map F {y, y'} \rd x$

Invariant under $x^* = x + \epsilon, y_i^* = y_i$

I.e. $\phi = 1$, $\psi_i = 0$, and the conservation law reduces to $H = \const$.

Momentum of the system:

 * $P_x = \sum_{i = 1}^n p_{i x}, P_y = \sum_{i = 1}^n p_{i y}, P_z = \sum_{i = 1}^n p_{i z}$

(Examples: attraction to a fixed point, attraction to a homogeneous distribution on an axis)

Geodetic distance: Examples
If $J$ is arclength, $S$ is distance.

If $J$ is the time to traverse a segment of an optical medium, then $S$ is the time needed to traverse the whole medium.

If $J$ is action, then $S$ is the minimal action.

Examples of quadratic functionals
1) $B \sqbrk {x, y} = \int_{t_0}^{t_1} \map x t \map y t \rd t$

Corresponding quadratic functional

$A \sqbrk x = \int_{t_0}^{t_1} \map {x^2} t \rd t$

2) $B \sqbrk {x, y} = \int_{t_0}^{t_1} \map \alpha t \map x t \map y t \rd t$

Corresponding quadratic functional

$A \sqbrk x = \int_{t_0}^{t_1} \map \alpha t \map {x^2} t \rd t$

3)

$A \sqbrk x = \int_{t_0}^{t_1} \paren {\map \alpha t \map {x^2} t + \map \beta t \map x t \map {x'} t+ \map \gamma t \map {x'^2} t} \rd t$

4)

$B \sqbrk {x, y} = \int_a^b \int_a^b \map K {s, t} \map x s \map y t \rd s \rd t$