Simplest Variational Problem with Subsidiary Conditions
Theorem
Let $J \sqbrk y$ and $K \sqbrk y$ be functionals, such that
- $\displaystyle J \sqbrk y = \int_a^b \map F {x, y, y'} \rd x$
- $\displaystyle K \sqbrk y = \int_a^b \map G {x, y, y'} \rd x = l$
where $l$ is a constant.
Let $y = \map y x$ be an extremum of $J \sqbrk y$ satisfying the boundary conditions:
- $\map y a = A$
- $\map y b = B$
Then, if $y = \map y x$ is not an extremal of $K \sqbrk y$, there exists a constant $\lambda$ such that $y = \map y x$ is an extremal of the functional:
- $\displaystyle \int_a^b \paren {F + \lambda G} \rd x$
or, in other words, $y = \map y x$ satisfies:
- $F_y - \dfrac {\d} {\d x} F_{y'} + \lambda \paren {G_y - \dfrac {\d} {\d x} G_{y'} } = 0$
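As a standard illustration of the theorem (an example not given on this page), consider the classical isoperimetric problem: among all curves $y = \map y x$ of fixed length $l$ joining $\paren {a, A}$ and $\paren {b, B}$, find the one which extremizes the area functional:
- $\displaystyle J \sqbrk y = \int_a^b y \rd x$
subject to:
- $\displaystyle K \sqbrk y = \int_a^b \sqrt {1 + y'^2} \rd x = l$
With $F = y$ and $G = \sqrt {1 + y'^2}$, the equation above becomes:
- $1 - \lambda \dfrac {\d} {\d x} \paren {\dfrac {y'} {\sqrt {1 + y'^2} } } = 0$
Integrating twice gives $\paren {x - c_1}^2 + \paren {y - c_2}^2 = \lambda^2$: the extremals are arcs of circles.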
Proof
Let $J \sqbrk y$ be a functional, for which $y = \map y x$ is an extremal with the boundary conditions $\map y a = A, \, \map y b = B$.
Choose two points $x_1$ and $x_2$ from the interval $\closedint a b$.
Let $\delta_1 \map y x$ and $\delta_2 \map y x$ be functions which are different from zero only in neighbourhoods of $x_1$ and $x_2$ respectively.
Then we can exploit the definition of variational derivative in the following way:
- $\Delta J \sqbrk {y; \, \delta_1 \map y x + \delta_2 \map y x} = \paren {\left. {\dfrac {\delta F} {\delta y} }\right \rvert_{x \mathop = x_1} + \epsilon_1} \Delta \sigma_1 + \paren {\left. {\dfrac {\delta F} {\delta y} }\right \rvert_{x \mathop = x_2} + \epsilon_2} \Delta \sigma_2$
where:
- $\displaystyle \Delta \sigma_1 = \int_a^b \delta_1 \map y x \rd x, \qquad \Delta \sigma_2 = \int_a^b \delta_2 \map y x \rd x$
and $\epsilon_1, \, \epsilon_2 \to 0$ as $\Delta \sigma_1, \, \Delta \sigma_2 \to 0$.
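Here $\dfrac {\delta F} {\delta y}$ denotes the variational derivative of $J \sqbrk y$, which for a functional of the integral form above can be written explicitly as:
- $\dfrac {\delta F} {\delta y} = F_y - \dfrac {\d} {\d x} F_{y'}$
and similarly $\dfrac {\delta G} {\delta y} = G_y - \dfrac {\d} {\d x} G_{y'}$ for $K \sqbrk y$; this is the form in which the conclusion below is stated.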
We now require that the varied curve $y^*=\map y x+\delta_1\map y x+\delta_2\map y x$ satisfies the condition $K\sqbrk{y^*}=K\sqbrk y$.
In this way we restrict the varied curves to those which still satisfy the given subsidiary condition.
Similarly, write down $\Delta K \sqbrk y$ as:
- $\displaystyle\Delta K\sqbrk y= K\sqbrk{y^*}-K\sqbrk y=\paren{\frac{\delta G}{\delta y}\bigg\rvert_{x=x_1}+\epsilon_1'}\Delta\sigma_1+\paren{\frac{\delta G}{\delta y}\bigg\rvert_{x=x_2}+\epsilon_2'}\Delta\sigma_2$
where $\epsilon_1',~\epsilon_2'\to 0$ as $\Delta\sigma_1,~\Delta\sigma_2\to 0$.
Suppose $x_2$ is such that
- $\displaystyle\frac{\delta G} {\delta y} \bigg\rvert_{x=x_2}\ne 0$
Such a point exists: if $\dfrac {\delta G} {\delta y}$ were to vanish at every point of $\closedint a b$, then $\map y x$ would satisfy Euler's equation for $K \sqbrk y$ and hence be an extremal of $K \sqbrk y$, contrary to assumption.
Since $\Delta K=0$, the previous equation can be rewritten as
- $\displaystyle\Delta\sigma_2=-\paren{\frac{\frac{\delta G}{\delta y}\bigg\rvert_{x=x_1} }{\frac{\delta G}{\delta y}\bigg\rvert_{x=x_2} }+\epsilon'}\Delta\sigma_1$
where $\epsilon'\to 0$ as $\Delta\sigma_1\to 0$.
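Indeed, since $\Delta K \sqbrk y = 0$ and $\dfrac {\delta G} {\delta y} \bigg\rvert_{x \mathop = x_2} \ne 0$, solving the expression for $\Delta K \sqbrk y$ for $\Delta \sigma_2$ gives:
- $\displaystyle \Delta \sigma_2 = -\frac {\dfrac {\delta G} {\delta y} \bigg\rvert_{x \mathop = x_1} + \epsilon_1'} {\dfrac {\delta G} {\delta y} \bigg\rvert_{x \mathop = x_2} + \epsilon_2'} \Delta \sigma_1$
which is of the stated form, with $\epsilon'$ absorbing the contributions of $\epsilon_1'$ and $\epsilon_2'$.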
Set:
- $\lambda = -\dfrac {\left. {\frac {\delta F} {\delta y} }\right \rvert_{x \mathop = x_2} } {\left. {\frac {\delta G} {\delta y} }\right \rvert_{x \mathop = x_2} }$
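With this choice of $\lambda$:
- $\dfrac {\delta F} {\delta y} \bigg\rvert_{x \mathop = x_2} = -\lambda \dfrac {\delta G} {\delta y} \bigg\rvert_{x \mathop = x_2}$
which is the relation used in the second line of the following computation.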
Substitute this into the formula for $\Delta J$:
- $\displaystyle \Delta J \sqbrk {y; \, \delta_1 \map y x + \delta_2 \map y x} = \paren {\frac {\delta F} {\delta y} \bigg\rvert_{x \mathop = x_1} + \epsilon_1} \Delta \sigma_1 + \paren {\frac {\delta F} {\delta y} \bigg\rvert_{x \mathop = x_2} + \epsilon_2} \Delta \sigma_2$
- $\displaystyle = \paren {\frac {\delta F} {\delta y} \bigg\rvert_{x \mathop = x_1} + \epsilon_1} \Delta \sigma_1 + \paren {-\lambda \frac {\delta G} {\delta y} \bigg\rvert_{x \mathop = x_2} + \epsilon_2} \paren {-\paren {\frac {\frac {\delta G} {\delta y} \big\rvert_{x \mathop = x_1} } {\frac {\delta G} {\delta y} \big\rvert_{x \mathop = x_2} } + \epsilon'} \Delta \sigma_1}$
- $\displaystyle = \paren {\frac {\delta F} {\delta y} \bigg\rvert_{x \mathop = x_1} + \epsilon_1} \Delta \sigma_1 + \paren {\lambda \frac {\delta G} {\delta y} \bigg\rvert_{x \mathop = x_1} + \lambda \epsilon' \frac {\delta G} {\delta y} \bigg\rvert_{x \mathop = x_2} - \epsilon_2 \frac {\frac {\delta G} {\delta y} \big\rvert_{x \mathop = x_1} } {\frac {\delta G} {\delta y} \big\rvert_{x \mathop = x_2} } - \epsilon_2 \epsilon'} \Delta \sigma_1$
- $\displaystyle = \paren {\frac {\delta F} {\delta y} \bigg\rvert_{x \mathop = x_1} + \lambda \frac {\delta G} {\delta y} \bigg\rvert_{x \mathop = x_1} } \Delta \sigma_1 + \epsilon \Delta \sigma_1$
where $\epsilon\to 0$ as $\Delta\sigma_1\to 0$.
Then the variation of the functional $J\sqbrk y$ at the point $x_1$ is:
- $\displaystyle \delta J = \paren {\frac {\delta F} {\delta y} \bigg\rvert_{x \mathop = x_1} + \lambda \frac {\delta G} {\delta y} \bigg\rvert_{x \mathop = x_1} } \Delta \sigma_1$
A necessary condition for $\delta J$ to vanish for arbitrary $\Delta \sigma_1$ and arbitrary $x_1$ is:
- $\dfrac {\delta F} {\delta y} + \lambda \dfrac {\delta G} {\delta y} = 0$
that is:
- $F_y - \dfrac {\d} {\d x} F_{y'} + \lambda \paren {G_y - \dfrac {\d} {\d x} G_{y'} } = 0$
which is precisely Euler's equation for the functional $\displaystyle \int_a^b \paren {F + \lambda G} \rd x$.
$\blacksquare$
Sources
- 1963: I.M. Gelfand and S.V. Fomin: Calculus of Variations ... (previous) ... (next): $\S 2.12$: Variational Problems with Subsidiary Conditions