Simplest Variational Problem with Subsidiary Conditions

Theorem
Let $J[y]$ and $K[y]$ be functionals such that


 * $\displaystyle J[y]=\int_{a}^{b}F\left(x,~y,~y'\right)\mathrm{d}{x},~K[y]=\int_{a}^{b}G\left(x,~y,~y'\right)\mathrm{d}{x}=l$

where $l$ is a constant.

Let $y=y(x)$ be an extremum of $J[y]$ subject to the constraint $K[y]=l$, and satisfy the boundary conditions


 * $y(a)=A,~y(b)=B$

Then, if $y=y(x)$ is not an extremal of $K[y]$, there exists a constant $\lambda$

such that $y=y(x)$ is an extremal of the functional


 * $\displaystyle\int_{a}^{b}\left(F+\lambda G\right)\mathrm{d}{x}$

or, in other words, $y=y(x)$ satisfies


 * $\displaystyle F_y-\frac{\mathrm{d} }{\mathrm{d}{x} }F_{y'}+\lambda\left(G_y-\frac{\mathrm{d} }{\mathrm{d}{x}}G_{y'}\right)=0$
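
As a concrete check of the theorem, consider an illustrative example (not from the text): minimise $J[y]=\int_0^1 y'^2\,\mathrm{d}x$ subject to $K[y]=\int_0^1 y^2\,\mathrm{d}x=1$ with $y(0)=y(1)=0$, whose constrained extremal is $y=\sqrt 2\sin\pi x$. A sketch with sympy, verifying that a constant $\lambda$ satisfying the combined Euler equation exists:

```python
import sympy as sp

# Assumed example: J[y] = ∫ y'^2 dx under K[y] = ∫ y^2 dx = 1, y(0) = y(1) = 0.
# The theorem predicts a constant lam with
#   F_y - d/dx F_{y'} + lam*(G_y - d/dx G_{y'}) = 0  along the extremal.
x, lam = sp.symbols('x lam')
yv, ypv = sp.symbols('yv ypv')            # stand-ins for y and y'

def euler_lagrange(F, y_curve):
    """F_y - d/dx F_{y'}, evaluated along the curve y = y_curve(x)."""
    on_curve = {yv: y_curve, ypv: sp.diff(y_curve, x)}
    F_y = sp.diff(F, yv).subs(on_curve)
    dFyp_dx = sp.diff(sp.diff(F, ypv).subs(on_curve), x)
    return sp.simplify(F_y - dFyp_dx)

F = ypv**2                                 # integrand of J
G = yv**2                                  # integrand of K
y_star = sp.sqrt(2) * sp.sin(sp.pi * x)    # the constrained extremal

eq = euler_lagrange(F, y_star) + lam * euler_lagrange(G, y_star)
lam_value = sp.solve(sp.Eq(eq, 0), lam)[0]
print(lam_value)                           # -pi**2, a constant as claimed
```

The solver finds a single $x$-independent value of $\lambda$, exactly as the theorem requires.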

Proof
Suppose $y=y(x)$ is an extremum of $J[y]$ subject to $K[y]=l$ and the boundary conditions $y(a)=A,~y(b)=B$.

Choose two points $x_1$ and $x_2$ from the interval $\left[{{a}\,.\,.\,{b}}\right]$.

Let $\delta_1 y(x)$ and $\delta_2 y(x)$ be functions which are nonzero only in a neighbourhood of $x_1$ and of $x_2$ respectively.

Then we can apply the definition of the variational derivative in the following way:


 * $\displaystyle\Delta J\left[y;~\delta_1y(x)+\delta_2y(x)\right]=\left(\frac{\delta F}{\delta{y}}\bigg\rvert_{x=x_1}+\epsilon_1\right)\Delta\sigma_1+\left(\frac{\delta F}{\delta{y}}\bigg\rvert_{x=x_2}+\epsilon_2\right)\Delta\sigma_2$

where


 * $\displaystyle\Delta\sigma_1=\int_{a}^{b}\delta_1y(x)\,\mathrm{d}{x},~\Delta\sigma_2=\int_{a}^{b}\delta_2y(x)\,\mathrm{d}{x}$

and $\epsilon_1,~\epsilon_2\to 0$ as $\Delta\sigma_1,~\Delta\sigma_2\to 0$.
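
This increment formula can be illustrated numerically. A minimal sketch, with an assumed curve $y=\sin\pi x$ and integrand $F=y'^2$ (so $\frac{\delta F}{\delta y}=-2y''$), using a single localised bump $\delta_1 y$ of width $w$ centred at $x_1$:

```python
import numpy as np

# Assumed example (not from the text): F(x, y, y') = y'^2 on [0, 1], y = sin(pi x),
# so deltaF/deltay = F_y - d/dx F_{y'} = -2 y''.  For one localised bump delta_1 y,
#   Delta J ≈ (deltaF/deltay at x_1) * Delta sigma_1,  up to the epsilon term.
x = np.linspace(0.0, 1.0, 200_001)

def integrate(f):
    # trapezoidal rule on the uniform grid
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

yp = np.pi * np.cos(np.pi * x)                       # y'(x)

x1, w, eps = 0.3, 0.1, 1e-4                          # bump centre, width, height
u = (x - x1) / w
bump = np.abs(u) < 0.5
dy  = np.where(bump, eps * np.cos(np.pi * u) ** 2, 0.0)                # delta_1 y
dyp = np.where(bump, -eps * (np.pi / w) * np.sin(2 * np.pi * u), 0.0)  # its derivative

delta_J     = integrate((yp + dyp) ** 2 - yp ** 2)   # Delta J = J[y + delta_1 y] - J[y]
delta_sigma = integrate(dy)                          # Delta sigma_1
predicted   = 2 * np.pi ** 2 * np.sin(np.pi * x1)    # -2 y''(x_1)

print(delta_J / delta_sigma, predicted)              # agree to a few per cent
```

Shrinking `w` and `eps` drives the discrepancy (the $\epsilon_1$ term) to zero, as the formula asserts.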

We now require that the varied curve $y^*=y(x)+\delta_1y(x)+\delta_2y(x)$ satisfies the condition $K[y^*]=K[y]$.

In this way we restrict the varied curves to those which still satisfy the given functional constraint.

Similarly, write down $\Delta K[y]$ as


 * $\displaystyle\Delta K[y]= K[y^*]-K[y]=\left(\frac{\delta G}{\delta{y}}\bigg\rvert_{x=x_1}+\epsilon_1'\right)\Delta\sigma_1+\left(\frac{\delta G}{\delta{y}}\bigg\rvert_{x=x_2}+\epsilon_2'\right)\Delta\sigma_2$

where $\epsilon_1',~\epsilon_2'\to 0$ as $\Delta\sigma_1,~\Delta\sigma_2\to 0$.

Suppose $x_2$ is such that


 * $\displaystyle\frac{\delta G} {\delta y} \bigg\rvert_{x=x_2}\ne 0$

Such a point exists, because by assumption $y(x)$ is not an extremal of $K[y]$.

Since $\Delta K=0$, the previous equation can be rewritten as


 * $\displaystyle\Delta\sigma_2=-\left(\frac{ \frac{\delta G}{\delta y}\bigg\rvert_{x=x_1} }{ \frac{\delta G}{\delta y}\bigg\rvert_{x=x_2} }+\epsilon'\right)\Delta\sigma_1$

where $\epsilon'\to 0$ as $\Delta\sigma_1\to 0$.

Set


 * $\displaystyle\lambda=-\frac{\frac{\delta F}{\delta y}\bigg\rvert_{x=x_2} }{ \frac{\delta G}{\delta y}\bigg\rvert_{x=x_2} }$

Substituting this expression for $\lambda$, together with the formula for $\Delta\sigma_2$, into $\Delta J$ yields


 * $\displaystyle\Delta J=\left(\frac{\delta F}{\delta{y}}\bigg\rvert_{x=x_1}+\lambda\frac{\delta G}{\delta{y}}\bigg\rvert_{x=x_1}\right)\Delta\sigma_1+\epsilon\Delta\sigma_1$

where $\epsilon\to 0$ as $\Delta\sigma_1\to 0$.

Then the variation of the functional $J[y]$, that is, the principal linear part of this increment, is:


 * $\displaystyle\delta J=\left(\frac{\delta F}{\delta{y}}\bigg\rvert_{x=x_1}+\lambda\frac{\delta G}{\delta{y}}\bigg\rvert_{x=x_1}\right)\Delta\sigma_1$

A necessary condition for an extremum is that $\delta J$ vanishes for any $\Delta\sigma_1$ and arbitrary $x_1$, that is:


 * $\displaystyle\frac{\delta F}{\delta y}+\lambda\frac{\delta G}{\delta y}=F_y-\frac{\mathrm{d} }{\mathrm{d}{x} }F_{y'}+\lambda\left(G_y-\frac{\mathrm{d} }{\mathrm{d}{x}}G_{y'}\right)=0$
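
The proof fixes $\lambda$ by a ratio taken at one particular point $x_2$; along a genuine constrained extremal that ratio is in fact the same at every point, which is why a single constant works. A sketch of this with sympy, on an assumed example (minimise $J[y]=\int_0^1 y'^2\,\mathrm{d}x$ under $K[y]=\int_0^1 y^2\,\mathrm{d}x=1$, $y(0)=y(1)=0$, extremal $y=\sqrt 2\sin\pi x$):

```python
import sympy as sp

# Assumed example: F = y'^2, G = y^2, constrained extremal y = sqrt(2) sin(pi x).
# The proof sets  lambda = -(deltaF/deltay)/(deltaG/deltay)  at x = x_2;
# along the extremal this ratio should not depend on the point chosen.
x = sp.symbols('x')
yv, ypv = sp.symbols('yv ypv')           # stand-ins for y and y'

def var_derivative(F, y_curve):
    """deltaF/deltay = F_y - d/dx F_{y'}, along the curve y = y_curve(x)."""
    on_curve = {yv: y_curve, ypv: sp.diff(y_curve, x)}
    return sp.simplify(sp.diff(F, yv).subs(on_curve)
                       - sp.diff(sp.diff(F, ypv).subs(on_curve), x))

y_star = sp.sqrt(2) * sp.sin(sp.pi * x)
ratio = sp.simplify(-var_derivative(ypv**2, y_star) / var_derivative(yv**2, y_star))
print(ratio)   # -pi**2: all dependence on x cancels
```

The $\sin\pi x$ factors cancel, leaving a constant, so the $\lambda$ defined at $x_2$ in the proof is independent of that choice.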