Necessary Condition for Integral Functional to have Extremum for given function/Dependent on N Functions

Theorem
Let $ \mathbf y = \left ( { y_1, \ldots, y_n } \right ) $ be an $ n $-dimensional real vector function of the real variable $ x $.

Let $ J \left [ { \mathbf y } \right ] $ be a functional of the form

$ \displaystyle J \left [ { \mathbf y } \right ] = \int_a^b F \left ( { x, \mathbf y, \mathbf y' } \right ) \mathrm d x $

Let:


 * $ \mathbf y \in C^1 \left [ { a \,.\,.\, b } \right ] $

and satisfy boundary conditions:


 * $ \mathbf y \left ( { a } \right ) = \mathbf A $


 * $ \mathbf y \left ( { b } \right ) = \mathbf B $

where $ \mathbf A $, $ \mathbf B $ are real vectors.

Then a necessary condition for $ J \left [ { \mathbf y } \right ] $ to have an extremum (strong or weak) for a given $ \mathbf y $ is that $ \mathbf y $ satisfies Euler's equations:

$ \displaystyle F_{ \mathbf y } - \frac{\mathrm d }{ \mathrm d x } F_{\mathbf y'} = 0 $
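As a concrete check of the statement (a sketch, not part of the theorem), SymPy's `euler_equations` generates the system $ F_{y_i} - \dfrac {\mathrm d} {\mathrm d x} F_{y_i'} = 0 $, one equation per unknown function. The particular integrand $ F $ below is an illustrative assumption:

```python
# Sketch: Euler's equations for a concrete n = 2 example via SymPy.
# The integrand F = y1'^2 + y2'^2 + 2*y1*y2 is an illustrative choice,
# not taken from the theorem above.
import sympy as sp
from sympy.calculus.euler import euler_equations

x = sp.Symbol('x')
y1, y2 = sp.Function('y1'), sp.Function('y2')

F = y1(x).diff(x)**2 + y2(x).diff(x)**2 + 2*y1(x)*y2(x)

# euler_equations returns one equation per unknown function,
# each of the form F_{y_i} - d/dx F_{y_i'} = 0.
eqs = euler_equations(F, [y1(x), y2(x)], x)
for eq in eqs:
    print(eq)
```

For this $ F $ the system reads $ 2 y_2 - 2 y_1'' = 0 $ and $ 2 y_1 - 2 y_2'' = 0 $: one Euler equation for each component, exactly as the theorem asserts.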

Proof
From Condition for Differentiable Functional of N Functions to have Extremum, we have:


 * $ \displaystyle \delta J \left [ { \mathbf y; \mathbf h } \right ] \bigg \rvert_{ \mathbf y = \hat{ \mathbf y } } = 0 $

For this variation to exist, $ J \left [ { \mathbf y } \right ] $ is required to be a differentiable functional.

Note that the endpoints of $ \mathbf y \left ( { x } \right ) $ are fixed, so $ \mathbf h \left ( { x } \right ) $ must not change the values of $ \mathbf y \left ( { x } \right ) $ at those points.

Hence $ \mathbf h \left ( { a } \right ) = \mathbf 0 $ and $ \mathbf h \left ( { b } \right ) = \mathbf 0 $.

We will start from the increment of the functional:


 * $ \displaystyle \Delta J \left [ { \mathbf y; \mathbf h } \right ] = J \left [ { \mathbf y + \mathbf h } \right ] - J \left [ { \mathbf y } \right ] = \int_a^b \left [ { F \left ( { x, \mathbf y + \mathbf h, \mathbf y' + \mathbf h' } \right ) - F \left ( { x, \mathbf y, \mathbf y' } \right ) } \right ] \mathrm d x $

Using multivariate Taylor's theorem, one can expand $ F \left ( { x, \mathbf y + \mathbf h, \mathbf y' + \mathbf h' } \right ) $ with respect to functions $ \mathbf h \left ( { x } \right ) $ and $ \mathbf h' \left ( { x } \right ) $:


 * $ \displaystyle \begin{aligned} F \left ( { x, \mathbf y + \mathbf h, \mathbf y' + \mathbf h' } \right ) = {} & F \left ( { x, \mathbf y, \mathbf y' } \right ) + \sum_{ i = 1 }^n \frac {\partial F \left ( { x, \mathbf y, \mathbf y' } \right ) } {\partial y_i} h_i \\ & + \sum_{ i = 1 }^n \frac {\partial F \left ( { x, \mathbf y, \mathbf y' } \right ) } {\partial y_i'} h_i' + \mathcal O \left ( { h_i h_j, h_i h_j', h_i' h_j' ~ \text{for} ~ i, j = 1, \ldots, n } \right ) \end{aligned} $

We can substitute this expansion back into the integral. The first term of the expansion and the negative term $ -F \left ( { x, \mathbf y, \mathbf y' } \right ) $ in the integrand cancel out:


 * $ \displaystyle \Delta J \left [ { \mathbf y; \mathbf h } \right ] = \int_a^b \left [ { \sum_{ i = 1 }^n \left ( { F_{ y_i } h_i + F_{ y_i' } h_i' } \right ) + \mathcal O \left ( { h_i h_j, h_i h_j', h_i' h_j' ~ \text{for} ~ i, j = 1, \ldots, n } \right ) } \right ] \mathrm d x $

By definition, the integral without the $ \mathcal O \left ( { h_i h_j, h_i h_j', h_i' h_j' } \right ) $ terms is the variation of the functional:


 * $ \displaystyle \delta J \left [ { \mathbf y; \mathbf h } \right ] = \int_a^b \left [ { F_{ \mathbf y } \cdot \mathbf h + F_{ \mathbf y' } \cdot \mathbf h' } \right ] \mathrm d x = \int_a^b \sum_{ i = 1 }^n \left [ { F_{ y_i } h_i + F_{ y_i' } h_i' } \right ] \mathrm d x $
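This first-variation formula can be checked numerically (a sketch, not part of the proof): for a concrete integrand and concrete curves, $ \delta J $ should capture the first-order part of $ \Delta J = J \left [ { \mathbf y + \epsilon \mathbf h } \right ] - J \left [ { \mathbf y } \right ] $ as $ \epsilon \to 0 $. The integrand $ F = y_1'^2 + y_2'^2 + 2 y_1 y_2 $ and the choices of $ y_i $, $ h_i $ below are assumptions for illustration:

```python
# Numerical sketch: compare delta J from the formula against the
# finite increment (J[y + eps*h] - J[y]) / eps for small eps.
# F, y_i and h_i are illustrative choices, not from the proof.
import numpy as np

a, b = 0.0, 1.0
x = np.linspace(a, b, 20001)

def integrate(f):
    # Trapezoidal rule on the uniform grid x.
    return float(np.sum((f[:-1] + f[1:]) * np.diff(x)) / 2.0)

def J(y1, y2, dy1, dy2):
    # J[y] = int_a^b (y1'^2 + y2'^2 + 2*y1*y2) dx
    return integrate(dy1**2 + dy2**2 + 2.0 * y1 * y2)

# A concrete y, and a perturbation h vanishing at both endpoints,
# as required by h(a) = h(b) = 0.
y1, dy1 = np.sin(np.pi * x), np.pi * np.cos(np.pi * x)
y2, dy2 = x**2, 2.0 * x
h1, dh1 = x * (1.0 - x), 1.0 - 2.0 * x
h2, dh2 = np.sin(np.pi * x), np.pi * np.cos(np.pi * x)

# delta J from the formula, with F_{y1} = 2*y2, F_{y2} = 2*y1,
# F_{y1'} = 2*y1', F_{y2'} = 2*y2'.
delta_J = integrate(2.0 * y2 * h1 + 2.0 * y1 * h2
                    + 2.0 * dy1 * dh1 + 2.0 * dy2 * dh2)

eps = 1e-4
delta = J(y1 + eps * h1, y2 + eps * h2,
          dy1 + eps * dh1, dy2 + eps * dh2) - J(y1, y2, dy1, dy2)
print(delta / eps, delta_J)  # agree up to O(eps)
```

The discrepancy between `delta / eps` and `delta_J` comes from the quadratic $ \mathcal O $ terms and shrinks proportionally to `eps`, which is exactly the sense in which the $ \mathcal O $ terms are discarded above.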

Since the functions $ h_i $ are mutually independent and satisfy $ h_i \left ( { a } \right ) = h_i \left ( { b } \right ) = 0 $, the variation vanishes only if each term of the sum vanishes separately:


 * $ \displaystyle \int_a^b \left [ { F_{ y_i } h_i + F_{ y_i' } h_i' } \right ] \mathrm d x = 0 $

By Necessary Condition for Integral Functional to have Extremum for given function, this holds for every admissible $ h_i $ if and only if:


 * $ \displaystyle F_{ y_i } - \frac{ \mathrm d }{ \mathrm d x } F_{ y_i' } = 0 $

for each $ i = 1, \ldots, n $.

Therefore, we obtain a set of $ n $ Euler's Equations satisfied simultaneously, which in vector form read:
 * $ \displaystyle F_{ \mathbf y } - \frac{ \mathrm d }{ \mathrm d x } F_{ \mathbf y' } = 0 $
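
As an illustration of the resulting system (an example, not part of the proof), consider the length of a curve in $ 3 $-dimensional space parametrized as $ \left ( { x, y_1 \left ( x \right ), y_2 \left ( x \right ) } \right ) $:

 * $ \displaystyle J \left [ { \mathbf y } \right ] = \int_a^b \sqrt {1 + y_1'^2 + y_2'^2} \, \mathrm d x $

Here $ F_{ y_i } = 0 $, so Euler's equations reduce to:

 * $ \displaystyle \frac {\mathrm d} {\mathrm d x} \frac {y_i'} {\sqrt {1 + y_1'^2 + y_2'^2} } = 0 \qquad \left ( { i = 1, 2 } \right ) $

Hence each $ y_i' $ is constant, and the extremals are straight lines, as expected.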