Necessary Condition for Integral Functional to have Extremum for given function/Dependent on N Functions


Theorem

Let $\mathbf y$ be an $n$-dimensional real vector.

Let $J\sqbrk {\mathbf y}$ be a functional of the form

$\displaystyle J\sqbrk {\mathbf y}=\int_a^b \map F {x,\mathbf y,\mathbf y'}\rd x$

Let:

$\mathbf y \in C^1 \closedint a b$

and satisfy the boundary conditions:

$\map {\mathbf y} a=\mathbf A$
$\map {\mathbf y} b=\mathbf B$

where $\mathbf A$, $\mathbf B$ are real vectors.


Then a necessary condition for $J\sqbrk {\mathbf y}$ to have an extremum (strong or weak) for a given $\mathbf y$ is that $\mathbf y$ satisfies Euler's equations:

$\displaystyle F_{\mathbf y}-\frac{\d}{\d x}F_{\mathbf y'}=0$
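
In components, this is one equation for each unknown function: $F_{y_i} - \dfrac \d {\d x} F_{y_i'} = 0$ for $i = 1, \ldots, n$. As an informal illustration (not part of the statement), the following Python sketch uses SymPy's euler_equations to generate this system for an arbitrarily chosen integrand with $n = 2$:

from sympy import symbols, Function
from sympy.calculus.euler import euler_equations

x = symbols('x')
y1, y2 = symbols('y1 y2', cls=Function)

# arbitrarily chosen integrand F(x, y, y') = (y1'^2 + y2'^2)/2 - y1*y2
F = (y1(x).diff(x)**2 + y2(x).diff(x)**2) / 2 - y1(x) * y2(x)

# one Euler's equation per component: F_{y_i} - d/dx F_{y_i'} = 0
print(euler_equations(F, [y1(x), y2(x)], x))
# the two equations are equivalent to y1'' + y2 = 0 and y2'' + y1 = 0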


Proof

From Condition for Differentiable Functional of N Functions to have Extremum we have:

$\displaystyle \delta J \sqbrk {\mathbf y; \mathbf h} \bigg \rvert_{\mathbf y \mathop = \hat {\mathbf y} } = 0$

where $\hat {\mathbf y}$ denotes the vector function for which the extremum is attained.

For the variation to exist, $J \sqbrk {\mathbf y}$ has to be a differentiable functional.

Note that the endpoints of $\map {\mathbf y} x$ are fixed, so $\map {\mathbf h} x$ is not allowed to change the values of $\map {\mathbf y} x$ at those points.

Hence $\map {\mathbf h} a=0$ and $\map {\mathbf h} b=0$.

We start from the increment of the functional:

$\displaystyle \begin{aligned}
\Delta J \sqbrk {\mathbf y; \mathbf h} &= J \sqbrk {\mathbf y + \mathbf h} - J \sqbrk {\mathbf y} \\
&= \int_a^b \map F {x, \mathbf y + \mathbf h, \mathbf y' + \mathbf h'} \rd x - \int_a^b \map F {x, \mathbf y, \mathbf y'} \rd x \\
&= \int_a^b \sqbrk {\map F {x, \mathbf y + \mathbf h, \mathbf y' + \mathbf h'} - \map F {x, \mathbf y, \mathbf y'} } \rd x
\end{aligned}$

Using the multivariate Taylor's theorem, one can expand $\map F {x, \mathbf y + \mathbf h, \mathbf y' + \mathbf h'}$ in powers of $\map {\mathbf h} x$ and $\map {\mathbf h'} x$:

$\displaystyle \begin{aligned}
\map F {x, \mathbf y + \mathbf h, \mathbf y' + \mathbf h'} = \map F {x, \mathbf y, \mathbf y'} &+ \sum_{i \mathop = 1}^n \frac {\partial \map F {x, \mathbf y, \mathbf y'} } {\partial y_i} h_i \\
&+ \sum_{i \mathop = 1}^n \frac {\partial \map F {x, \mathbf y, \mathbf y'} } {\partial y_i'} h_i' + \map {\mathcal O} {h_i h_j, h_i h_j', h_i' h_j'}
\end{aligned}$

where $\map {\mathcal O} {h_i h_j, h_i h_j', h_i' h_j'}$ collects all terms of second and higher order in the $h_i$ and $h_i'$, for $1 \le i, j \le n$.
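
As an informal check on this step (an aside, not part of the proof), the following Python sketch treats the values of $y_i$, $y_i'$, $h_i$, $h_i'$ at a fixed $x$ as independent symbols, writes the perturbation as $\epsilon \mathbf h$, and confirms for an arbitrarily chosen integrand that, once the zeroth and first order terms are subtracted, only contributions of order $\epsilon^2$ remain:

from sympy import symbols, diff, expand

# values of y1, y2, y1', y2' and of h1, h2, h1', h2' at a fixed x, as plain symbols
u1, u2, v1, v2 = symbols('u1 u2 v1 v2')
k1, k2, l1, l2 = symbols('k1 k2 l1 l2')
eps = symbols('epsilon')

def F(a1, a2, b1, b2):
    # arbitrarily chosen integrand F(x, y, y') = (y1'^2 + y2'^2)/2 - y1*y2
    return (b1**2 + b2**2) / 2 - a1 * a2

full = F(u1 + eps * k1, u2 + eps * k2, v1 + eps * l1, v2 + eps * l2)
first_order = (F(u1, u2, v1, v2)
               + eps * (diff(F(u1, u2, v1, v2), u1) * k1 + diff(F(u1, u2, v1, v2), u2) * k2
                        + diff(F(u1, u2, v1, v2), v1) * l1 + diff(F(u1, u2, v1, v2), v2) * l2))

print(expand(full - first_order))   # every surviving term carries a factor epsilon**2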

We can substitute this expansion back into the integral. Note that the first term of the expansion cancels with the subtracted $\map F {x, \mathbf y, \mathbf y'}$ in the integrand.

$\displaystyle \Delta J \sqbrk {\mathbf y; \mathbf h} = \int_a^b \sum_{i \mathop = 1}^n \sqbrk {F_{y_i} h_i + F_{y_i'} h_i'} \rd x + \int_a^b \map {\mathcal O} {h_i h_j, h_i h_j', h_i' h_j'} \rd x$



By definition, the part of this integral which does not involve the $\map {\mathcal O} {h_i h_j, h_i h_j', h_i' h_j'}$ terms is the variation of the functional:

$\displaystyle \delta J \sqbrk {\mathbf y; \mathbf h} = \int_a^b \sqbrk {F_{\mathbf y} \cdot \mathbf h + F_{\mathbf y'} \cdot \mathbf h'} \rd x$
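
As a numerical illustration of how the variation relates to the increment (an aside; the integrand, the curve $\mathbf y$ and the perturbation $\mathbf h$ below are arbitrary choices with $\map {\mathbf h} a = \map {\mathbf h} b = \mathbf 0$), the following Python sketch checks that $\Delta J \sqbrk {\mathbf y; \epsilon \mathbf h} - \epsilon \, \delta J \sqbrk {\mathbf y; \mathbf h}$ shrinks like $\epsilon^2$:

import numpy as np

a, b, m = 0.0, 1.0, 2001
x = np.linspace(a, b, m)
dx = x[1] - x[0]

def trapz(f):
    # trapezoidal rule on the uniform grid
    return dx * (f[:-1] + f[1:]).sum() / 2

# chosen curve y = (y1, y2) and perturbation h = (h1, h2) with h(a) = h(b) = 0
y1, y2 = x**2, np.sin(x)
y1p, y2p = 2 * x, np.cos(x)
h1, h2 = x * (1 - x), np.sin(np.pi * x)
h1p, h2p = 1 - 2 * x, np.pi * np.cos(np.pi * x)

def F(u1, u2, v1, v2):
    # arbitrarily chosen integrand F(x, y, y') = (y1'^2 + y2'^2)/2 - y1*y2
    return 0.5 * (v1**2 + v2**2) - u1 * u2

J0 = trapz(F(y1, y2, y1p, y2p))

# delta J = integral of F_y . h + F_y' . h'; here F_y1 = -y2, F_y2 = -y1, F_y1' = y1', F_y2' = y2'
deltaJ = trapz(-y2 * h1 - y1 * h2 + y1p * h1p + y2p * h2p)

for eps in (1e-1, 1e-2, 1e-3):
    increment = trapz(F(y1 + eps * h1, y2 + eps * h2, y1p + eps * h1p, y2p + eps * h2p)) - J0
    print(eps, increment - eps * deltaJ)   # difference decreases roughly like eps**2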

Since the components $h_i$ can be varied independently, the variation vanishes for every admissible $\mathbf h$ if and only if, for each $i$, the term $\displaystyle \int_a^b \sqbrk {F_{y_i} h_i + F_{y_i'} h_i'} \rd x$ vanishes for all $h_i$ with $\map {h_i} a = \map {h_i} b = 0$.

For each $i$, integration by parts together with the fundamental lemma of calculus of variations then yields one Euler's equation, as in the proof for a functional of a single function.

Therefore Euler's equations $F_{y_i} - \dfrac \d {\d x} F_{y_i'} = 0$, for $i = 1, \ldots, n$, are satisfied simultaneously, which in vector form reads:

$\displaystyle F_{\mathbf y}-\frac{\d}{\d x}F_{\mathbf y'}=0$

$\blacksquare$
