Jacobi's Theorem

Theorem

Let $\mathbf y = \sequence {y_i}_{1 \le i \le n}$, $\boldsymbol \alpha = \sequence {\alpha_i}_{1 \le i \le n}$ and $\boldsymbol \beta = \sequence {\beta_i}_{1 \le i \le n}$ be vectors, where $\alpha_i$ and $\beta_i$ are parameters.

Let $S = \map S {x, \mathbf y, \boldsymbol \alpha}$ be a complete solution of the Hamilton-Jacobi equation:

$\dfrac {\partial S} {\partial x} + \map H {x, \mathbf y, \dfrac {\partial S} {\partial \mathbf y} } = 0$

where $H$ is the Hamiltonian and $\dfrac {\partial S} {\partial \mathbf y} = \sequence {\dfrac {\partial S} {\partial y_i} }_{1 \le i \le n}$.

Let:

$\begin {vmatrix} \dfrac {\partial^2 S} {\partial \alpha_i \partial y_k} \end{vmatrix} \ne 0$

Let:

$\dfrac {\partial S} {\partial \alpha_i} = \beta_i$

Then:

$p_i = \map {\dfrac {\partial S} {\partial y_i} } {x, \mathbf y, \boldsymbol \alpha}$
$y_i = \map {y_i} {x, \boldsymbol \alpha, \boldsymbol \beta}$

constitute a general solution of the canonical Euler's equations.
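
As a minimal illustration (not part of the theorem statement), take $n = 1$ and the free-particle Hamiltonian $H = \dfrac {p^2} 2$, for which the Hamilton-Jacobi equation reads:

$\dfrac {\partial S} {\partial x} + \dfrac 1 2 \paren {\dfrac {\partial S} {\partial y} }^2 = 0$

A complete solution is $\map S {x, y, \alpha} = \alpha y - \dfrac {\alpha^2 x} 2$, with $\dfrac {\partial^2 S} {\partial \alpha \partial y} = 1 \ne 0$.

Setting $\dfrac {\partial S} {\partial \alpha} = y - \alpha x = \beta$ and $p = \dfrac {\partial S} {\partial y}$ yields:

$y = \alpha x + \beta, \quad p = \alpha$

which indeed satisfy $\dfrac {\d y} {\d x} = p = \dfrac {\partial H} {\partial p}$ and $\dfrac {\d p} {\d x} = 0 = -\dfrac {\partial H} {\partial y}$.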


Proof 1

Consider the total derivative of $\displaystyle\frac{\partial S}{\partial\alpha_i}$ with respect to $x$:

\(\displaystyle \frac \d {\d x}\frac{\partial S}{\partial\alpha_i}\) \(=\) \(\displaystyle \frac{\partial^2 S}{\partial x\partial\alpha_i}+\sum_{j=1}^n\frac{\partial^2 S}{\partial y_j\partial\alpha_i}\frac{\d y_j}{\d x}+\sum_{j=1}^n\frac{\partial^2 S}{\partial\alpha_j\partial\alpha_i}\frac{\d\alpha_j}{\d x}\)
\(\displaystyle \) \(=\) \(\displaystyle \frac{\partial^2 S}{\partial x\partial\alpha_i}+\sum_{j=1}^n\frac{\partial^2 S}{\partial y_j\partial\alpha_i}\frac{\d y_j}{\d x}\) $\alpha_j$ is a parameter, independent of $x$
\(\displaystyle \) \(=\) \(\displaystyle -\frac{\partial H}{\partial\alpha_i}+\sum_{j=1}^n\frac{\partial^2 S}{\partial y_j\partial\alpha_i}\frac{\d y_j}{\d x}\) as $S$ satisfies the Hamilton-Jacobi equation
\(\displaystyle \) \(=\) \(\displaystyle -\sum_{j=1}^n\frac{\partial H}{\partial p_j}\frac{\partial p_j}{\partial\alpha_i}+\sum_{j=1}^n\frac{\partial^2 S}{\partial y_j\partial\alpha_i}\frac{\d y_j}{\d x}\)
\(\displaystyle \) \(=\) \(\displaystyle -\sum_{j=1}^n\frac{\partial H}{\partial p_j}\frac{\partial^2 S}{\partial\alpha_i\partial y_j}+\sum_{j=1}^n\frac{\partial^2 S}{\partial y_j\partial\alpha_i}\frac{\d y_j}{\d x}\) By Derivation of Hamilton-Jacobi Equation
\(\displaystyle \) \(=\) \(\displaystyle \sum_{j=1}^n\frac{\partial^2 S}{\partial y_j\partial\alpha_i}\paren{\frac{\d y_j}{\d x}-\frac{\partial H}{\partial p_j} }\)
\(\displaystyle \) \(=\) \(\displaystyle 0\) as $\dfrac {\partial S} {\partial \alpha_i} = \beta_i$ and $\displaystyle \frac{\d\beta_i}{\d x}=0$
\(\displaystyle \) \(\rightsquigarrow\) \(\displaystyle \frac{\d y_j}{\d x}=\frac{\partial H}{\partial p_j}\)
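
Explicitly, the last two lines assert, for each $i$, the homogeneous linear system:

$\displaystyle \sum_{j = 1}^n \frac {\partial^2 S} {\partial y_j \partial \alpha_i} \paren {\frac {\d y_j} {\d x} - \frac {\partial H} {\partial p_j} } = 0$

Since $\begin {vmatrix} \dfrac {\partial^2 S} {\partial \alpha_i \partial y_k} \end {vmatrix} \ne 0$ by hypothesis, the coefficient matrix is invertible, so each bracketed factor vanishes.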

Next, consider the total derivative of $p_i$ with respect to $x$:

\(\displaystyle \frac{\d p_i}{\d x}\) \(=\) \(\displaystyle \frac \d {\d x}\frac{\partial S}{\partial y_i}\)
\(\displaystyle \) \(=\) \(\displaystyle \frac{\partial^2 S}{\partial x\partial y_i}+\sum_{j=1}^n\frac{\partial^2 S}{\partial y_j\partial y_i}\frac{\d y_j}{\d x}+\sum_{j=1}^n\frac{\partial^2 S}{\partial\alpha_j\partial y_i} \frac{\d\alpha_j}{\d x}\)
\(\displaystyle \) \(=\) \(\displaystyle \frac{\partial^2 S}{\partial x\partial y_i}+\sum_{j=1}^n\frac{\partial^2 S}{\partial y_j\partial y_i}\frac{\d y_j}{\d x}\) $\alpha_j$ is a parameter, independent of $x$
\(\displaystyle \) \(=\) \(\displaystyle \frac{\partial^2 S}{\partial x\partial y_i}+\sum_{j=1}^n\frac{\partial^2 S}{\partial y_j\partial y_i}\frac{\partial H}{\partial p_j}\) $\displaystyle\frac{\d y_j}{\d x}=\frac{\partial H}{\partial p_j}$

On the other hand, taking the partial derivative of the Hamilton-Jacobi equation with respect to $y_i$ yields:

\(\displaystyle \frac{\partial^2 S}{\partial x\partial y_i}\) \(=\) \(\displaystyle -\frac{\partial H}{\partial y_i}-\sum_{j=1}^n\frac{\partial H}{\partial p_j}\frac{\partial p_j}{\partial y_i}\)
\(\displaystyle \) \(=\) \(\displaystyle -\frac{\partial H}{\partial y_i}-\sum_{j=1}^n\frac{\partial H}{\partial p_j}\frac{\partial^2 S}{\partial y_i\partial y_j}\) By Derivation of Hamilton-Jacobi Equation
\(\displaystyle \) \(\rightsquigarrow\) \(\displaystyle -\frac{\partial H}{\partial y_i}=\frac{\partial^2 S}{\partial x\partial y_i}+\sum_{j=1}^n\frac{\partial^2 S}{\partial y_i\partial y_j}\frac{\partial H}{\partial p_j}\)

Comparing this with the previous expression for $\dfrac {\d p_i} {\d x}$:

$\displaystyle\frac{\d p_i}{\d x}=-\frac{\partial H}{\partial y_i}$

$\blacksquare$

Proof 2

Consider the canonical Euler's equations:

$\displaystyle\frac{\d y_i}{\d x}=\frac{\partial H}{\partial p_i},\quad\frac{\d p_i}{\d x}=-\frac{\partial H}{\partial y_i}$

Apply a canonical transformation $\paren {x, \mathbf y, \mathbf p, H} \to \paren {x, \boldsymbol \alpha, \boldsymbol \beta, H^*}$ whose generating function is $\Phi = S$.

By Conditions for Transformation to be Canonical:

$\displaystyle p_i=\frac{\partial S}{\partial y_i},\quad\beta_i=\frac{\partial S}{\partial\alpha_i},\quad H^*=H+\frac{\partial S}{\partial x}$

Since $S$ satisfies the Hamilton-Jacobi equation, $H^* = 0$.
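
In the free-particle illustration above, $\map S {x, y, \alpha} = \alpha y - \dfrac {\alpha^2 x} 2$ gives:

$p = \dfrac {\partial S} {\partial y} = \alpha, \quad \beta = \dfrac {\partial S} {\partial \alpha} = y - \alpha x, \quad H^* = \dfrac {\alpha^2} 2 - \dfrac {\alpha^2} 2 = 0$

confirming that the new Hamiltonian vanishes identically.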

In the new coordinates, the canonical Euler's equations are:

$\displaystyle\frac{\d\alpha_i}{\d x}=\frac{\partial H^*}{\partial\beta_i}$
$\displaystyle \frac{\d\beta_i}{\d x}=-\frac{\partial H^*}{\partial\alpha_i}$

By $H^*=0$:

$\displaystyle\frac{\d\alpha_i}{\d x}=0,\quad\displaystyle\frac{\d\beta_i}{\d x}=0$

which imply that $\alpha_i$ and $\beta_i$ are constant along each extremal.

The constancy of $\beta_i$ provides $n$ first integrals:

$\displaystyle\frac{\partial S}{\partial\alpha_i}=\beta_i$

Because $S = \map S {x, \mathbf y, \boldsymbol \alpha}$, the aforementioned set of first integrals is also a system of $n$ equations for the functions $y_i$.

Thus, since $\begin {vmatrix} \dfrac {\partial^2 S} {\partial \alpha_i \partial y_k} \end {vmatrix} \ne 0$, this system can be solved for the functions $y_i = \map {y_i} {x, \boldsymbol \alpha, \boldsymbol \beta}$.
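
In the free-particle illustration, the single first integral $y - \alpha x = \beta$ solves immediately to $\map y {x, \alpha, \beta} = \alpha x + \beta$.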

The functions $p_i$ are then found from Conditions for Transformation to be Canonical:

$p_i = \dfrac {\partial} {\partial y_i} \map S {x, \mathbf y, \boldsymbol \alpha}$

Then:

$\map {y_i} {x, \boldsymbol \alpha, \boldsymbol \beta}$
$\map {p_i} {x, \boldsymbol \alpha, \boldsymbol \beta}$

are solutions to canonical Euler's equations.

$\blacksquare$


Source of Name

This entry was named for Carl Gustav Jacob Jacobi.

