Definition:Stability (Differential Equations)

Definition
For first-order autonomous systems, define $\map \phi {t, x_0}$ to be the unique solution with initial condition $\map x 0 = x_0$.

Then the solution with initial condition $x_0$ is stable on $\hointr 0 \to$ if and only if:

 * given any $\epsilon > 0$, there exists a $\delta > 0$ such that for all $t \ge 0$: $\norm {x - x_0} < \delta \implies \norm {\map \phi {t, x} - \map \phi {t, x_0} } < \epsilon$
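As a concrete illustration (not from the source), consider the scalar autonomous system $\dot x = -x$, whose flow is $\map \phi {t, x} = x e^{-t}$. For the equilibrium $x_0 = 0$, the choice $\delta = \epsilon$ witnesses stability, since $\norm {\map \phi {t, x} - \map \phi {t, 0} } = \size x e^{-t} \le \size x$ for all $t \ge 0$. A short numeric sketch:

```python
import math

# Illustrative system (an assumption for this example, not from the source):
# x' = -x, with closed-form flow phi(t, x) = x * exp(-t).
def phi(t, x):
    return x * math.exp(-t)

# For the equilibrium x0 = 0, delta = epsilon works for every t >= 0:
# |phi(t, x) - phi(t, 0)| = |x| * exp(-t) <= |x| < epsilon whenever |x| < delta.
epsilon = 0.1
delta = epsilon
x = 0.05  # any initial condition with |x - 0| < delta
for t in [0.0, 1.0, 10.0, 100.0]:
    assert abs(phi(t, x) - phi(t, 0.0)) < epsilon
```

Here the bound holds uniformly in $t$, which is exactly what the definition demands: one $\delta$ must serve for the whole time interval, not just for each fixed $t$.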

An equilibrium $x_0$ is unstable if and only if it is not stable.

An equilibrium $x_0$ is asymptotically stable if and only if:
 * $x_0$ is stable
 * For all $x$ in a sufficiently small neighborhood of $x_0$:
 * $\ds \lim_{t \mathop \to \infty} \map \phi {t, x} = x_0$
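Continuing the illustrative example above (an assumption, not from the source): for $\dot x = -x$ with flow $\map \phi {t, x} = x e^{-t}$, every initial condition decays to the equilibrium $x_0 = 0$, so $0$ is asymptotically stable. A numeric sketch of the limit condition:

```python
import math

# Illustrative system (an assumption for this example, not from the source):
# x' = -x, with flow phi(t, x) = x * exp(-t).
def phi(t, x):
    return x * math.exp(-t)

# The limit condition: phi(t, x) -> x0 = 0 as t -> infinity,
# for any initial condition x (here the neighborhood is all of R).
for x in [-2.0, 0.5, 3.0]:
    assert abs(phi(50.0, x) - 0.0) < 1e-12
```

Note that the limit condition alone does not imply stability in general, which is why the definition requires both clauses.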