Definition:Stability (Differential Equations)

Definition
Dynamical systems often admit equilibrium points (fixed points) and orbits (cycles, or periodic solutions) among their solutions; stability refers to how robust these solutions are with respect to small changes in their initial conditions. This distinguishes, for example, cases where nearby solutions drift away from the equilibrium or orbit indefinitely from cases where nearby solutions converge to it.

For a first-order autonomous system $\dot{x} = f(x)$, define $\phi(t,x_0)$ to be the unique solution with initial condition $x(0) = x_0$.

Then:
 * A solution with initial condition $x_0$ is stable on $[0,\infty)$ if:
 * given any $\epsilon > 0$, there exists a $\delta>0$ such that $\left \Vert {x - x_0}\right \Vert < \delta \implies \left \Vert {\phi(t,x) - \phi(t,x_0)}\right \Vert < \epsilon$ for all $t \ge 0$.
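As an illustrative sketch (not part of the definition), the scalar ODE $\dot{x} = -x$ has the closed-form flow $\phi(t,x) = x e^{-t}$, so $\left \vert \phi(t,x) - \phi(t,x_0) \right \vert = \left \vert x - x_0 \right \vert e^{-t} \le \left \vert x - x_0 \right \vert$ for $t \ge 0$, and the choice $\delta = \epsilon$ witnesses stability. A minimal numeric check in Python (the sampling grid and tolerances are arbitrary choices):

```python
import math

def phi(t, x0):
    # Closed-form flow of the scalar ODE x' = -x: phi(t, x0) = x0 * exp(-t)
    return x0 * math.exp(-t)

# Epsilon-delta check along a sampled time grid: with delta = epsilon,
# a solution starting within delta of x0 stays within epsilon for all t >= 0.
epsilon = 1e-3
delta = epsilon
x0 = 1.0
x = x0 + 0.9 * delta  # initial condition within delta of x0

max_gap = max(abs(phi(t / 10, x) - phi(t / 10, x0)) for t in range(1000))
print(max_gap < epsilon)  # prints True: the separation never exceeds epsilon
```

The largest separation occurs at $t = 0$ and only shrinks afterwards, which is exactly why $\delta = \epsilon$ suffices for this system.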


 * An equilibrium $x_0$ is unstable if it is not stable.
 * An equilibrium $x_0$ is asymptotically stable if it is stable and:
 * for any $x$ in a sufficiently small neighborhood of $x_0$, it follows that $\displaystyle \lim_{t\to\infty} \phi(t,x) = x_0$.
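A hedged numerical sketch contrasting the two cases, using forward Euler integration (the step size and time horizon are assumptions made for illustration): for $\dot{x} = -x$ the equilibrium $0$ is asymptotically stable and nearby solutions decay to it, while for $\dot{x} = x$ the equilibrium $0$ is unstable and nearby solutions grow without bound.

```python
def euler_flow(f, x0, t_end, h=0.01):
    """Approximate phi(t_end, x0) for x' = f(x) by the forward Euler method."""
    x = x0
    for _ in range(int(t_end / h)):
        x = x + h * f(x)
    return x

# x' = -x: the equilibrium 0 is asymptotically stable; nearby solutions decay to it.
x_stable = euler_flow(lambda x: -x, 0.5, t_end=20.0)

# x' = x: the equilibrium 0 is unstable; nearby solutions drift away from it.
x_unstable = euler_flow(lambda x: x, 0.5, t_end=20.0)

print(abs(x_stable) < 1e-6)   # prints True: converged toward the equilibrium
print(abs(x_unstable) > 1e6)  # prints True: drifted far from the equilibrium
```

Forward Euler is adequate here because both model ODEs are scalar and linear; a stiff or oscillatory system would call for a more careful integrator.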