Definition:Stability (Differential Equations)

Definition
For first-order autonomous systems, define $\phi \left({t, x_0}\right)$ to be the unique solution with initial condition $x \left({0}\right) = x_0$.

Then a solution with initial condition $x_0$ is stable on $\left[{0 \,.\,.\, \to}\right)$ if:


 * given any $\epsilon > 0$, there exists a $\delta > 0$ such that $\left\Vert{x - x_0}\right\Vert < \delta \implies \left\Vert {\phi \left({t, x}\right) - \phi \left({t, x_0}\right)}\right\Vert < \epsilon$ for all $t \ge 0$
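As an illustrative sketch (not part of the formal definition), consider the scalar equation $\dot x = -x$, whose flow has the closed form $\phi \left({t, x}\right) = x e^{-t}$. Since $\left\vert{\phi \left({t, x}\right) - \phi \left({t, x_0}\right)}\right\vert = e^{-t} \left\vert{x - x_0}\right\vert \le \left\vert{x - x_0}\right\vert$ for $t \ge 0$, the choice $\delta = \epsilon$ witnesses stability. The following Python snippet (with hypothetical helper names) checks this numerically on a sampled grid:

```python
import math

def phi(t, x):
    # Closed-form flow of the scalar ODE x' = -x: phi(t, x) = x * e^(-t)
    return x * math.exp(-t)

def witnesses_stability(x0, epsilon, delta, t_grid, trials):
    # Check: |x - x0| < delta  implies  |phi(t, x) - phi(t, x0)| < epsilon
    # for all sampled times t >= 0 and sampled initial conditions x.
    for x in trials:
        if abs(x - x0) < delta:
            for t in t_grid:
                if abs(phi(t, x) - phi(t, x0)) >= epsilon:
                    return False
    return True

x0 = 0.0
epsilon = 0.1
delta = epsilon  # works because |phi(t,x) - phi(t,x0)| <= |x - x0|
t_grid = [0.1 * k for k in range(200)]          # t in [0, 20)
trials = [-0.2 + 0.01 * k for k in range(41)]   # initial conditions near x0
print(witnesses_stability(x0, epsilon, delta, t_grid, trials))  # True
```

A sampled check like this can only support, not prove, stability; the inequality above is what establishes it.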

An equilibrium $x_0$ is unstable if and only if it is not stable.

An equilibrium $x_0$ is asymptotically stable if and only if it is stable and, for every $x$ in a sufficiently small neighborhood of $x_0$:
 * $\displaystyle \lim_{t \to \infty} \phi \left({t, x}\right) = x_0$
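To contrast the two notions, a short sketch (illustrative only, with hypothetical function names): for $\dot x = -x$ the flow $\phi \left({t, x}\right) = x e^{-t}$ sends every nearby initial condition to the equilibrium $0$, so $0$ is asymptotically stable; for $\dot x = x$ the flow $x e^{t}$ carries any $x \ne 0$ arbitrarily far from $0$, so $0$ is unstable.

```python
import math

def phi_stable(t, x):
    # Flow of x' = -x: solutions decay to the equilibrium 0
    return x * math.exp(-t)

def phi_unstable(t, x):
    # Flow of x' = x: solutions grow away from 0 for any x != 0
    return x * math.exp(t)

# Asymptotic stability: phi(t, x) -> 0 as t -> infinity for x near 0
for x in (0.5, -0.3, 0.01):
    assert abs(phi_stable(50.0, x)) < 1e-12

# Instability of x' = x: even x = 1e-6 eventually leaves the ball of radius 1
print(abs(phi_unstable(30.0, 1e-6)) > 1.0)  # True
```

Note that $\dot x = 0$ gives a flow $\phi \left({t, x}\right) = x$ that is stable (take $\delta = \epsilon$) but not asymptotically stable, since solutions never approach $x_0$.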