Definition:Dynamical System


Definition

A dynamical system is a system in which a function describes the time dependence of a point in a geometrical space $X$.

It is an iterative procedure consisting of:

a mapping $T$ of $X$ onto itself

and:

an iteration $x_{n + 1} = \map T {x_n}$.

Hence positions of points in $X$ evolve iteratively under $T$.
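
For illustration (an arbitrary choice of $T$, not part of the definition), take $X = \R$ and $\map T x = \dfrac x 2$. Starting from $x_0 = 1$, the iteration gives:

$x_1 = \dfrac 1 2, \quad x_2 = \dfrac 1 4, \quad x_3 = \dfrac 1 8, \ldots$

so the iterates converge to the fixed point $0$.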


Flow

In a dynamical system, a set of time-dependent equations is known as a flow.
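
For example (an illustrative flow, not part of the definition): the differential equation $\dfrac {\d x} {\d t} = x$ on $\R$ determines the flow:

$\map x t = x_0 e^t$

where $x_0 = \map x 0$ is the initial point.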


Orbit

Let $S$ be a dynamical system consisting of a mapping $T$ and an iteration $x_{n + 1} = \map T {x_n}$.


The orbit of $x_0$ is the set of points $\set {x_0, x_1, x_2, \ldots}$


When $T$ is invertible, the orbit usually also includes the points $x_{-1} = \map {\inv T} {x_0}, x_{-2} = \map {\inv T} {x_{-1} }, \ldots$
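
For illustration (an arbitrary invertible map): let $\map T x = 2 x$ on $\R$. The orbit of $x_0 = 1$ is then:

$\set {2^n : n \in \Z} = \set {\ldots, \dfrac 1 4, \dfrac 1 2, 1, 2, 4, \ldots}$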

For a flow, the orbit of a point $x$ is the set of all points $\map x t$ for all $t$.
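
Continuing the illustrative flow $\map x t = x_0 e^t$ above: the orbit of $x_0 = 1$ is $\set {e^t : t \in \R}$, that is, the set of strictly positive real numbers.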


Examples

Arbitrary Example

Let $T: \C \to \C$ be the mapping on the complex plane defined as:

$\forall z \in \C: \map T z = z^2 - 1$

Hence $T$ and the iteration $z_{n + 1} = {z_n}^2 - 1$ together define a dynamical system in the complex plane.
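
For instance, starting from the (illustratively chosen) initial point $z_0 = 0$:

$z_1 = 0^2 - 1 = -1, \quad z_2 = \paren {-1}^2 - 1 = 0, \quad z_3 = -1, \ldots$

so the orbit of $0$ under $T$ is the periodic set $\set {0, -1}$.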


Also see

  • Results about dynamical systems can be found here.

