Math Insight

Dynamical system definition


A dynamical system is a system whose state evolves with time over a state space according to a fixed rule.

For an introduction to the concepts behind a dynamical system, see the idea of a dynamical system.

Formal definition of dynamical system

A dynamical system is formally defined as a state space $X$, a set of times $T$, and a rule $R$ that specifies how the state evolves with time. The rule $R$ is a function whose domain is $X \times T$ and whose codomain is $X$, i.e., $R : X \times T \to X$. This means that $R$ takes two inputs, $R=R(\vc{x},t)$, where $\vc{x} \in X$ is the initial state (at time $t=0$, for example) and $t \in T$ is a future time. In other words, $R(\vc{x},t)$ gives the state at time $t$ given that the initial state was $\vc{x}$.
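To make the definition concrete, here is a minimal sketch in Python of one possible evolution rule $R : X \times T \to X$. The choice of exponential growth and the rate constant `K` are illustrative assumptions, not part of the definition above; any function of an initial state and a time that returns a new state fits the same pattern.

```python
import math

# Hypothetical example rule: continuous exponential growth dx/dt = K*x,
# whose solution gives the evolution rule R(x, t) = x * e^(K*t).
K = 0.5  # assumed growth-rate parameter, chosen for illustration

def R(x, t):
    """Evolution rule: return the state at time t given initial state x."""
    return x * math.exp(K * t)

# At time t = 0, the rule returns the initial state itself.
print(R(2.0, 0.0))  # 2.0

# Evolving for time s and then for time t from the resulting state
# matches evolving for time s + t from the start.
print(math.isclose(R(R(2.0, 1.0), 2.0), R(2.0, 3.0)))  # True
```

The second check illustrates a property many evolution rules share: composing the rule over successive time intervals agrees with applying it once over the combined interval.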