The stability of equilibria for discrete dynamical systems
Overview
An equilibrium of a discrete dynamical system can be stable or unstable, depending on whether trajectories that start near the equilibrium stay near it or move away from it. Here we show how to determine this stability using the derivative of the updating function.
Stability of equilibria of discrete dynamical systems, revisited.
Stability theorem
We can summarize the results on the stability of equilibria of discrete dynamical systems with the following stability theorem.
Consider the discrete dynamical system \begin{align*} x_{n+1} &= f(x_n)\\ x_0 &= a, \end{align*} with an equilibrium $x_n = E$. Then, we can determine the stability of the equilibrium by calculating the derivative of $f$ evaluated at the equilibrium as follows.
- If $|f'(E)| < 1$, then the equilibrium $x_n=E$ is stable.
- If $|f'(E)| > 1$, then the equilibrium $x_n = E$ is unstable.
- If $|f'(E)|=1$, then we cannot determine the stability of the equilibrium from the derivative $f'(E)$ alone.
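This criterion comes from the linear approximation of $f$ around the equilibrium. Since $f(E) = E$, for $x_n$ near $E$ we have $x_{n+1} = f(x_n) \approx E + f'(E)(x_n - E)$, so each step multiplies the deviation from the equilibrium by approximately $f'(E)$. The deviation therefore shrinks when $|f'(E)| < 1$ and grows when $|f'(E)| > 1$.

To make the criterion concrete, here is a minimal numerical sketch in Python. The updating function $f(x) = 0.5x(3-x)$ is a made-up example, not one taken from this thread, and the derivative is estimated with a finite difference rather than computed symbolically.

```python
# Minimal sketch: classify equilibria of x_{n+1} = f(x_n) by |f'(E)|.
# The updating function below is a hypothetical example.

def f(x):
    # Hypothetical updating function: f(x) = 0.5*x*(3 - x)
    return 0.5 * x * (3 - x)

def derivative(f, x, h=1e-6):
    # Central-difference estimate of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

def classify_equilibrium(f, E, tol=1e-6):
    # Apply the stability theorem at the equilibrium E
    slope = abs(derivative(f, E))
    if abs(slope - 1) < tol:
        return "|f'(E)| = 1: derivative alone is inconclusive"
    return "stable" if slope < 1 else "unstable"

# Equilibria solve f(E) = E; for this f they are E = 0 and E = 1.
for E in (0.0, 1.0):
    print(f"E = {E}: {classify_equilibrium(f, E)}")

# Check by iterating from an initial condition slightly above each equilibrium.
for E in (0.0, 1.0):
    x = E + 0.1
    for _ in range(20):
        x = f(x)
    print(f"x_0 = {E + 0.1}: after 20 steps, x_n = {x:.5f}")
```

For this example, the sketch reports $E = 0$ as unstable (since $f'(0) = 1.5$) and $E = 1$ as stable (since $f'(1) = 0.5$), and both iterations approach $1$, matching the theorem's prediction.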