### Equilibrium definition

An equilibrium of a dynamical system is a value of the state variables where the state variables do not change. In other words, an equilibrium is a solution that does not change with time. This means that if the system starts at an equilibrium, the state will remain at the equilibrium forever.

In a discrete dynamical system, such as \begin{align*} x_{n+1} = f(x_n) \end{align*} (in function iteration form) or \begin{align*} x_{n+1}-x_n = g(x_n) \end{align*} (in difference form), one can find the equilibria by substituting in the same quantity for $x_{n}$ and $x_{n+1}$, such as substituting $x_{n+1}=x_n = E$. One must then solve the equation \begin{align*} E = f(E) \end{align*} or \begin{align*} 0 = g(E) \end{align*} to determine the values $E$ such that $x_n = E$ is an equilibrium of the dynamical system. See equilibria in discrete dynamical systems.
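As a concrete illustration (not an example from the text above), consider the logistic map $f(x) = rx(1-x)$ with $r=2$. Solving $E = f(E)$ by hand gives the equilibria $E = 0$ and $E = 1 - 1/r$, and a short sketch in Python can verify that the state stays fixed when started at either equilibrium:

```python
# Equilibria of the discrete system x_{n+1} = f(x_n) with f(x) = r*x*(1-x).
# Solving E = f(E) gives E = 0 and E = 1 - 1/r (here r = 2, so E = 0 and E = 0.5).
r = 2.0

def f(x):
    return r * x * (1 - x)

equilibria = [0.0, 1 - 1/r]

# Verify: if the system starts at an equilibrium, iterating f never moves it.
for E in equilibria:
    x = E
    for _ in range(10):
        x = f(x)
    print(E, x)  # the iterate remains at E
```

The same check works for any choice of $f$: substitute a candidate value $E$ and confirm that $f(E) = E$.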

In a continuous dynamical system, such as \begin{align*} \diff{x}{t} = f(x), \end{align*} one can find the equilibria by setting $\diff{x}{t}=0$. One must then solve the equation \begin{align*} 0 = f(E) \end{align*} to determine the values $E$ such that $x(t)=E$ is an equilibrium of the dynamical system.
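To make this concrete, take the logistic equation $\diff{x}{t} = x(1-x)$ as an illustrative example (again, not from the text above). The equilibria are the roots of $f(E) = E(1-E)$, namely $E=0$ and $E=1$. When $f$ is too complicated to solve by hand, the roots can be located numerically; the sketch below uses a simple bisection routine written for this purpose:

```python
# Equilibria of dx/dt = f(x) with f(x) = x*(1 - x) are the roots of f.
def f(x):
    return x * (1 - x)

def bisect(g, a, b, tol=1e-12):
    """Find a root of g in [a, b] by bisection (assumes g changes sign)."""
    ga = g(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if ga * g(m) <= 0:
            b = m
        else:
            a, ga = m, g(m)
    return 0.5 * (a + b)

# Bracket each root with an interval where f changes sign.
E1 = bisect(f, -0.5, 0.5)  # E1 is approximately 0
E2 = bisect(f, 0.5, 1.5)   # E2 is approximately 1
```

Each returned value $E$ satisfies $f(E) \approx 0$, so the constant function $x(t) = E$ is an equilibrium solution.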