Before we start discussing how we'll use those complex numbers, we should remember an important point. The parameters $\alpha$, $\beta$, $\gamma$, and $\delta$ are real numbers, and any initial conditions we'll use will be real numbers. If $\diff{\vc{x} }{t}=A\vc{x}$, can the solution $\vc{x}$ take on complex values? No. The differential equation does not involve any complex numbers, and neither does its solution. We don't even need complex numbers to discuss the dynamics of this system.
So, why are we going to involve complex numbers? Because they allow us to avoid doing any more work. If we use complex numbers, we can use the same solution formula we've been using all along. If $\lambda_1$ and $\lambda_2$ are the eigenvalues of $A$ with eigenvectors $\vc{u}_1$ and $\vc{u}_2$, the general solution of the differential equation is $$\vc{x}(t) = c_1e^{\lambda_1 t}\vc{u}_1 + c_2e^{\lambda_2 t}\vc{u}_2,$$ where, as before, $c_1$ and $c_2$ are the constants that would be determined from the initial conditions.
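As a quick check that this formula really solves the equation, differentiate term by term and use the eigenvector property $A\vc{u}_1 = \lambda_1 \vc{u}_1$ and $A\vc{u}_2 = \lambda_2 \vc{u}_2$: $$\diff{\vc{x}}{t} = c_1\lambda_1 e^{\lambda_1 t}\vc{u}_1 + c_2\lambda_2 e^{\lambda_2 t}\vc{u}_2 = A\left(c_1 e^{\lambda_1 t}\vc{u}_1 + c_2 e^{\lambda_2 t}\vc{u}_2\right) = A\vc{x}(t).$$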
It turns out that this formula makes sense even if the eigenvalues are complex numbers. You'll just have to trust the fact that in the end, all the imaginary parts cancel out and $\vc{x}(t)$ ends up being real.
Ignoring all the pesky details about how $\vc{x}(t)$ ends up still being real, we just have one problem to face: how to interpret $e^{\lambda_1 t}$ when $\lambda_1$ is complex.
The key to unlocking the secret of the complex eigenvalue is Euler's formula, which defines the exponential of an imaginary number. Euler's formula says that, if $t$ is any real number (which means $i\cdot t$ is a purely imaginary number), then $$e^{it} = \cos(t) + i \sin(t).$$ We're not going to worry too much about why that's true; that's left as an exercise! We just need two important properties.
First, the maximum value of $\sin(t)$ and $\cos(t)$ is $1$, and the minimum value of $\sin(t)$ and $\cos(t)$ is $-1$. Moreover, the magnitude of $e^{it}$ is $|e^{it}| = \sqrt{\sin^2(t)+\cos^2(t)} = 1$. Therefore, $e^{it}$ can never get really large, and it can never get really small. So, if we care about whether something gets really large or really small, we actually don't care about a factor of $e^{it}$. If we multiply a number by $e^{it}$, it doesn't affect the size of the number.
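If you'd like a quick numerical sanity check of Euler's formula and of the unit magnitude of $e^{it}$, here is a minimal sketch using NumPy (the sample points are arbitrary):

```python
import numpy as np

# Sample a few real values of t.
t = np.linspace(0, 2 * np.pi, 50)

# Euler's formula: e^{it} equals cos(t) + i sin(t).
lhs = np.exp(1j * t)
rhs = np.cos(t) + 1j * np.sin(t)
print(np.allclose(lhs, rhs))  # True

# The magnitude |e^{it}| is always 1, so multiplying by e^{it}
# never changes the size of a number.
print(np.allclose(np.abs(lhs), 1.0))  # True
```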
Second, if we think of $t$ as representing time, the functions $\sin(t)$ and $\cos(t)$ oscillate as $t$ increases; they don't stay constant, grow exponentially, or shrink exponentially. That's what we'll get when we multiply a number by the function $e^{it}$ (if we view $e^{it}$ as a function of time). A factor of $e^{it}$ will just introduce oscillations.
If you remember your rules for exponentiation, that's enough to understand what $e^{\lambda t}$ means when $\lambda$ is a complex number. If $\lambda = a + ib$, then $$e^{\lambda t} = e^{at + ibt} = e^{at} \cdot e^{ibt}. $$
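Applying Euler's formula to the factor $e^{ibt}$ makes the roles of $a$ and $b$ explicit: $$e^{\lambda t} = e^{at}\bigl(\cos(bt) + i\sin(bt)\bigr), \qquad |e^{\lambda t}| = e^{at}.$$ The factor $e^{at}$ controls the size, while the sines and cosines only oscillate.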
Since $e^{ibt}$ only introduces oscillations, the condition for $e^{\lambda t}$ to be decreasing as $t$ increases is $a < 0$, and the condition for $e^{\lambda t}$ to be increasing as $t$ increases is $a > 0$; the value of $b$ plays no role. What happens if $a=0$? Then $e^{\lambda t}$ neither grows nor shrinks as it oscillates.
Rather than using $a$ or $b$, we'll usually talk about the real part of $\lambda$, denoted $\text{Re}(\lambda)$, and the imaginary part of $\lambda$, denoted $\text{Im}(\lambda)$. In that notation, the condition for $e^{\lambda t}$ growing is $\text{Re}(\lambda) > 0$, and the condition for $e^{\lambda t}$ shrinking is $\text{Re}(\lambda) < 0$. Of course, if $\lambda$ is real, then $\text{Re}(\lambda)=\lambda$. That means that these conditions apply whether $\lambda$ is real or complex.
We can now give the general criterion for the stability of the equilibrium $(0,0)$ of our linear system $\diff{\vc{x} }{t} = A\vc{x}$. Given that the solution is $\vc{x}(t) = c_1e^{\lambda_1t}\vc{u}_1 + c_2e^{\lambda_2t}\vc{u}_2$, the equilibrium is stable if both terms are shrinking. The condition for stability, valid regardless of whether the eigenvalues are real or complex, is $\text{Re}(\lambda_1) < 0$ and $\text{Re}(\lambda_2) < 0$.
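To make the criterion concrete, here is a minimal sketch using NumPy (the matrix entries are just an example, not from the text) that computes the eigenvalues and tests the stability condition:

```python
import numpy as np

# An example matrix whose eigenvalues turn out to be complex.
A = np.array([[-0.5, -2.0],
              [ 2.0, -0.5]])

# The eigenvalues come back as a complex conjugate pair: -0.5 +/- 2i.
eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)

# The equilibrium (0,0) is stable exactly when both eigenvalues
# have negative real part.
stable = np.all(np.real(eigenvalues) < 0)
print("stable:", stable)
```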
If the eigenvalues do happen to be complex, then $\lambda_1$ and $\lambda_2$ will be complex conjugates, meaning $\text{Re}(\lambda_1) = \text{Re}(\lambda_2)$ and $\text{Im}(\lambda_1) = - \text{Im}(\lambda_2)$. (You get this result from the quadratic formula. We often say the eigenvalues are of the form $a \pm bi$.) This means that $e^{\lambda_1 t}$ and $e^{\lambda_2 t}$ will be growing or shrinking at the same rate when the eigenvalues are complex.
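To spell out that quadratic-formula step: the eigenvalues of the $2 \times 2$ matrix $A$ are the roots of $\lambda^2 - \text{tr}(A)\,\lambda + \det(A) = 0$, so $$\lambda = \frac{\text{tr}(A) \pm \sqrt{\text{tr}(A)^2 - 4\det(A)}}{2}.$$ Since the coefficients are real, a negative discriminant produces the conjugate pair $a \pm bi$ with $a = \text{tr}(A)/2$ and $b = \frac{1}{2}\sqrt{4\det(A) - \text{tr}(A)^2}$.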
When the eigenvalues were real, we classified the equilibrium as a stable node, an unstable node, or a saddle. Complex eigenvalues add three more classifications. In all cases, the imaginary part of the eigenvalues creates oscillations in $e^{\lambda_1 t}$ and $e^{\lambda_2 t}$ due to the presence of the sines and cosines. (Since $\lambda_1$ and $\lambda_2$ will be complex conjugates, all oscillations will be at the same rate.)
Let's explore different examples of systems with complex eigenvalues. You can use the applet below to visualize their behavior. The applet will also show you the solution where the complex exponentials have been turned into sines and cosines using Euler's formula; you don't need to worry about how to calculate such solutions. Notice how the solutions are real: all the imaginary parts of the solution have canceled out. The real part of the eigenvalues appears in the exponentials, determining the growth or decay. The imaginary part of the eigenvalues appears in the sines and cosines, determining the oscillations.
[Applet: given a matrix $A$, initial conditions $x_0$ and $y_0$, and plotting ranges, the applet displays the eigenvalues and eigenvectors of $A$ and plots the solution of the linear dynamical system $\diff{\vc{x}}{t}=A\vc{x}$. The applet calculates only real eigenvectors; if the eigenvectors are complex or there is only one eigenvector, it will display a “?” in place of an eigenvector.]
When the real part of the eigenvalues is negative, for any initial condition other than $(x_0,y_0)= (0,0)$, the solution moves closer to the equilibrium at the origin as it rotates around the equilibrium.
When the real part of the eigenvalues is positive, for any initial condition other than $(x_0,y_0)= (0,0)$, the solution moves away from the equilibrium at the origin as it rotates around the equilibrium.
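If you want to experiment beyond the applet, here is a minimal sketch using NumPy and SciPy (the matrix, initial condition, and time span are illustrative choices, not from the text). It integrates $\diff{\vc{x}}{t}=A\vc{x}$ for the same example matrix as above, whose eigenvalues are $-0.5 \pm 2i$; the printed distance from the origin decays while the solution rotates around the equilibrium:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Example matrix with eigenvalues -0.5 +/- 2i: negative real part,
# so the origin is stable and solutions spiral inward.
A = np.array([[-0.5, -2.0],
              [ 2.0, -0.5]])

def rhs(t, x):
    # The linear system dx/dt = A x.
    return A @ x

# Integrate from the example initial condition (x0, y0) = (1, 0).
sol = solve_ivp(rhs, t_span=(0, 10), y0=[1.0, 0.0],
                t_eval=np.linspace(0, 10, 11))

for t, x, y in zip(sol.t, sol.y[0], sol.y[1]):
    # The distance from the origin decays like e^{-0.5 t}
    # while (x, y) rotates around the equilibrium.
    print(f"t = {t:4.1f}   (x, y) = ({x:+.3f}, {y:+.3f})   "
          f"distance = {np.hypot(x, y):.3f}")
```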