In Section ?? we showed in detail how to find solutions to planar systems of constant coefficient differential equations whose coefficient matrices have distinct real eigenvalues. This method was reviewed in Section ??, where we saw that the crucial step in solving these systems is finding two linearly independent solutions. In this section we discuss how to find two linearly independent solutions when the eigenvalues of the coefficient matrix are either complex or real and equal.

By finding these two linearly independent solutions we will find both the general solution of the system of differential equations $\dot{X} = AX$ and a method for solving the initial value problem
$$\dot{X} = AX, \qquad X(0) = X_0.$$

We assume that $A$ is a $2\times 2$ matrix with eigenvalues $\lambda_1$ and $\lambda_2$. When needed, we denote the associated eigenvectors by $v_1$ and $v_2$.

Real Distinct Eigenvalues

We have discussed the case when $\lambda_1 \neq \lambda_2$ are real on several occasions. For completeness we repeat the result. The general solution is:
$$X(t) = \alpha_1 e^{\lambda_1 t} v_1 + \alpha_2 e^{\lambda_2 t} v_2.$$

The initial value problem is solved by finding real numbers $\alpha_1$ and $\alpha_2$ such that
$$X_0 = \alpha_1 v_1 + \alpha_2 v_2.$$
See Section ?? for a detailed discussion with examples.
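As a computational aside, the recipe above is easy to carry out numerically. The following is a minimal sketch, not taken from the text: the matrix and initial vector are hypothetical stand-ins, and numpy's eig supplies the eigenvalues and eigenvectors.

```python
import numpy as np

# Hypothetical 2x2 matrix with distinct real eigenvalues (3 and -2) and a
# hypothetical initial condition; neither comes from the text's examples.
A = np.array([[1.0, 2.0],
              [3.0, 0.0]])
X0 = np.array([1.0, 1.0])

evals, evecs = np.linalg.eig(A)      # columns of evecs are eigenvectors v1, v2
alpha = np.linalg.solve(evecs, X0)   # solve X0 = alpha1*v1 + alpha2*v2

def X(t):
    # General solution: alpha1*e^(lambda1*t)*v1 + alpha2*e^(lambda2*t)*v2.
    return evecs @ (alpha * np.exp(evals * t))

assert np.allclose(X(0.0), X0)       # the solution matches the initial value
```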

Complex Conjugate Eigenvalues

Suppose that the eigenvalues of $A$ are complex, that is, suppose that $\lambda = \sigma + i\tau$ with $\tau \neq 0$ is an eigenvalue of $A$ with eigenvector $v = v_1 + iv_2$, where $v_1, v_2 \in \mathbb{R}^2$. We claim that $X_1(t)$ and $X_2(t)$, where
$$X_1(t) = e^{\sigma t}\left(\cos(\tau t)\,v_1 - \sin(\tau t)\,v_2\right) \quad\text{and}\quad X_2(t) = e^{\sigma t}\left(\sin(\tau t)\,v_1 + \cos(\tau t)\,v_2\right),$$
are solutions to (??) and that the general solution to (??) is
$$X(t) = \alpha_1 X_1(t) + \alpha_2 X_2(t),$$
where $\alpha_1, \alpha_2$ are real scalars.
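Before deriving these formulas, one can at least check them numerically. Here is a small sketch with a hypothetical matrix (any $2\times 2$ matrix with complex eigenvalues would do); a centered difference confirms that $X_1$ and $X_2$ satisfy the differential equation.

```python
import numpy as np

# Hypothetical matrix with complex eigenvalues 1 +/- 2i (not from the text).
A = np.array([[1.0, -2.0],
              [2.0,  1.0]])

lam, V = np.linalg.eig(A)
k = np.argmax(lam.imag)                  # eigenvalue sigma + i*tau with tau > 0
sigma, tau = lam[k].real, lam[k].imag
v1, v2 = V[:, k].real, V[:, k].imag      # eigenvector v = v1 + i*v2

def X1(t):
    return np.exp(sigma*t) * (np.cos(tau*t)*v1 - np.sin(tau*t)*v2)

def X2(t):
    return np.exp(sigma*t) * (np.sin(tau*t)*v1 + np.cos(tau*t)*v2)

# Verify X'(t) = A X(t) at a sample time via a centered difference.
t, h = 0.7, 1e-6
for X in (X1, X2):
    dX = (X(t + h) - X(t - h)) / (2*h)
    assert np.allclose(dX, A @ X(t), atol=1e-5)
```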

There are several difficulties in deriving (??) and (??); these difficulties are related to using complex numbers as opposed to real numbers. In particular, in the derivation of (??) we need to define the exponential of a complex number, and we begin by discussing this issue.

Euler’s Formula

We find complex exponentials by using Euler’s celebrated formula:
$$e^{i\theta} = \cos\theta + i\sin\theta$$
for any real number $\theta$. A justification of this formula is given in Exercise ??. Euler’s formula allows us to differentiate complex exponentials, obtaining the expected result:
$$\frac{d}{dt}e^{i\tau t} = i\tau\,e^{i\tau t}.$$

Euler’s formula also implies that
$$e^{\lambda t} = e^{(\sigma + i\tau)t} = e^{\sigma t}e^{i\tau t} = e^{\sigma t}\left(\cos(\tau t) + i\sin(\tau t)\right),$$
where $\lambda = \sigma + i\tau$. Most importantly, we note that
$$\frac{d}{dt}e^{\lambda t} = \lambda e^{\lambda t}.$$
We use (??) and the product rule for differentiation to verify (??) as follows:
$$\frac{d}{dt}e^{\lambda t} = \frac{d}{dt}\left(e^{\sigma t}e^{i\tau t}\right) = \sigma e^{\sigma t}e^{i\tau t} + i\tau\,e^{\sigma t}e^{i\tau t} = (\sigma + i\tau)e^{\lambda t} = \lambda e^{\lambda t}.$$
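The two identities above are also easy to confirm symbolically. A brief check with sympy (a tool of this aside, not one used by the text):

```python
import sympy as sp

t, sigma, tau, theta = sp.symbols('t sigma tau theta', real=True)
lam = sigma + sp.I*tau

# d/dt e^(lambda*t) = lambda*e^(lambda*t) for complex lambda = sigma + i*tau.
assert sp.simplify(sp.diff(sp.exp(lam*t), t) - lam*sp.exp(lam*t)) == 0

# Euler's formula: e^(i*theta) = cos(theta) + i*sin(theta).
assert sp.simplify(sp.exp(sp.I*theta) - sp.cos(theta) - sp.I*sp.sin(theta)) == 0
```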
Verification that (??) is the General Solution

A complex vector-valued function $X(t) = X_1(t) + iX_2(t)$ consists of a real part $X_1(t)$ and an imaginary part $X_2(t)$. For such functions we define
$$\dot{X} = \dot{X}_1 + i\dot{X}_2 \quad\text{and}\quad AX = AX_1 + iAX_2.$$
To say that $X(t)$ is a solution to $\dot{X} = AX$ means that
$$\dot{X}_1 + i\dot{X}_2 = AX_1 + iAX_2.$$

Lemma
The complex vector-valued function $X(t)$ is a solution to $\dot{X} = AX$ if and only if its real and imaginary parts $X_1(t)$ and $X_2(t)$ are solutions to $\dot{X} = AX$.
Proof
Equating the real and imaginary parts of (??) implies that
$$\dot{X}_1 = AX_1 \quad\text{and}\quad \dot{X}_2 = AX_2.$$

It follows from Lemma ?? that finding one complex-valued solution to a linear differential equation provides us with two real-valued solutions. Identity (??) implies that $X(t) = e^{\lambda t}v$ is a complex-valued solution to (??). Using Euler’s formula we compute the real and imaginary parts of $e^{\lambda t}v$ as follows:
$$e^{\lambda t}v = e^{\sigma t}\left(\cos(\tau t) + i\sin(\tau t)\right)(v_1 + iv_2) = e^{\sigma t}\left(\cos(\tau t)\,v_1 - \sin(\tau t)\,v_2\right) + i\,e^{\sigma t}\left(\sin(\tau t)\,v_1 + \cos(\tau t)\,v_2\right).$$

Since the real and imaginary parts of $e^{\lambda t}v$ are solutions to $\dot{X} = AX$, it follows that the real-valued functions $X_1(t)$ and $X_2(t)$ defined in (??) are indeed solutions.

Returning to the case where $A$ is a $2\times 2$ matrix, we see that if $v_1$ and $v_2$ are linearly independent, then Corollary ?? implies that (??) is the general solution to $\dot{X} = AX$. The linear independence of $v_1$ and $v_2$ is verified using the following lemma.

Lemma
Let $\lambda = \sigma + i\tau$ with $\tau \neq 0$ be a complex eigenvalue of the $2\times 2$ matrix $A$ with eigenvector $v = v_1 + iv_2$. Then $v_1$ and $v_2$ are linearly independent vectors in $\mathbb{R}^2$.

Proof
By assumption $Av = \lambda v$, that is,
$$A(v_1 + iv_2) = (\sigma + i\tau)(v_1 + iv_2) = (\sigma v_1 - \tau v_2) + i(\tau v_1 + \sigma v_2).$$
Equating real and imaginary parts of (??) leads to the system of equations
$$Av_1 = \sigma v_1 - \tau v_2 \quad\text{and}\quad Av_2 = \tau v_1 + \sigma v_2.$$
Note that if $v_1 = 0$, then $\tau v_2 = 0$ and hence $v_2 = 0$. Hence $v = 0$, contradicting the assumption that $v$ is an eigenvector. So $v_1 \neq 0$.

Note also that if $v_1$ and $v_2$ are linearly dependent, then $v_2 = cv_1$ for some real number $c$. It then follows from the previous equation that
$$Av_1 = (\sigma - c\tau)v_1.$$
Hence $v_1$ is a real eigenvector; but the eigenvalues of $A$ are not real and $A$ has no real eigenvectors. Therefore, $v_1$ and $v_2$ are linearly independent.
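Numerically, the lemma predicts that the real and imaginary parts of a computed complex eigenvector always span the plane. A quick check, using a hypothetical matrix of our own choosing:

```python
import numpy as np

# Hypothetical matrix with complex eigenvalues +/- 2i (not from the text).
A = np.array([[1.0, -5.0],
              [1.0, -1.0]])

lam, V = np.linalg.eig(A)
k = np.argmax(lam.imag)
v1, v2 = V[:, k].real, V[:, k].imag

# v1 and v2 are linearly independent exactly when det[v1 v2] != 0.
print(np.linalg.det(np.column_stack([v1, v2])))   # nonzero, as the lemma asserts
```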

An Example with Complex Eigenvalues

Consider an example of an initial value problem for a linear system with complex eigenvalues. Let

and

The characteristic polynomial for the matrix is , whose roots are and , so . An eigenvector corresponding to the eigenvalue is . It follows from (??) that are solutions to (??) and that is the general solution to (??). To solve the initial value problem we need to find $\alpha_1$ and $\alpha_2$ such that , that is, . Therefore and , and the solution is .
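Since the matrix and initial vector of this example are not reproduced above, here is the same procedure coded for a hypothetical system; the steps (find $\sigma$, $\tau$, $v_1$, $v_2$, then solve for $\alpha_1$, $\alpha_2$) mirror the calculation in the text.

```python
import numpy as np

# Hypothetical stand-ins for the example's data: eigenvalues +/- i here.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
X0 = np.array([2.0, -1.0])

lam, V = np.linalg.eig(A)
k = np.argmax(lam.imag)
sigma, tau = lam[k].real, lam[k].imag
v1, v2 = V[:, k].real, V[:, k].imag

# X(0) = alpha1*v1 + alpha2*v2, a 2x2 real linear system for the alphas.
alpha = np.linalg.solve(np.column_stack([v1, v2]), X0)

def X(t):
    X1 = np.exp(sigma*t) * (np.cos(tau*t)*v1 - np.sin(tau*t)*v2)
    X2 = np.exp(sigma*t) * (np.sin(tau*t)*v1 + np.cos(tau*t)*v2)
    return alpha[0]*X1 + alpha[1]*X2

assert np.allclose(X(0.0), X0)
```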

Real and Equal Eigenvalues

There are two types of matrices that have real and equal eigenvalues: those that are scalar multiples of the identity and those that are not. An example of a matrix that has real and equal eigenvalues is

The characteristic polynomial of this matrix is . Thus its eigenvalues both equal .
Only One Linearly Independent Eigenvector

An important fact about the matrix in (??) is that it has only one linearly independent eigenvector. To verify this fact, solve the system of linear equations . In matrix form this equation is . A quick calculation shows that all solutions are multiples of .
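The dimension count behind this verification can be checked mechanically: the eigenspace of $\lambda$ is the null space of $A - \lambda I_2$, so its dimension is $2$ minus the rank of that matrix. A sketch with a hypothetical matrix in place of the one in (??):

```python
import numpy as np

# Hypothetical matrix with double eigenvalue 2 that is not a multiple of
# the identity (standing in for the matrix in (??), which is not shown here).
lam = 2.0
A = np.array([[lam, 1.0],
              [0.0, lam]])

B = A - lam*np.eye(2)
print(2 - np.linalg.matrix_rank(B))   # prints 1: one independent eigenvector
```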

In fact, this observation is valid for any matrix that has equal eigenvalues and is not a scalar multiple of the identity, as the next lemma shows.

Lemma
Let $A$ be a $2\times 2$ matrix both of whose eigenvalues equal $\lambda$. If $A$ has two linearly independent eigenvectors, then $A = \lambda I_2$ is a scalar multiple of the identity.

Proof
Let $v_1$ and $v_2$ be two linearly independent eigenvectors of $A$, that is, $Av_j = \lambda v_j$ for $j = 1, 2$. It follows from linearity that $Av = \lambda v$ for any linear combination $v = \alpha_1 v_1 + \alpha_2 v_2$. Since $v_1$ and $v_2$ are linearly independent and $\dim\mathbb{R}^2 = 2$, it follows that $\{v_1, v_2\}$ is a basis of $\mathbb{R}^2$. Thus, every vector $v \in \mathbb{R}^2$ is a linear combination of $v_1$ and $v_2$, and hence $Av = \lambda v$ for every $v$. Therefore, $A = \lambda I_2$ is $\lambda$ times the identity matrix.
Generalized Eigenvectors

Suppose that $A$ has exactly one linearly independent real eigenvector $v$ with real eigenvalue $\lambda$. We call $w$ a generalized eigenvector of $A$ if it satisfies the system of linear equations
$$(A - \lambda I_2)w = v.$$
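Computationally, this definition translates directly: $w$ is any solution of the singular but consistent linear system $(A - \lambda I_2)w = v$, which a least-squares solver handles well. A minimal numpy sketch, with a hypothetical matrix rather than the one in (??):

```python
import numpy as np

# Hypothetical matrix with double eigenvalue 2 (not the text's matrix).
lam = 2.0
A = np.array([[lam, 1.0],
              [0.0, lam]])
v = np.array([1.0, 0.0])                 # eigenvector: (A - lam*I) v = 0

# (A - lam*I) w = v is singular but consistent; lstsq returns one solution.
B = A - lam*np.eye(2)
w, *_ = np.linalg.lstsq(B, v, rcond=None)

assert np.allclose(B @ w, v)                                # generalized eigenvector
assert np.linalg.matrix_rank(np.column_stack([v, w])) == 2  # independent of v
```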

The matrix in (??) has a generalized eigenvector. To verify this point, solve the linear system (??) for $w$. Note that for this matrix, $v$ and $w$ are linearly independent. The next lemma shows that this observation about generalized eigenvectors is always valid.

Lemma
Let $A$ be a $2\times 2$ matrix both of whose eigenvalues equal $\lambda$ and which has exactly one linearly independent eigenvector $v$. Then (a) any generalized eigenvector $w$ of $A$ is linearly independent of $v$, and (b) $A$ has a generalized eigenvector.

Proof
(a) If $v$ and $w$ were linearly dependent, then $w$ would be a multiple of $v$ and hence an eigenvector of $A$. But $A - \lambda I_2$ applied to an eigenvector is zero, whereas $(A - \lambda I_2)w = v$ is nonzero, which is a contradiction. Therefore, $v$ and $w$ are linearly independent.

(b) Let $w$ be any vector that is linearly independent of the eigenvector $v$. It follows that $\{v, w\}$ is a basis for $\mathbb{R}^2$; hence
$$Aw = \alpha v + \beta w$$
for some scalars $\alpha$ and $\beta$. If $\alpha = 0$, then $w$ is an eigenvector of $A$, contradicting the assumption that $A$ has only one linearly independent eigenvector. Therefore, $\alpha \neq 0$.

We claim that $\beta = \lambda$, and we prove the claim by showing that $\beta$ is an eigenvalue of $A$. Hence $\beta$ must equal $\lambda$, since both eigenvalues of $A$ equal $\lambda$. To see that $\beta$ is an eigenvalue, define the nonzero vector $u = \alpha v + (\beta - \lambda)w$ and compute
$$Au = \alpha\lambda v + (\beta - \lambda)(\alpha v + \beta w) = \beta\alpha v + \beta(\beta - \lambda)w = \beta u.$$
So $u$ is an eigenvector of $A$ with eigenvalue $\beta$. It now follows from (??) that
$$(A - \lambda I_2)w = \alpha v.$$
Therefore, $\frac{1}{\alpha}w$ is a generalized eigenvector of $A$.

Independent Solutions to Differential Equations with Equal Eigenvalues

In the case of equal eigenvalues and one linearly independent eigenvector, we claim that the general solution to $\dot{X} = AX$ is:
$$X(t) = e^{\lambda t}\left(\alpha_1 v + \alpha_2(w + tv)\right),$$
where $v$ is an eigenvector of $A$ and $w$ is the generalized eigenvector.

Since $v$ is an eigenvector of $A$ with eigenvalue $\lambda$, the function $X_1(t) = e^{\lambda t}v$ is a solution to $\dot{X} = AX$. Suppose that we can show that $X_2(t) = e^{\lambda t}(w + tv)$ is also a solution to $\dot{X} = AX$. Then (??) is the general solution, since $v$ and $w$ are linearly independent by Lemma ??(a). Apply Theorem ??.

To verify that $X_2(t)$ is a solution (that is, that $\dot{X}_2 = AX_2$), calculate
$$\dot{X}_2(t) = e^{\lambda t}\left(\lambda(w + tv) + v\right)$$
and, using (??),
$$AX_2(t) = e^{\lambda t}(Aw + tAv) = e^{\lambda t}\left((v + \lambda w) + t\lambda v\right) = e^{\lambda t}\left(\lambda(w + tv) + v\right) = \dot{X}_2(t).$$
Note that $X(0) = \alpha_1 v + \alpha_2 w$, so $\alpha_1$ and $\alpha_2$ are found by solving the linear system $X_0 = \alpha_1 v + \alpha_2 w$.
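This verification can also be done symbolically. In coordinates adapted to $v$ and $w$, the relations $Av = \lambda v$ and $Aw = v + \lambda w$ say that $A$ acts as the matrix $\begin{pmatrix}\lambda & 1\\ 0 & \lambda\end{pmatrix}$, so a generic sympy check looks like this (sympy is an assumption of this aside):

```python
import sympy as sp

t, lam = sp.symbols('t lambda', real=True)

# In coordinates adapted to (v, w), A*v = lam*v and A*w = v + lam*w give:
A = sp.Matrix([[lam, 1],
               [0, lam]])
v = sp.Matrix([1, 0])
w = sp.Matrix([0, 1])

X2 = sp.exp(lam*t) * (w + t*v)
# X2 satisfies X2' = A*X2, confirming that e^(lam*t)(w + t*v) is a solution.
assert sp.simplify(sp.diff(X2, t) - A*X2) == sp.zeros(2, 1)
```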

An Example with Equal Eigenvalues

Consider the system of differential equations

with initial value . The characteristic polynomial for the matrix is , so is an eigenvalue of multiplicity two. Since the matrix is not a multiple of the identity, it must have precisely one linearly independent eigenvector . This eigenvector is found by solving the equation for . To find the generalized eigenvector , we solve the system of linear equations by row reducing the augmented matrix, obtaining

We may now apply (??) to find the general solution to (??). We solve the initial value problem by solving for $\alpha_1$ and $\alpha_2$. So the closed form solution to this initial value problem is .
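For comparison, the whole equal-eigenvalue computation can be automated. The data below are hypothetical stand-ins, since the example's matrix and initial vector are not reproduced above.

```python
import numpy as np

# Hypothetical system with double eigenvalue 2 and a hypothetical X(0).
lam = 2.0
A = np.array([[lam, 1.0],
              [0.0, lam]])
X0 = np.array([3.0, -1.0])

B = A - lam*np.eye(2)
v = np.array([1.0, 0.0])                    # eigenvector
w, *_ = np.linalg.lstsq(B, v, rcond=None)   # generalized eigenvector

# X(0) = alpha1*v + alpha2*w, so solve this 2x2 system for the alphas.
alpha = np.linalg.solve(np.column_stack([v, w]), X0)

def X(t):
    # General solution e^(lam*t)*(alpha1*v + alpha2*(w + t*v)).
    return np.exp(lam*t) * (alpha[0]*v + alpha[1]*(w + t*v))

assert np.allclose(X(0.0), X0)
```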

There is a simpler method for finding this solution, one that does not require solving for either the eigenvector or the generalized eigenvector; we discuss this method later. See Section ??.

Exercises

Justify Euler’s formula (??) as follows. Recall the Taylor series
$$e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots,$$
$$\cos\theta = 1 - \frac{\theta^2}{2!} + \frac{\theta^4}{4!} - \cdots,$$
$$\sin\theta = \theta - \frac{\theta^3}{3!} + \frac{\theta^5}{5!} - \cdots.$$
Now evaluate the Taylor series of $e^{i\theta}$ and separate it into real and imaginary parts.
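For readers who want to see the series manipulation carried out by machine, here is an optional sympy sketch of the exercise's idea (sympy is our addition, not part of the exercise): expand both sides as Taylor polynomials and compare.

```python
import sympy as sp

theta = sp.symbols('theta', real=True)

# Taylor-expand e^(i*theta) and cos(theta) + i*sin(theta) to the same order.
n = 8
lhs = sp.exp(sp.I*theta).series(theta, 0, n).removeO()
rhs = (sp.cos(theta) + sp.I*sp.sin(theta)).series(theta, 0, n).removeO()
assert sp.expand(lhs - rhs) == 0   # the expansions agree term by term
```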

In modern language De Moivre’s formula states that
$$\left(e^{i\theta}\right)^n = e^{in\theta}, \quad\text{that is,}\quad (\cos\theta + i\sin\theta)^n = \cos(n\theta) + i\sin(n\theta).$$
In Exercises ?? – ?? use De Moivre’s formula coupled with Euler’s formula (??) to determine trigonometric identities for the given quantity in terms of , , , .

.
.

In Exercises ?? – ?? compute the general solution for the given system of differential equations.

.
.
.
.