
We summarize the properties of the determinant that we have already proved, and prove
that a matrix is singular if and only if its determinant is zero, that the determinant of a
product is the product of the determinants, and that the determinant of the transpose is
equal to the determinant of the matrix.

DET-0040: Properties of the Determinant

Summary of Results

We first defined the determinant of a matrix using cofactor expansion along the first
row. (Definition def:toprowexpansion of DET-0010) We then introduced an alternative definition in terms
of cofactor expansion along the first column and proved that the two definitions are
equivalent. (Definition def:firstcolexpansion1 of DET-0020) We also proved the following simple but useful
results:

(a)

The determinant of the identity matrix is equal to 1. (Lemma lemma:detofid)

(b)

If a matrix contains a row of zeros, then its determinant is equal to 0.
(Lemma lemma:det0lemma)

(c)

If two rows of a matrix are the same, then the determinant of the matrix
is equal to 0. (Lemma lemma:det0lemma)

(d)

If one row of a matrix is a scalar multiple of another row, then the
determinant of the matrix is equal to 0. (Lemma lemma:det0lemma)
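Facts (a) through (d) are easy to sanity-check numerically. Below is a minimal Python sketch (the function name `det` is my own) implementing cofactor expansion along the first row, as in the definition above:

```python
def det(M):
    """Determinant via cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j
        minor = [row[:j] + row[j+1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(det(I3))                                   # (a) det(I) = 1
print(det([[1, 2, 3], [0, 0, 0], [4, 5, 6]]))    # (b) row of zeros -> 0
print(det([[1, 2, 3], [1, 2, 3], [4, 5, 6]]))    # (c) two equal rows -> 0
print(det([[1, 2, 3], [2, 4, 6], [4, 5, 6]]))    # (d) proportional rows -> 0
```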

We also found that elementary row operations affect the determinant as
follows:

(a)

If $B$ is obtained from $A$ by interchanging two different rows, then
$$\det(B) = -\det(A)$$
(b)

If $B$ is obtained from $A$ by multiplying one of the rows of $A$ by a non-zero
constant $k$, then
$$\det(B) = k\det(A)$$

(c)

If $B$ is obtained from $A$ by adding a multiple of one row of $A$ to another row,
then
$$\det(B) = \det(A)$$
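The effect of each row operation can be checked on a small example. The sketch below (the names `det2`, `B_swap`, and so on are my own) uses the $2\times 2$ formula $\det = ad - bc$:

```python
def det2(M):
    """Determinant of a 2x2 matrix: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[1, 2], [3, 4]]                       # det(A) = -2

B_swap = [A[1], A[0]]                      # interchange the two rows
print(det2(B_swap) == -det2(A))            # True: det(B) = -det(A)

k = 5
B_scale = [[k * x for x in A[0]], A[1]]    # multiply row 1 by k
print(det2(B_scale) == k * det2(A))        # True: det(B) = k det(A)

# add 7 times row 1 to row 2
B_add = [A[0], [A[1][0] + 7 * A[0][0], A[1][1] + 7 * A[0][1]]]
print(det2(B_add) == det2(A))              # True: det(B) = det(A)
```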

In this module we will prove the following important results:

(a)

A square matrix is singular if and only if its determinant is equal to 0.

(b)

The determinant of a product is the product of the determinants.

(c)

The determinant of the transpose is equal to the determinant of the
matrix.

To get us started, we need the following lemma.

Let $A$ be a square matrix, and let $E$ be
an elementary matrix of the same size, then
$$\det(EA) = \det(E)\det(A)$$

Proof

Recall that if $E$ is obtained from the identity matrix $I$ using an elementary row operation,
then the same elementary row operation carries $A$ to $EA$. There are three types of
elementary row operations and three types of elementary matrices, so we will
have to consider three cases.

Case 1. Suppose $E$ is obtained from $I$ by interchanging two rows, then
$$\det(E) = -\det(I) = -1$$
so
$$\det(EA) = -\det(A) = \det(E)\det(A)$$

Case 2. Suppose $E$ is obtained from $I$ by multiplying one of the rows of $I$ by a
non-zero constant $k$, then
$$\det(E) = k\det(I) = k$$
so
$$\det(EA) = k\det(A) = \det(E)\det(A)$$

Case 3. Suppose $E$ is obtained from $I$ by adding a scalar multiple of one row to
another row, then
$$\det(E) = \det(I) = 1$$
so
$$\det(EA) = \det(A) = \det(E)\det(A)$$
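The three cases can be illustrated concretely with $2\times 2$ elementary matrices; the helper names below are my own:

```python
def det2(M):
    """Determinant of a 2x2 matrix: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul2(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][t] * Y[t][j] for t in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [5, 3]]          # det(A) = 1
E_swap  = [[0, 1], [1, 0]]    # Case 1: interchange the two rows
E_scale = [[4, 0], [0, 1]]    # Case 2: multiply row 1 by 4
E_add   = [[1, 0], [3, 1]]    # Case 3: add 3 times row 1 to row 2

for E in (E_swap, E_scale, E_add):
    assert det2(matmul2(E, A)) == det2(E) * det2(A)
print("det(EA) = det(E)det(A) holds in all three cases")
```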

Invertibility and the Determinant

Recall that we first introduced determinants in the context of invertibility of
matrices. Specifically, we found that $A$ is invertible if and only if $\det(A) \neq 0$. (A logically
equivalent statement is: $A$ is singular if and only if $\det(A) = 0$.) We are now in a position to
prove this result for all square matrices.

A square matrix $A$ is singular if and only if $\det(A) = 0$.

Proof

Let $A$ be a square matrix. To determine whether $A$ is singular we need to
find $\operatorname{rref}(A)$. In MAT-0060 we found that there exist elementary matrices $E_1, E_2, \ldots, E_k$ such that
$$\operatorname{rref}(A) = E_k\cdots E_2E_1A$$
so
$$\det(\operatorname{rref}(A)) = \det(E_k\cdots E_2E_1A)$$
By repeated application of Lemma lemma:detelemproduct, we find that
$$\det(\operatorname{rref}(A)) = \det(E_k)\cdots\det(E_2)\det(E_1)\det(A)$$
Suppose that $A$ is singular, then $\operatorname{rref}(A) \neq I$. But then $\operatorname{rref}(A)$ contains a row of zeros, and $\det(\operatorname{rref}(A)) = 0$. (Lemma
lemma:det0lemma) Since determinants of elementary matrices are non-zero, we conclude that $\det(A) = 0$.

Conversely, suppose $\det(A) = 0$, then
$$\det(\operatorname{rref}(A)) = \det(E_k)\cdots\det(E_2)\det(E_1)\det(A) = 0$$
But then $\operatorname{rref}(A) \neq I$, so $A$ is singular.

Determine whether the matrix $A$ is invertible without using elementary row
operations.

Compute the determinant of $A$. You will find that $\det(A) = 0$. By Theorem th:detofsingularmatrix we conclude that $A$ is
not invertible.
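A determinant test of this kind is easily scripted; the matrix below is a hypothetical stand-in whose third row is the sum of the first two, so its rows are linearly dependent:

```python
def det(M):
    """Determinant via cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] *
               det([row[:j] + row[j+1:] for row in M[1:]])
               for j in range(n))

# Hypothetical example: row 3 = row 1 + row 2
A = [[1, 2, 3], [4, 5, 6], [5, 7, 9]]
print(det(A))   # 0, so A is not invertible
```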

Determinant of a Product

Let $A$ and $B$ be square matrices of the same size, then
$$\det(AB) = \det(A)\det(B)$$
Proof

Suppose $A$ is invertible, then $A$ can be written as a product of elementary
matrices. (Theorem th:elemmatrices of MAT-0060)
$$A = E_1E_2\cdots E_k$$
Then, by repeated application of Lemma lemma:detelemproduct, we get
$$\det(AB) = \det(E_1E_2\cdots E_kB) = \det(E_1)\det(E_2)\cdots\det(E_k)\det(B) = \det(A)\det(B)$$

Now suppose that $A$ is not invertible. Then $AB$ is also not invertible. (If $AB$ had an inverse $C$, then $A(BC) = I$, so $A$ would be invertible.) So, $\det(A) = 0$ and $\det(AB) = 0$. Thus
$$\det(AB) = 0 = \det(A)\det(B)$$
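Both cases of the proof can be spot-checked numerically (the names below are my own):

```python
def det2(M):
    """Determinant of a 2x2 matrix: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul2(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][t] * Y[t][j] for t in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]    # invertible case: det(A) = -2
B = [[0, 5], [6, 7]]    # det(B) = -30
print(det2(matmul2(A, B)), det2(A) * det2(B))         # both equal 60

S = [[1, 2], [2, 4]]    # singular case: det(S) = 0
print(det2(matmul2(S, B)) == det2(S) * det2(B) == 0)  # True
```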

In Practice Problem prob:dettranspose of DET-0020 you proved that $\det\left(A^T\right) = \det(A)$. Your proof most likely relied on
the fact that cofactor expansion along the first row produces the same result as
cofactor expansion along the first column. (Theorem th:rowcolexpequivalence) We will now take another look
at this result and prove it without the assumption that the two cofactor
expansions produce the same outcome.

In this problem we will take a look
at the determinants of transposes of elementary matrices. Recall that an
elementary matrix is obtained from the identity matrix by means of one
elementary row operation. Consider the following examples of elementary
matrices $E_1$, $E_2$, and $E_3$.
On your own, write out the transpose of each matrix. You should observe that $E_1^T = E_1$ and
$E_2^T = E_2$.

Now consider
$$E_3 = \begin{bmatrix} 1 & 0 & 4 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
Clearly $E_3^T \neq E_3$, but what is really important is that $E_3^T$ is also an elementary matrix. While $E_3$
was obtained from the identity by adding 4 times the third row to the first row, $E_3^T$ was
obtained from the identity by adding 4 times the first row to the third row. By
Theorem th:elemrowopsanddetitem:addmultotherrowdet of DET-0030, we know that $\det\left(E_3^T\right) = \det(E_3) = 1$.

So, for all three matrices we have $\det\left(E_i^T\right) = \det(E_i)$.

We can generalize our observations in Exploration init:detoftranspose as follows:

(a)

If $E$ is an elementary matrix obtained from the identity by switching two
rows, then $E^T = E$.

(b)

If $E$ is an elementary matrix obtained from the identity by multiplying one
row by a non-zero constant, then $E^T = E$.

(c)

If $E$ is an elementary matrix obtained from the identity by adding a multiple
of one row to another row, then $\det\left(E^T\right) = \det(E)$.

We will now combine the three parts of our generalization into a lemma.

Let $E$ be an elementary matrix, then
$$\det\left(E^T\right) = \det(E)$$

Proof

We will need to consider three cases.

Case 1. Suppose $E$ is obtained from the identity by switching rows $i$ and $j$. Then $E$ is
the same as the identity matrix, except that the $(i, i)$ and $(j, j)$ entries of $E$ are zero, and $E$ has
a 1 in the $(i, j)$ and $(j, i)$ spots. When the transpose is taken, 1’s and 0’s along the diagonal
stay in place, while the $(i, j)$-entry becomes the $(j, i)$-entry and the $(j, i)$-entry becomes the
$(i, j)$-entry. This shows that $E^T = E$, and $\det\left(E^T\right) = \det(E)$.

Case 2. Suppose $E$ is obtained from the identity by multiplying one of the rows
by a non-zero constant. Observe that $E$ is a diagonal matrix, so $E^T = E$ and $\det\left(E^T\right) = \det(E)$.

Case 3. Suppose $E$ is obtained from the identity by adding $a$ times row $i$ to row
$j$. Then $E$ is the same as the identity matrix, except that the $(j, i)$-entry of $E$ is $a$. If
we take the transpose of $E$, then $a$ will become the $(i, j)$-entry of the transpose. This
means that $E^T$ can be obtained from the identity by adding $a$ times row $j$ to row $i$.
It follows that $\det\left(E^T\right) = 1 = \det(E)$.
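The three cases can again be checked with small $2\times 2$ examples (the helper names are my own):

```python
def det2(M):
    """Determinant of a 2x2 matrix: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def transpose(M):
    """Transpose of a matrix given as a list of rows."""
    return [list(col) for col in zip(*M)]

E_swap  = [[0, 1], [1, 0]]   # Case 1: symmetric, so E^T = E
E_scale = [[7, 0], [0, 1]]   # Case 2: diagonal, so E^T = E
E_add   = [[1, 0], [4, 1]]   # Case 3: E^T adds 4 times row 2 to row 1

print(transpose(E_swap) == E_swap)     # True
print(transpose(E_scale) == E_scale)   # True
print(transpose(E_add) == E_add)       # False, yet the determinants agree
for E in (E_swap, E_scale, E_add):
    assert det2(transpose(E)) == det2(E)
```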

We are now ready to prove the main result of this section.

Let $A$ be a square matrix, then
$$\det\left(A^T\right) = \det(A)$$
Proof

Suppose that $A$ is singular, then $A^T$ is also singular. (Theorem th:invprop of
MAT-0050 and Theorem th:transposeproperties of MAT-0025) Thus, $\det\left(A^T\right) = 0 = \det(A)$.
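The theorem can be spot-checked numerically with a singular and an invertible matrix (the `det` and `transpose` helpers are my own):

```python
def det(M):
    """Determinant via cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] *
               det([row[:j] + row[j+1:] for row in M[1:]])
               for j in range(n))

def transpose(M):
    """Transpose of a matrix given as a list of rows."""
    return [list(col) for col in zip(*M)]

S = [[1, 2, 3], [2, 4, 6], [0, 1, 5]]   # singular: row 2 = 2 * row 1
print(det(S), det(transpose(S)))         # 0 0

A = [[1, 2, 3], [0, 1, 4], [5, 6, 0]]   # invertible: det(A) = 1
print(det(A), det(transpose(A)))         # 1 1
```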