We define the determinant of a square matrix in terms of cofactor expansion along the first column, and show that this definition is equivalent to the definition in terms of cofactor expansion along the first row.

DET-0020: Definition of the Determinant – Expansion Along the First Column

In DET-0010 we described the determinant as a function that assigns a scalar to every square matrix. The value of the function is given by cofactor expansion along the first row of the matrix. In this module we will mimic this process, but expand along the first column instead of the first row. Surprisingly, our new approach will yield the same result as the original definition. We will conclude this module by proving that the two expansions produce the same result for any square matrix. This will allow us to state an alternative definition of the determinant in terms of cofactor expansion along the first column.

We will begin by revisiting Examples ex:threebythreedet1 and ex:expansiontoprow from DET-0010.

Recall the matrix from Example ex:threebythreedet1, whose determinant we found by cofactor expansion along the first row. Let's try to mimic what we did earlier, but instead of doing cofactor expansion along the first row, we will do the expansion along the first column. We will form each minor matrix by picking an entry from the first column, deleting the first column, and deleting the row of the chosen entry. We will also follow the same alternating sign pattern as before.
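Since the entries of the example matrix are not reproduced here, the display below records the pattern for a generic $3\times 3$ matrix $A=[a_{ij}]$: the entries of the first column carry the alternating signs $+,-,+$, and each minor is formed by deleting the first column and the row of the chosen entry. The first-column expansion has the form
\[
a_{11}\det\begin{bmatrix} a_{22} & a_{23} \\ a_{32} & a_{33} \end{bmatrix}
\;-\; a_{21}\det\begin{bmatrix} a_{12} & a_{13} \\ a_{32} & a_{33} \end{bmatrix}
\;+\; a_{31}\det\begin{bmatrix} a_{12} & a_{13} \\ a_{22} & a_{23} \end{bmatrix}.
\]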
Let’s go through this process again for a larger matrix.
Recall the matrix from Example ex:expansiontoprow, whose determinant we found by expanding along the first row. We will now try to expand along the first column.
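The same mechanical process extends to square matrices of any size. The following Python sketch is not part of the original module; the helper names minor, det_first_row, and det_first_column are illustrative, and the sample matrix is arbitrary rather than the matrix from the example. It computes a determinant recursively by cofactor expansion along the first column and compares the result with expansion along the first row.

\begin{verbatim}
# Illustrative sketch: determinants by recursive cofactor expansion.
# A matrix is represented as a list of row lists.

def minor(A, i, j):
    # Delete row i and column j (0-indexed) from the square matrix A.
    return [row[:j] + row[j + 1:] for r, row in enumerate(A) if r != i]

def det_first_row(A):
    # Cofactor expansion along the first row (the original definition).
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det_first_row(minor(A, 0, j))
               for j in range(len(A)))

def det_first_column(A):
    # Cofactor expansion along the first column (the new approach).
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** i * A[i][0] * det_first_column(minor(A, i, 0))
               for i in range(len(A)))

# A sample 4x4 matrix (values chosen arbitrarily).
A = [[2, 1, 0, 3],
     [4, 0, 5, 1],
     [1, 2, 3, 0],
     [0, 1, 4, 2]]

print(det_first_row(A), det_first_column(A))  # both print the same value
\end{verbatim}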

When computing determinants of the four matrices below, try different approaches. You might want to expand along the first row for some of them, and along the first column for others. Looking at where the zeros are located will help you decide what to try.
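For instance (a hypothetical illustration, not one of the four matrices referenced above), a matrix whose first column contains two zeros is well suited to expansion along the first column, since two of the three cofactors are multiplied by zero:
\[
\det\begin{bmatrix} 5 & 1 & 2 \\ 0 & 3 & 4 \\ 0 & 6 & 7 \end{bmatrix}
= 5\det\begin{bmatrix} 3 & 4 \\ 6 & 7 \end{bmatrix} - 0 + 0
= 5(3\cdot 7 - 4\cdot 6) = -15.
\]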

According to our current definition (Definition def:toprowexpansion of DET-0010), we compute the determinant by doing cofactor expansion along the first row, as follows:

Let $A$ be an $n\times n$ matrix. Define the determinant of $A$ by
\[
\det A = a_{11}C_{11} + a_{12}C_{12} + \cdots + a_{1n}C_{1n}
\]
or
\[
\det A = \sum_{j=1}^{n} a_{1j}C_{1j}.
\]

This definition uses minor matrix ($M_{1j}$) and cofactor ($C_{1j}$) notation. Let's take a look at how this notation can be adapted to expansion along the first column.

Let $A$ be an $n\times n$ matrix. Define $M_{i1}$ to be the $(n-1)\times(n-1)$ matrix obtained from $A$ by deleting the first column and the $i$th row of $A$. We say that $M_{i1}$ is the $(i,1)$-minor of $A$.

Define $C_{i1}=(-1)^{i+1}\det(M_{i1})$ to be the $(i,1)$-cofactor of $A$.
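As a quick illustration of the notation (for a generic $4\times 4$ matrix, not a specific example from the module), deleting the first column and the second row of $A$ produces $M_{21}$, and the corresponding cofactor carries the sign $(-1)^{2+1}=-1$:
\[
A=\begin{bmatrix}
a_{11} & a_{12} & a_{13} & a_{14}\\
a_{21} & a_{22} & a_{23} & a_{24}\\
a_{31} & a_{32} & a_{33} & a_{34}\\
a_{41} & a_{42} & a_{43} & a_{44}
\end{bmatrix},
\qquad
M_{21}=\begin{bmatrix}
a_{12} & a_{13} & a_{14}\\
a_{32} & a_{33} & a_{34}\\
a_{42} & a_{43} & a_{44}
\end{bmatrix},
\qquad
C_{21}=(-1)^{2+1}\det(M_{21})=-\det(M_{21}).
\]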

We are now ready to propose an alternative definition of the determinant in terms of cofactor expansion along the first column. Keep in mind that at this point we have not proved that the two definitions will always produce the same result. We will prove that the two definitions are equivalent in the next section.
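Written out with the cofactors just defined, the proposed first-column expansion is
\[
\det A = a_{11}C_{11} + a_{21}C_{21} + \cdots + a_{n1}C_{n1} = \sum_{i=1}^{n} a_{i1}C_{i1}.
\]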

Proof of Definition Equivalence

We will now show that cofactor expansion along the first row produces the same result as cofactor expansion along the first column.
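In symbols, the claim is that for every $n\times n$ matrix $A$,
\[
a_{11}C_{11} + a_{12}C_{12} + \cdots + a_{1n}C_{1n}
\;=\;
a_{11}C_{11} + a_{21}C_{21} + \cdots + a_{n1}C_{n1},
\]
where the left-hand side is the expansion along the first row and the right-hand side is the expansion along the first column.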

Proof
We will proceed by induction on $n$. Clearly, the result holds for $2\times 2$ matrices. Just for practice, you should also verify the equality for $3\times 3$ matrices. (See Practice Problem prob:extrainductionsteps.) We will assume that the result holds for $(n-1)\times(n-1)$ matrices and show that it must hold for $n\times n$ matrices.

You will find the following generic $n\times n$ matrix a useful reference as we proceed.
\[
A=\begin{bmatrix}
a_{11} & a_{12} & \cdots & a_{1j} & \cdots & a_{1n}\\
a_{21} & a_{22} & \cdots & a_{2j} & \cdots & a_{2n}\\
\vdots & \vdots &        & \vdots &        & \vdots\\
a_{i1} & a_{i2} & \cdots & a_{ij} & \cdots & a_{in}\\
\vdots & \vdots &        & \vdots &        & \vdots\\
a_{n1} & a_{n2} & \cdots & a_{nj} & \cdots & a_{nn}
\end{bmatrix}
\]

For convenience, we will refer to the Right Hand Side (RHS) and the Left Hand Side (LHS) of the equality we are trying to prove.

Note that the first term, $a_{11}C_{11}$, is the same for LHS and RHS, so we will only need to consider the remaining terms, those involving the entries $a_{i1}$ and $a_{1j}$ with $i,j\ge 2$.

We will start by analyzing RHS. Consider an arbitrary entry $a_{i1}$ (with $i\ge 2$) of the first column. This entry will only appear in the term $a_{i1}C_{i1}=(-1)^{i+1}a_{i1}\det(M_{i1})$. We will find $\det(M_{i1})$ by cofactor expansion along the first row. As we proceed, we have to pay special attention to the subscripts. Because the first column of $A$ was removed, the $(j-1)$st column of $M_{i1}$ contains the $j$th column of $A$.

Note that the entry $a_{1j}$ (with $j\ge 2$) will only appear in the term
\[
(-1)^{1+(j-1)}a_{1j}\det(B_{ij}),
\]
where $B_{ij}$ denotes the $(n-2)\times(n-2)$ matrix obtained from $A$ by deleting rows $1$ and $i$ and columns $1$ and $j$. So, after we distribute $(-1)^{i+1}a_{i1}$, RHS will contain only one term of the form
\[
(-1)^{i+1}(-1)^{j}\,a_{i1}a_{1j}\det(B_{ij}).
\]
We will perform a similar analysis on LHS. Consider an arbitrary entry $a_{1j}$ (with $j\ge 2$) of the first row. This entry will only appear in the term $a_{1j}C_{1j}=(-1)^{1+j}a_{1j}\det(M_{1j})$. Invoking the induction hypothesis, we will find $\det(M_{1j})$ by cofactor expansion along the first column. This time, because the first row of $A$ was removed, the $(i-1)$st row of $M_{1j}$ contains the $i$th row of $A$.

The entry $a_{i1}$ will only appear in the term
\[
(-1)^{(i-1)+1}a_{i1}\det(B'_{ij}),
\]
where $B'_{ij}$ denotes the $(n-2)\times(n-2)$ matrix obtained from $A$ by deleting rows $1$ and $i$ and columns $1$ and $j$. So, after we distribute $(-1)^{1+j}a_{1j}$, LHS will contain only one term of the form
\[
(-1)^{1+j}(-1)^{i}\,a_{1j}a_{i1}\det(B'_{ij}).
\]
But RHS also has only one term of this form. We now need to show that these two terms are equal. The two terms are
\[
(-1)^{i+1}(-1)^{j}\,a_{i1}a_{1j}\det(B_{ij}) = (-1)^{i+j+1}\,a_{i1}a_{1j}\det(B_{ij})
\]
and
\[
(-1)^{1+j}(-1)^{i}\,a_{1j}a_{i1}\det(B'_{ij}) = (-1)^{i+j+1}\,a_{i1}a_{1j}\det(B'_{ij}).
\]
Observe that $B_{ij}$ and $B'_{ij}$ are the same matrix, because both were obtained from the matrix $A$ by deleting the first and the $i$th rows of $A$, and the first and the $j$th columns of $A$. Therefore $\det(B_{ij})=\det(B'_{ij})$, and the two terms are equal.

We conclude that the terms of LHS and RHS match. This establishes the desired equality.

Now we know that cofactor expansion along the first row and cofactor expansion along the first column produce the same result, so either expansion can be used to find the determinant.

Practice Problems

Compute the determinant of each matrix by cofactor expansion along the first row and by cofactor expansion along the first column. Compare the results.
Answer:
Answer:
Let Find such that .

Answer:

Let
(a)
Find $\det A$ and $\det A^T$.

Answer:

(b)
Formulate a conjecture about the relationship between the determinant of a matrix and the determinant of its transpose.
(c)
Prove your conjecture.
A matrix is called upper triangular if all of its entries below the main diagonal are 0. A matrix is called lower triangular if all of its entries above the main diagonal are 0. Upper and lower triangular matrices are collectively referred to as triangular matrices.

Use mathematical induction to prove that the determinant of a triangular matrix is equal to the product of its diagonal entries.

(Note: We will make use of this property extensively in Module DET-0030.)

Prove that Definition def:toprowexpansion of DET-0010 and Definition def:firstcolexpansion1 give the same result for the determinant of $2\times 2$ and $3\times 3$ matrices. (This problem is referenced in Theorem th:rowcolexpequivalence.)