The purpose of this appendix is to verify the inductive definition of determinant (??). We have already shown that if a determinant function exists, then it is unique. We also know that the determinant function exists for $1\times 1$ matrices. So we assume by induction that the determinant function exists for $(n-1)\times(n-1)$ matrices and prove that the inductive definition gives a determinant function for $n\times n$ matrices.

Recall that $A_{ij}$ is the cofactor matrix obtained from $A$ by deleting the $i$th row and $j$th column, so $A_{ij}$ is an $(n-1)\times(n-1)$ matrix. The inductive definition is:
$$D(A) = a_{11}\det(A_{11}) - a_{12}\det(A_{12}) + \cdots + (-1)^{n+1}a_{1n}\det(A_{1n}).$$
We use the notation $D(A)$ to remind us that we have not yet verified that this definition satisfies properties (a)-(c) of Definition ??. In this appendix we verify these properties after assuming that the inductive definition satisfies properties (a)-(c) for $(n-1)\times(n-1)$ matrices. For emphasis, we use the notation $\det$ to indicate the determinant of square matrices of size less than $n$.
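Because the definition is recursive, it is easy to mirror in code. The following minimal Python sketch (the names `D` and `cofactor` and the sample matrix are illustrative assumptions, not from the text) implements the expansion of $D(A)$ along the first row:

```python
def cofactor(A, j):
    """Return A_{1j}: delete the first row and the jth column (0-indexed)."""
    return [row[:j] + row[j + 1:] for row in A[1:]]

def D(A):
    """Inductive determinant: expand along the first row with alternating signs."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * D(cofactor(A, j)) for j in range(n))

A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 6]]
print(D(A))  # -> -11, agreeing with the usual 3x3 determinant
```

The recursion bottoms out at $1\times 1$ matrices; each larger size is reduced to cofactor determinants of size one less, exactly as in the inductive definition.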

Property (a) is easily verified for $D$, since if $A$ is lower triangular, then $a_{12} = \cdots = a_{1n} = 0$ and
$$D(A) = a_{11}\det(A_{11}) = a_{11}(a_{22}\cdots a_{nn})$$
by induction.
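As a numeric illustration of property (a), the inductive definition applied to a lower triangular matrix returns the product of its diagonal entries. This is a self-contained sketch; the function names and the sample matrix are illustrative, not from the text.

```python
def cofactor(A, j):
    return [row[:j] + row[j + 1:] for row in A[1:]]

def D(A):
    # Inductive determinant: expansion along the first row.
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * D(cofactor(A, j)) for j in range(len(A)))

L = [[2, 0, 0],
     [7, 3, 0],
     [1, 5, 4]]
print(D(L))  # -> 24, the product 2 * 3 * 4 of the diagonal entries
```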

Before verifying that $D$ satisfies properties (b) and (c) of a determinant, we prove:

Lemma
Let $A$ be an $n\times n$ matrix and let $E$ be an $n\times n$ elementary row matrix. Then
$$D(EA) = D(E)D(A). \qquad (??)$$

Proof
We verify (??) for each of the three types of elementary row operations.

(I) Suppose that $E$ multiplies the $k$th row by a nonzero scalar $c$. If $k \ne 1$, then each cofactor matrix $(EA)_{1j}$ is obtained from the cofactor matrix $A_{1j}$ by multiplying its $(k-1)$st row by $c$. By induction, $\det\big((EA)_{1j}\big) = c\det(A_{1j})$ and $D(EA) = cD(A)$. On the other hand, $D(E) = c$ by property (a), since $E$ is diagonal. So (??) is verified in this instance. If $k = 1$, then the $1$st row of $EA$ is $c$ times the $1$st row of $A$, from which it is easy to verify (??).
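The type (I) case of the lemma can be spot-checked numerically. This sketch (the helper names and the sample matrix are illustrative assumptions) multiplies one row by a scalar and compares $D(EA)$ with $D(E)D(A)$:

```python
def cofactor(A, j):
    return [row[:j] + row[j + 1:] for row in A[1:]]

def D(A):
    # Inductive determinant: expansion along the first row.
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * D(cofactor(A, j)) for j in range(len(A)))

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2, 0], [3, 1, 4], [2, 5, 1]]
E = [[1, 0, 0], [0, 7, 0], [0, 0, 1]]   # type (I): multiply row 2 by c = 7
assert D(matmul(E, A)) == D(E) * D(A)   # the lemma, for this instance
print(D(E))  # -> 7, the scalar c, as property (a) predicts
```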

(II) Next suppose that $E$ adds $c$ times the $k$th row to the $l$th row, where $k \ne l$. We note that $D(E) = 1$. When $l \ne 1$ the first row of $E$ is $e_1^t$, so $D(E) = \det(E_{11}) = 1$ by induction, since $E_{11}$ is either the identity matrix or an elementary row matrix of the same type. When $l = 1$ then $D(E) = \det(E_{11}) + (-1)^{1+k}c\det(E_{1k}) = 1 + (-1)^{1+k}c\det(E_{1k})$. But $E_{1k}$ is upper triangular with a zero diagonal entry and $\det(E_{1k}) = 0$. Thus $D(E) = 1$. It remains to compute $D(EA)$.

If $k \ne 1$ and $l \ne 1$, then the first row of $EA$ equals the first row of $A$, and each cofactor matrix $(EA)_{1j}$ is obtained from $A_{1j}$ by a type (II) row operation; the result follows by induction.

If $l = 1$, then
$$D(EA) = D(A) + cD(B),$$
where $B$ is obtained from $A$ by replacing the $1$st row with the $k$th row, so that the $1$st and $k$th rows of $B$ are equal.

If $k = 1$, then
$$D(EA) = D(A) + cD(B),$$
where $B$ is obtained from $A$ by replacing the $l$th row with the $1$st row, so that the $1$st and $l$th rows of $B$ are equal.

The hardest part of this proof is a calculation that shows that if the $1$st and $k$th rows of a matrix are equal, then $D$ of that matrix is zero; granted this fact, $D(B) = 0$ in both of the preceding cases and $D(EA) = D(A) = D(E)D(A)$. By induction, we can swap the $k$th row with the $2$nd row: such a swap takes place inside each cofactor matrix, and by induction it changes only the sign of each cofactor determinant, hence only the sign of $D$. Hence we need only verify this fact when $k = 2$.

(III) Finally, suppose that $E$ is the matrix that swaps two rows.

As we saw earlier (??), $E$ is the product of four elementary matrices of types (I) and (II). It follows from the two previous cases that $D(E) = -1$ and $D(EA) = -D(A) = D(E)D(A)$.
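The factorization used in case (III) is concrete enough to check directly. The sketch below (an illustrative $3\times 3$ instance, swapping the first two rows) builds the swap matrix from three type (II) row additions and one type (I) scaling by $-1$:

```python
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

E1 = [[1, 1, 0], [0, 1, 0], [0, 0, 1]]   # row 1 += row 2   (type II)
E2 = [[1, 0, 0], [-1, 1, 0], [0, 0, 1]]  # row 2 -= row 1   (type II)
E3 = [[1, 1, 0], [0, 1, 0], [0, 0, 1]]   # row 1 += row 2   (type II)
E4 = [[1, 0, 0], [0, -1, 0], [0, 0, 1]]  # row 2 *= -1      (type I)

E = matmul(E4, matmul(E3, matmul(E2, E1)))
print(E)  # -> [[0, 1, 0], [1, 0, 0], [0, 0, 1]], the row-swap matrix
```

Since the three type (II) factors each have $D = 1$ and the type (I) factor has $D = -1$, the lemma gives $D(E) = -1$.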

We now verify that when the $1$st and $2$nd rows of an $n\times n$ matrix $A$ are equal, then $D(A) = 0$. This is a tedious calculation that requires some facility with indices and summations. Rather than do this proof for general $n$, we present the proof for $n = 4$. This case contains all of the ideas of the general proof.

We begin with the definition of $D(A)$:
$$D(A) = a_{11}\det(A_{11}) - a_{12}\det(A_{12}) + a_{13}\det(A_{13}) - a_{14}\det(A_{14}).$$

Next we expand each of the four $3\times 3$ cofactor matrices along their first rows, obtaining
$$\begin{aligned}
\det(A_{11}) &= a_{22}\det\begin{pmatrix} a_{33} & a_{34}\\ a_{43} & a_{44}\end{pmatrix} - a_{23}\det\begin{pmatrix} a_{32} & a_{34}\\ a_{42} & a_{44}\end{pmatrix} + a_{24}\det\begin{pmatrix} a_{32} & a_{33}\\ a_{42} & a_{43}\end{pmatrix}\\
\det(A_{12}) &= a_{21}\det\begin{pmatrix} a_{33} & a_{34}\\ a_{43} & a_{44}\end{pmatrix} - a_{23}\det\begin{pmatrix} a_{31} & a_{34}\\ a_{41} & a_{44}\end{pmatrix} + a_{24}\det\begin{pmatrix} a_{31} & a_{33}\\ a_{41} & a_{43}\end{pmatrix}\\
\det(A_{13}) &= a_{21}\det\begin{pmatrix} a_{32} & a_{34}\\ a_{42} & a_{44}\end{pmatrix} - a_{22}\det\begin{pmatrix} a_{31} & a_{34}\\ a_{41} & a_{44}\end{pmatrix} + a_{24}\det\begin{pmatrix} a_{31} & a_{32}\\ a_{41} & a_{42}\end{pmatrix}\\
\det(A_{14}) &= a_{21}\det\begin{pmatrix} a_{32} & a_{33}\\ a_{42} & a_{43}\end{pmatrix} - a_{22}\det\begin{pmatrix} a_{31} & a_{33}\\ a_{41} & a_{43}\end{pmatrix} + a_{23}\det\begin{pmatrix} a_{31} & a_{32}\\ a_{41} & a_{42}\end{pmatrix}
\end{aligned}$$
Combining the determinants leads to:
$$\begin{aligned}
D(A) ={}& (a_{11}a_{22} - a_{12}a_{21})\det\begin{pmatrix} a_{33} & a_{34}\\ a_{43} & a_{44}\end{pmatrix} - (a_{11}a_{23} - a_{13}a_{21})\det\begin{pmatrix} a_{32} & a_{34}\\ a_{42} & a_{44}\end{pmatrix}\\
&+ (a_{11}a_{24} - a_{14}a_{21})\det\begin{pmatrix} a_{32} & a_{33}\\ a_{42} & a_{43}\end{pmatrix} + (a_{12}a_{23} - a_{13}a_{22})\det\begin{pmatrix} a_{31} & a_{34}\\ a_{41} & a_{44}\end{pmatrix}\\
&- (a_{12}a_{24} - a_{14}a_{22})\det\begin{pmatrix} a_{31} & a_{33}\\ a_{41} & a_{43}\end{pmatrix} + (a_{13}a_{24} - a_{14}a_{23})\det\begin{pmatrix} a_{31} & a_{32}\\ a_{41} & a_{42}\end{pmatrix}
\end{aligned}$$

Supposing that the $1$st and $2$nd rows of $A$ are equal, that is, $a_{2j} = a_{1j}$ for $j = 1, \dots, 4$, each of the six coefficients $a_{1p}a_{2q} - a_{1q}a_{2p}$ vanishes, and it is now easy to check that $D(A) = 0$.
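The cancellation above can be confirmed numerically. In this self-contained sketch (function names and matrix entries are illustrative), a $4\times 4$ matrix whose first two rows agree yields $D(A) = 0$:

```python
def cofactor(A, j):
    return [row[:j] + row[j + 1:] for row in A[1:]]

def D(A):
    # Inductive determinant: expansion along the first row.
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * D(cofactor(A, j)) for j in range(len(A)))

A = [[1, 2, 3, 4],
     [1, 2, 3, 4],     # second row equals the first
     [5, 0, 6, 1],
     [2, 7, 8, 9]]
print(D(A))  # -> 0, as the cancellation of the six coefficients predicts
```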

We now return to verifying that $D$ satisfies properties (b) and (c) of being a determinant. We begin by showing that $D(A) = 0$ if $A$ has a row that is identically zero. Suppose that the zero row is the $k$th row and let $E$ be the matrix that multiplies the $k$th row of $A$ by $c \ne 0$. Then $EA = A$. Using (??) we see that $D(A) = D(EA) = D(E)D(A) = cD(A)$, which implies that $D(A) = 0$ since $c$ is arbitrary.

Next we prove that $D(A) = 0$ when $A$ is singular. Using row reduction we can write $A = E_1\cdots E_m R$, where the $E_j$ are elementary row matrices and $R$ is in reduced echelon form. Since $A$ is singular, the last row of $R$ is identically zero. Hence $D(R) = 0$, and (??) implies that $D(A) = D(E_1)\cdots D(E_m)D(R) = 0$.
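The conclusion that $D$ vanishes on singular matrices can also be checked on an example. Below (with illustrative values) the third row is the sum of the first two, so the matrix is singular:

```python
def cofactor(A, j):
    return [row[:j] + row[j + 1:] for row in A[1:]]

def D(A):
    # Inductive determinant: expansion along the first row.
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * D(cofactor(A, j)) for j in range(len(A)))

A = [[1, 2, 3],
     [4, 5, 6],
     [5, 7, 9]]   # row 3 = row 1 + row 2, so A is singular
print(D(A))  # -> 0
```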

We now verify property (b): $D(A^t) = D(A)$. Suppose that $A$ is singular; we show that $D(A^t) = 0 = D(A)$. Since the row rank of $A$ equals the column rank of $A$, it follows that $A^t$ is singular when $A$ is singular. Next assume that $A$ is nonsingular. Then $A$ is row equivalent to $I_n$ and we can write
$$A = E_1 \cdots E_m,$$
where the $E_j$ are elementary row matrices. Since $D(E_j^t) = D(E_j)$ for each elementary type and $A^t = E_m^t\cdots E_1^t$, property (b) follows.

We now verify property (c): $D(AB) = D(A)D(B)$. Recall that $B$ is singular if and only if there exists a nonzero vector $x$ such that $Bx = 0$. Now if $B$ is singular, then so is $AB$. To verify this point, let $x$ be a nonzero vector such that $Bx = 0$. Then $(AB)x = A(Bx) = 0$. Thus $AB$ is singular, and $D(AB) = 0 = D(A)D(B)$ when $B$ is singular. Suppose now that $B$ is nonsingular. It follows that we can write $B = E_1\cdots E_m$, where the $E_j$ are elementary row matrices, and hence $B^t = E_m^t\cdots E_1^t$. Using (??) and property (b) we see that
$$D(AB) = D\big((AB)^t\big) = D(B^tA^t) = D(E_m^t)\cdots D(E_1^t)D(A^t) = D(B)D(A),$$
as desired. We have now completed the proof that a determinant function exists.
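Finally, properties (b) and (c) can be spot-checked together on a small example (illustrative matrices; a check on one example is of course not a proof):

```python
def cofactor(M, j):
    return [row[:j] + row[j + 1:] for row in M[1:]]

def D(M):
    # Inductive determinant: expansion along the first row.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * D(cofactor(M, j)) for j in range(len(M)))

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(M):
    return [list(col) for col in zip(*M)]

A = [[1, 2, 0], [3, 1, 4], [2, 5, 1]]
B = [[0, 1, 2], [1, 0, 3], [4, 1, 1]]
assert D(transpose(A)) == D(A)          # property (b)
assert D(matmul(A, B)) == D(A) * D(B)   # property (c)
print("properties (b) and (c) hold on this example")
```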