Row and column operations can be performed using matrix multiplication.

As we have seen, systems of equations—or equivalently matrix equations—are solved by i) forming the ACM associated with the set of equations and ii) applying row operations to the ACM until it is in reduced row echelon form.

It turns out these row operations can be realized by left multiplication by a certain type of matrix, and these matrices have uses beyond that of performing row operations. To explain how matrix multiplication comes into play, let us write $\varepsilon$ for a particular row operation on matrices, so that the result of applying the operation to a matrix $A$ is written $\varepsilon(A)$. It turns out that for any of the three types of row operations we have considered above, one has the identity
$$\varepsilon(A) \;=\; \varepsilon(I)\,A,$$
where $I$ denotes the identity matrix with as many rows as $A$. In other words, the row operation $\varepsilon$, applied to $A$, can be realized in terms of left multiplication by the matrix $\varepsilon(I)$ obtained by applying $\varepsilon$ to the identity matrix.
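For instance, in a small $2 \times 2$ illustration, let $\varepsilon$ be the Type III operation that adds $2$ times row $1$ to row $2$. Applying $\varepsilon$ to the identity gives
$$\varepsilon(I_2) \;=\; \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix},$$
and for an arbitrary $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ one checks directly that
$$\varepsilon(I_2)\,A \;=\; \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix}\!\begin{pmatrix} a & b \\ c & d \end{pmatrix} \;=\; \begin{pmatrix} a & b \\ c + 2a & d + 2b \end{pmatrix} \;=\; \varepsilon(A),$$
exactly as the identity predicts.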

As one would then expect, there is, for each row operation, a corresponding elementary matrix, obtained from the identity matrix of the appropriate size by applying that operation to it.

These matrices, and the notation used to denote them, are recorded in the following expanded version of the earlier table of row operations and their representations:

Type | What it does | Indicated by | Elementary matrix
Type I | Switches rows $i$ and $j$ | $R_i \leftrightarrow R_j$ | $E_{ij}$
Type II | Multiplies row $i$ by $c \neq 0$ | $R_i \to c\,R_i$ | $E_i(c)$
Type III | Adds $c$ times row $i$ to row $j$ | $R_j \to R_j + c\,R_i$ | $E_{ij}(c)$
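For concreteness, in the $3 \times 3$ case the elementary matrices corresponding to $R_1 \leftrightarrow R_2$, $R_2 \to c\,R_2$, and $R_3 \to R_3 + c\,R_1$ are
$$E_{12} = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad E_2(c) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & c & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad E_{13}(c) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ c & 0 & 1 \end{pmatrix},$$
each obtained by applying the indicated row operation to $I_3$.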

In an analogous fashion, one can also perform column operations on a matrix. As with row operations there are three types, and each type can be achieved via right multiplication by the corresponding elementary matrix. In other words, if $\varepsilon$ indicates the given column operation, then for each type one has
$$\varepsilon(A) \;=\; A\,\varepsilon(I),$$
where $\varepsilon(I)$ is the result of applying the column operation to the identity matrix (a worked example follows the table below). Denoting the $i$-th column of a matrix by $C_i$, we have:

Type | What it does | Indicated by | Elementary matrix
Type I | Switches columns $i$ and $j$ | $C_i \leftrightarrow C_j$ | $E_{ij}$
Type II | Multiplies column $i$ by $c \neq 0$ | $C_i \to c\,C_i$ | $E_i(c)$
Type III | Adds $c$ times column $i$ to column $j$ | $C_j \to C_j + c\,C_i$ | $E_{ji}(c)$

Note that for a Type III operation the column version uses the transpose of the row version: adding $c$ times column $i$ to column $j$ is achieved by right multiplication by $E_{ji}(c)$, the matrix with an entry $c$ in position $(i, j)$.
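For example, let $\varepsilon$ be the column operation that adds $c$ times column $1$ to column $2$ of a $2 \times 2$ matrix $A = \begin{pmatrix} a & b \\ d & e \end{pmatrix}$. Then $\varepsilon(I_2) = \begin{pmatrix} 1 & c \\ 0 & 1 \end{pmatrix}$, and
$$A\,\varepsilon(I_2) \;=\; \begin{pmatrix} a & b \\ d & e \end{pmatrix}\!\begin{pmatrix} 1 & c \\ 0 & 1 \end{pmatrix} \;=\; \begin{pmatrix} a & ca + b \\ d & cd + e \end{pmatrix} \;=\; \varepsilon(A),$$
confirming that the column operation is realized by right multiplication.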

Each elementary matrix is invertible, and its inverse is again an elementary matrix of the same type. The following indicates how each elementary matrix behaves under i) inversion and ii) transposition:
$$E_{ij}^{-1} = E_{ij}, \qquad E_i(c)^{-1} = E_i(c^{-1}), \qquad E_{ij}(c)^{-1} = E_{ij}(-c),$$
$$E_{ij}^{\,T} = E_{ij}, \qquad E_i(c)^{T} = E_i(c), \qquad E_{ij}(c)^{T} = E_{ji}(c).$$

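As a quick check of the Type III formulas, in the $2 \times 2$ case $E_{12}(c) = \begin{pmatrix} 1 & 0 \\ c & 1 \end{pmatrix}$ and
$$E_{12}(c)\,E_{12}(-c) \;=\; \begin{pmatrix} 1 & 0 \\ c & 1 \end{pmatrix}\!\begin{pmatrix} 1 & 0 \\ -c & 1 \end{pmatrix} \;=\; \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},$$
so adding $-c$ times row $1$ to row $2$ undoes adding $c$ times row $1$ to row $2$, as one would expect.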
Elementary matrices are useful in problems where one wants to express the inverse of a matrix explicitly as a product of elementary matrices. We have already seen that a square matrix $A$ is invertible if and only if it is row equivalent to the identity matrix. By keeping track of the row operations used and then realizing them in terms of left multiplication by elementary matrices, we can write down a product of matrices ending in $A$ that equals the identity:
$$E_k \cdots E_2\,E_1\,A = I.$$
The product $E_k \cdots E_2\,E_1$ of elementary matrices appearing to the left of $A$ must then be equal to $A^{-1}$. With such an expression for $A^{-1}$ we can also represent $A$ itself as a product of elementary matrices, namely $A = E_1^{-1} E_2^{-1} \cdots E_k^{-1}$.
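To illustrate, with a matrix chosen here purely for the example, take $A = \begin{pmatrix} 1 & 2 \\ 1 & 3 \end{pmatrix}$. The operation $R_2 \to R_2 - R_1$ followed by $R_1 \to R_1 - 2R_2$ reduces $A$ to $I_2$, so
$$E_{21}(-2)\,E_{12}(-1)\,A = I_2, \qquad \text{hence} \qquad A^{-1} = E_{21}(-2)\,E_{12}(-1) = \begin{pmatrix} 1 & -2 \\ 0 & 1 \end{pmatrix}\!\begin{pmatrix} 1 & 0 \\ -1 & 1 \end{pmatrix} = \begin{pmatrix} 3 & -2 \\ -1 & 1 \end{pmatrix}.$$
Inverting each factor then expresses $A$ itself as a product of elementary matrices:
$$A = E_{12}(1)\,E_{21}(2) = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}\!\begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix}.$$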

Exercise. Suppose $A$ is a given invertible matrix. Find a sequence of row operations that row reduces $A$ to the identity matrix. Then use this to express both $A^{-1}$ and $A$ as a product of elementary matrices.