We interpret linear systems as matrix equations and as equations involving linear combinations of vectors. We define singular and nonsingular matrices.

MAT-0030: Linear Systems as Matrix and Linear Combination Equations

A Linear System as a Matrix Equation

Consider a linear system. Let’s construct the coefficient matrix $A$ and multiply it by the vector of unknowns $\vec{x}$ on the right. Observe that each component of the product vector $A\vec{x}$ is the left-hand side of one of the equations in the system. Let $\vec{b}$ be the column vector whose entries are the constants on the right-hand sides of the equations. Then $A\vec{x}=\vec{b}$ is a matrix equation that corresponds to our system of equations.
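For a concrete illustration (the coefficients below are our own choice, used only for demonstration), the system
\[
\begin{array}{rcrcrcl}
x_1 &+& 2x_2 &-& x_3 &=& 4\\
2x_1 &+& 3x_2 &+& x_3 &=& 5
\end{array}
\]
has coefficient matrix, vector of unknowns, and constant vector
\[
A=\begin{bmatrix} 1 & 2 & -1\\ 2 & 3 & 1\end{bmatrix},\qquad
\vec{x}=\begin{bmatrix} x_1\\ x_2\\ x_3\end{bmatrix},\qquad
\vec{b}=\begin{bmatrix} 4\\ 5\end{bmatrix},
\]
and computing the product gives $A\vec{x}=\begin{bmatrix} x_1+2x_2-x_3\\ 2x_1+3x_2+x_3\end{bmatrix}$, so the system is equivalent to the single matrix equation $A\vec{x}=\vec{b}$.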

In general, a system of linear equations
\[
\begin{array}{ccccccccc}
a_{11}x_1 &+& a_{12}x_2 &+& \dots &+& a_{1n}x_n &=& b_1\\
a_{21}x_1 &+& a_{22}x_2 &+& \dots &+& a_{2n}x_n &=& b_2\\
 & & & & \vdots & & & & \\
a_{m1}x_1 &+& a_{m2}x_2 &+& \dots &+& a_{mn}x_n &=& b_m
\end{array}
\]
can be written as a matrix equation as follows:
\[
\begin{bmatrix}
a_{11} & a_{12} & \dots & a_{1n}\\
a_{21} & a_{22} & \dots & a_{2n}\\
\vdots & \vdots & & \vdots\\
a_{m1} & a_{m2} & \dots & a_{mn}
\end{bmatrix}
\begin{bmatrix} x_1\\ x_2\\ \vdots\\ x_n\end{bmatrix}
=
\begin{bmatrix} b_1\\ b_2\\ \vdots\\ b_m\end{bmatrix}.
\]

Solving this matrix equation $A\vec{x}=\vec{b}$ (or showing that a solution does not exist) amounts to finding the reduced row-echelon form of the augmented matrix
\[
\left[\begin{array}{cccc|c}
a_{11} & a_{12} & \dots & a_{1n} & b_1\\
a_{21} & a_{22} & \dots & a_{2n} & b_2\\
\vdots & \vdots & & \vdots & \vdots\\
a_{m1} & a_{m2} & \dots & a_{mn} & b_m
\end{array}\right].
\]
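Continuing the illustrative system above (again with coefficients of our own choosing), the augmented matrix and its reduced row-echelon form are
\[
\left[\begin{array}{ccc|c} 1 & 2 & -1 & 4\\ 2 & 3 & 1 & 5\end{array}\right]
\quad\longrightarrow\quad
\left[\begin{array}{ccc|c} 1 & 0 & 5 & -2\\ 0 & 1 & -3 & 3\end{array}\right],
\]
obtained by the row operations $R_2\gets R_2-2R_1$, $R_2\gets -R_2$, and $R_1\gets R_1-2R_2$.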

The solution given in (eq:generalvsparticular) is an example of a general solution because it accounts for all of the solutions to the system. Letting the free variables (the parameters of the general solution) take on specific values produces particular solutions; each choice of values for the parameters corresponds to exactly one particular solution.
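To make the distinction concrete, we return to our illustrative system (the numbers are again our own, chosen only for demonstration). Its reduced row-echelon form above corresponds to the equations $x_1+5x_3=-2$ and $x_2-3x_3=3$, with $x_3$ free. Writing $x_3=t$, the general solution is
\[
x_1=-2-5t,\qquad x_2=3+3t,\qquad x_3=t,\qquad t\in\mathbb{R},
\]
and setting $t=0$ gives the particular solution $(x_1,x_2,x_3)=(-2,3,0)$, while $t=1$ gives the particular solution $(-7,6,1)$.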

Singular and Nonsingular Matrices

Our examples so far involved non-square matrices. Square matrices, however, play a very important role in linear algebra. This section will focus on square matrices.

Observe that the left-hand side of the reduced augmented matrix in Example ex:nonsingularintro is the identity matrix $I$. This means that $\text{rref}(A)=I$.

The elementary row operations that carried $A$ to $I$ were not dependent on the vector $\vec{b}$. In fact, the same row reduction process can be applied to the matrix equation $A\vec{x}=\vec{b}$ for any vector $\vec{b}$ to obtain a unique solution. Given a matrix $A$ such that $\text{rref}(A)=I$, the system $A\vec{x}=\vec{b}$ will never be inconsistent because we will never have a row like this: $\left[\begin{array}{cccc|c} 0 & 0 & \dots & 0 & 1\end{array}\right]$. Neither will we have infinitely many solutions because there will never be free variables. Matrices such as $A$ deserve special attention.
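As a small illustration of this independence from $\vec{b}$ (the matrix below is our own example, not one from an earlier section), take $A=\begin{bmatrix} 1 & 2\\ 3 & 5\end{bmatrix}$ and an arbitrary vector $\vec{b}=\begin{bmatrix} b_1\\ b_2\end{bmatrix}$. The row operations $R_2\gets R_2-3R_1$, $R_2\gets -R_2$, and $R_1\gets R_1-2R_2$ carry $A$ to $I$ regardless of $b_1$ and $b_2$:
\[
\left[\begin{array}{cc|c} 1 & 2 & b_1\\ 3 & 5 & b_2\end{array}\right]
\quad\longrightarrow\quad
\left[\begin{array}{cc|c} 1 & 0 & -5b_1+2b_2\\ 0 & 1 & 3b_1-b_2\end{array}\right],
\]
so $A\vec{x}=\vec{b}$ has the unique solution $x_1=-5b_1+2b_2$, $x_2=3b_1-b_2$ for every choice of $\vec{b}$.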

We say that a square matrix $A$ is nonsingular provided that $\text{rref}(A)=I$; otherwise we say that $A$ is singular. Nonsingular matrices have many useful properties.

Theorem th:nonsingularequivalency1 states that for an $n\times n$ matrix $A$, the following three statements are equivalent:

(item:asingular) $\text{rref}(A)=I_n$ (that is, $A$ is nonsingular);
(item:uniquesolution) the equation $A\vec{x}=\vec{b}$ has a unique solution for every vector $\vec{b}$ in $\mathbb{R}^n$;
(item:onlytrivialsolution) the equation $A\vec{x}=\vec{0}$ has only the trivial solution $\vec{x}=\vec{0}$.

We will prove equivalence of the three statements by showing that item:asingular $\Rightarrow$ item:uniquesolution $\Rightarrow$ item:onlytrivialsolution $\Rightarrow$ item:asingular.

Proof of item:asingular $\Rightarrow$ item:uniquesolution
Suppose $\text{rref}(A)=I_n$. Given any vector $\vec{b}$ in $\mathbb{R}^n$, the augmented matrix $\left[\begin{array}{c|c} A & \vec{b}\end{array}\right]$ can be carried to its reduced row-echelon form $\left[\begin{array}{c|c} I_n & \vec{c}\end{array}\right]$ for some vector $\vec{c}$ in $\mathbb{R}^n$. Uniqueness of the reduced row-echelon form guarantees that $\vec{x}=\vec{c}$ is the unique solution of $A\vec{x}=\vec{b}$.

Proof of item:uniquesolution $\Rightarrow$ item:onlytrivialsolution
Suppose $A\vec{x}=\vec{b}$ has a unique solution for all vectors $\vec{b}$ in $\mathbb{R}^n$. Then, in particular, $A\vec{x}=\vec{0}$ has a unique solution. But $\vec{x}=\vec{0}$ is always a solution to $A\vec{x}=\vec{0}$. Therefore $\vec{x}=\vec{0}$ is the only solution.

Proof of item:onlytrivialsolution $\Rightarrow$ item:asingular
Suppose $A\vec{x}=\vec{0}$ has only the trivial solution. This means that $\vec{x}=\vec{0}$ is the only solution of $A\vec{x}=\vec{0}$, so the system has no free variables and every column of $\text{rref}(A)$ must contain a leading $1$. Since $A$ is square, it follows that the augmented matrix $\left[\begin{array}{c|c} A & \vec{0}\end{array}\right]$ can be reduced to $\left[\begin{array}{c|c} I_n & \vec{0}\end{array}\right]$. The same row operations will carry $A$ to $I_n$, so $\text{rref}(A)=I_n$.

Not all square matrices are nonsingular. A square matrix whose reduced row-echelon form contains a row of zeros, for example, is singular. By Theorem th:nonsingularequivalency1, a matrix equation involving a singular matrix cannot have a unique solution. The following example illustrates the two scenarios that arise when solving equations that involve singular matrices.
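For instance (the matrix and right-hand sides below are chosen by us purely for illustration), the matrix $A=\begin{bmatrix} 1 & 2\\ 2 & 4\end{bmatrix}$ is singular because $\text{rref}(A)=\begin{bmatrix} 1 & 2\\ 0 & 0\end{bmatrix}\neq I$. Row reducing two different augmented matrices shows the two possible outcomes:
\[
\left[\begin{array}{cc|c} 1 & 2 & 3\\ 2 & 4 & 6\end{array}\right]
\longrightarrow
\left[\begin{array}{cc|c} 1 & 2 & 3\\ 0 & 0 & 0\end{array}\right]
\qquad\text{and}\qquad
\left[\begin{array}{cc|c} 1 & 2 & 3\\ 2 & 4 & 5\end{array}\right]
\longrightarrow
\left[\begin{array}{cc|c} 1 & 2 & 0\\ 0 & 0 & 1\end{array}\right].
\]
The first system has infinitely many solutions ($x_1=3-2x_2$ with $x_2$ free), while the second is inconsistent because of the row $\left[\begin{array}{cc|c} 0 & 0 & 1\end{array}\right]$.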

A Linear System as a Linear Combination Equation

Recall that the product of a matrix and a vector can be interpreted as a linear combination of the columns of the matrix: if $A$ has columns $\vec{a}_1,\vec{a}_2,\dots,\vec{a}_n$, then
\[
A\vec{x}=x_1\vec{a}_1+x_2\vec{a}_2+\dots+x_n\vec{a}_n.
\]
Consequently, the matrix equation $A\vec{x}=\vec{b}$ can be rewritten as the linear combination equation $x_1\vec{a}_1+x_2\vec{a}_2+\dots+x_n\vec{a}_n=\vec{b}$, and asking whether the system has a solution is the same as asking whether $\vec{b}$ is a linear combination of the columns of $A$, that is, whether $\vec{b}$ is in the span of the columns of $A$.
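For a concrete instance (with entries chosen here only as an illustration),
\[
\begin{bmatrix} 1 & 2\\ 3 & 4\\ 5 & 6\end{bmatrix}
\begin{bmatrix} x_1\\ x_2\end{bmatrix}
=
\begin{bmatrix} x_1+2x_2\\ 3x_1+4x_2\\ 5x_1+6x_2\end{bmatrix}
=
x_1\begin{bmatrix} 1\\ 3\\ 5\end{bmatrix}+x_2\begin{bmatrix} 2\\ 4\\ 6\end{bmatrix},
\]
so the matrix equation $\begin{bmatrix} 1 & 2\\ 3 & 4\\ 5 & 6\end{bmatrix}\vec{x}=\vec{b}$ says exactly that $\vec{b}$ is the linear combination of the columns $\begin{bmatrix} 1\\ 3\\ 5\end{bmatrix}$ and $\begin{bmatrix} 2\\ 4\\ 6\end{bmatrix}$ with weights $x_1$ and $x_2$.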

Practice Problems

Given a system of linear equations, write (a) the corresponding matrix equation, and (b) the corresponding linear combination equation. DO NOT SOLVE.

Use an augmented matrix and elementary row operations to find coefficients that make the given linear combination equation true, or demonstrate that such coefficients do not exist.
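As a reminder of the setup (with vectors invented here solely to illustrate the method, not taken from the exercise itself): to decide whether there are coefficients $c_1$ and $c_2$ with $c_1\begin{bmatrix} 1\\ 2\end{bmatrix}+c_2\begin{bmatrix} 3\\ 4\end{bmatrix}=\begin{bmatrix} 5\\ 6\end{bmatrix}$, one row reduces the augmented matrix $\left[\begin{array}{cc|c} 1 & 3 & 5\\ 2 & 4 & 6\end{array}\right]$; such coefficients exist exactly when this system is consistent.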

In each problem below, determine whether the given vector is in the span of the given set of vectors. In each case, select one of the following: the vector is in the span of the vectors, or the vector is NOT in the span of the vectors.