We prove several results concerning linear independence of rows and columns of a matrix.
Recall that a matrix (or augmented matrix) is in row-echelon form if:
- All entries below each leading entry are zero.
- Each leading entry is in a column to the right of the leading entries in the rows above it.
- All rows of zeros, if there are any, are located below non-zero rows.
A matrix in row-echelon form is said to be in reduced row-echelon form if it has the following additional properties:
- Each leading entry is a 1 (called a leading 1).
- All entries above each leading 1 are zero.
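To illustrate the distinction (an example of my own, not taken from the module), the first matrix below is in row-echelon form but not reduced row-echelon form, because the entry above the second leading 1 is nonzero; the second matrix is in reduced row-echelon form:

```latex
\begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & 4 \\ 0 & 0 & 0 \end{bmatrix}
\quad\text{(row-echelon form)}
\qquad
\begin{bmatrix} 1 & 0 & -5 \\ 0 & 1 & 4 \\ 0 & 0 & 0 \end{bmatrix}
\quad\text{(reduced row-echelon form)}
```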
Every matrix can be brought to reduced row-echelon form using the Gauss-Jordan Algorithm (See Module SYS-M-0030).
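The mechanics of the Gauss-Jordan Algorithm can be sketched in code. The following is a minimal Python illustration, not the module's presentation; the function name `rref` and the numerical tolerance are my own choices:

```python
def rref(matrix):
    """Return the reduced row-echelon form of a matrix (list of rows of floats)."""
    A = [row[:] for row in matrix]  # work on a copy
    rows, cols = len(A), len(A[0])
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # find a row at or below pivot_row with a nonzero entry in this column
        pivot = next((r for r in range(pivot_row, rows) if abs(A[r][col]) > 1e-12), None)
        if pivot is None:
            continue  # no leading entry in this column
        A[pivot_row], A[pivot] = A[pivot], A[pivot_row]  # swap pivot row into place
        scale = A[pivot_row][col]
        A[pivot_row] = [x / scale for x in A[pivot_row]]  # make the leading entry a 1
        for r in range(rows):  # clear all entries above and below the leading 1
            if r != pivot_row and abs(A[r][col]) > 1e-12:
                factor = A[r][col]
                A[r] = [a - factor * b for a, b in zip(A[r], A[pivot_row])]
        pivot_row += 1
    return A

# example: an invertible 2x2 matrix reduces to the identity
result = rref([[1.0, 2.0], [3.0, 4.0]])
```

Row swaps, scaling a row, and adding a multiple of one row to another are exactly the three elementary row operations, so the result is row-equivalent to the input.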
- Every nonzero row of a matrix in reduced row-echelon form contains a leading 1, and all entries above and below each leading 1 are zero. Any linear combination of the other rows must therefore have a zero in each leading-1 column, so no nonzero row can be written as a linear combination of the other rows. Therefore, by Theorem th:redundantifflindep of VEC-M-0100, the nonzero rows of a matrix in reduced row-echelon form are linearly independent.
- See Practice Problem prob:proofofrowsofreflinind
- See Practice Problem prob:proofofpivotcolslinind.
- The columns of a matrix are linearly independent if and only if the rank of the matrix is equal to the number of columns.
- The rows of a matrix are linearly independent if and only if the rank of the matrix is equal to the number of rows.
- See Practice Problem prob:proofoflinindandrank.
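The rank criterion above can be checked numerically. A small sketch using NumPy (the matrices are my own examples, not from the module):

```python
import numpy as np

# a 3x2 matrix whose columns are linearly independent
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# a 3x2 matrix whose second column is twice the first (dependent columns)
B = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

print(np.linalg.matrix_rank(A))  # rank 2 = number of columns -> columns independent
print(np.linalg.matrix_rank(B))  # rank 1 < 2 -> columns dependent
```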
Recall that a square matrix is said to be nonsingular provided that its reduced row-echelon form is equal to the identity matrix. In MAT-M-0050 we showed that a square matrix is nonsingular if and only if it is invertible. (See Corollary cor:rrefI of MAT-M-0050)
- By Theorem th:linindandrank, a square matrix has linearly independent columns and linearly independent rows if and only if its rank is equal to the number of columns (rows). For a square matrix, the rank equals the number of rows precisely when the reduced row-echelon form is the identity matrix. Therefore, a square matrix has linearly independent columns and linearly independent rows if and only if the matrix is nonsingular.
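To tie the pieces together numerically, the following NumPy sketch (the matrix is my own example) checks that an invertible square matrix has full rank, which by the theorem above means its rows and columns are linearly independent:

```python
import numpy as np

# a nonsingular 2x2 matrix (its determinant is 2*3 - 1*5 = 1)
M = np.array([[2.0, 1.0],
              [5.0, 3.0]])

# full rank: rank equals the number of rows (and columns)
print(np.linalg.matrix_rank(M) == M.shape[0])  # True

# invertibility is equivalent to being nonsingular
Minv = np.linalg.inv(M)
print(np.allclose(M @ Minv, np.eye(2)))  # True
```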