Linear Systems

Overview of linear systems

Our journey through linear algebra begins with linear systems.

Row Reduction

We row reduce a matrix by performing row operations in order to find a simpler but equivalent system whose solution set is easily read off.
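As an illustrative sketch (plain Python with exact `Fraction` arithmetic; the function name and structure are not the text's own), row reduction to reduced row echelon form might look like:

```python
# A minimal sketch of row reduction to reduced row echelon form (RREF),
# using exact Fraction arithmetic to avoid floating-point surprises.
from fractions import Fraction

def rref(matrix):
    """Return the reduced row echelon form of a matrix (list of rows)."""
    A = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(A), len(A[0])
    pivot_row = 0
    for col in range(cols):
        # Find a row at or below pivot_row with a nonzero entry in this column.
        pr = next((r for r in range(pivot_row, rows) if A[r][col] != 0), None)
        if pr is None:
            continue
        A[pivot_row], A[pr] = A[pr], A[pivot_row]         # interchange rows
        pivot = A[pivot_row][col]
        A[pivot_row] = [x / pivot for x in A[pivot_row]]  # scale pivot to 1
        for r in range(rows):
            if r != pivot_row and A[r][col] != 0:
                factor = A[r][col]                        # eliminate entry
                A[r] = [a - factor * b for a, b in zip(A[r], A[pivot_row])]
        pivot_row += 1
        if pivot_row == rows:
            break
    return A

# The augmented matrix of  x + 2y = 5,  3x + 4y = 11
# reduces to the equivalent system  x = 1,  y = 2.
print(rref([[1, 2, 5], [3, 4, 11]]))
```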

Plan for Row Reduction

The operations used to perform row reduction are called row operations.

Notation for Row Operations

We summarize the notation to keep track of the precise row operations being used.

Algorithm for Row Reduction

We summarize the algorithm for performing row reduction.

Matrices

A matrix is a rectangular array whose entries are of the same type.

Matrix Operations and Matrix Algebra

Matrix algebra uses three different types of operations: matrix addition, scalar multiplication, and matrix multiplication.
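A small illustration (not from the text) of the three operations, written out entrywise for plain lists of lists:

```python
# The three operations of matrix algebra, defined entrywise.
def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_scale(c, A):
    return [[c * a for a in row] for row in A]

def mat_mul(A, B):
    # Entry (i, j) is the dot product of row i of A with column j of B.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(mat_add(A, B))    # [[1, 3], [4, 4]]
print(mat_scale(2, A))  # [[2, 4], [6, 8]]
print(mat_mul(A, B))    # [[2, 1], [4, 3]]
```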

Matrix Equations

Matrices and vectors can be used to rewrite systems of equations as a single equation, and there are advantages to doing this.
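As a hypothetical example, the system x + 2y = 5, 3x + 4y = 11 can be rewritten as the single matrix equation Ax = b, and a claimed solution verified with one matrix-vector product:

```python
# Verify that x = (1, 2) solves A x = b, where A x is computed
# as the vector of dot products of the rows of A with x.
def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2], [3, 4]]
b = [5, 11]
x = [1, 2]
print(mat_vec(A, x) == b)  # True: x solves A x = b
```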

The Superposition Principle

Sums of solutions to homogeneous systems are also solutions.
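A quick check of the principle (the matrix below is made up for illustration): if Au = 0 and Av = 0, then A(u + v) = 0.

```python
# Superposition: the sum of two solutions of A x = 0 is again a solution.
def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, -1, 0], [0, 1, -1]]   # each row computes a consecutive difference
u = [1, 1, 1]                   # A u = 0
v = [2, 2, 2]                   # A v = 0
w = [a + b for a, b in zip(u, v)]
print(mat_vec(A, w))  # [0, 0]
```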

Elementary Matrices

Row and column operations can be performed using matrix multiplication.
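For instance (an assumed example, not the text's), left-multiplying by the identity matrix with its first two rows interchanged performs that same row interchange on any matrix:

```python
# An elementary matrix in action: E is the 2×2 identity with rows 0 and 1
# swapped, so E A swaps the first two rows of A.
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

E = [[0, 1], [1, 0]]   # elementary matrix: interchange rows 0 and 1
A = [[1, 2], [3, 4]]
print(mat_mul(E, A))   # [[3, 4], [1, 2]]
```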

Vector Spaces and Linear Transformations

Vector Spaces

The vector space ℝⁿ

We begin our introduction to vector spaces with the concrete example of ℝⁿ.

Definition of a vector space

A vector space is a set equipped with two operations, vector addition and scalar multiplication, satisfying certain properties.

Subspaces

A subset of a vector space is a subspace if it is non-empty and, using the restriction to the subset of the sum and scalar product operations, the subset satisfies the axioms of a vector space.

Linear combinations and linear independence

A linear combination is a sum of scalar multiples of vectors.
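A concrete instance (chosen for illustration): the linear combination 2·(1, 0) + 3·(0, 1) = (2, 3).

```python
# A linear combination: the sum of scalar multiples of vectors.
def linear_combination(coeffs, vectors):
    n = len(vectors[0])
    return [sum(c * v[i] for c, v in zip(coeffs, vectors)) for i in range(n)]

print(linear_combination([2, 3], [[1, 0], [0, 1]]))  # [2, 3]
```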

Constructing and Describing Vector Spaces and Subspaces

Spanning sets, row spaces, and column spaces

A collection of vectors spans a set if every vector in the set can be expressed as a linear combination of the vectors in the collection. The sets of rows and columns of a matrix are spanning sets for its row space and column space, respectively.

Nullspaces

Nullspaces provide an important way of constructing subspaces of ℝⁿ.

Range

Another subspace associated to a matrix is its range.

Bases and dimension

A basis is a collection of vectors which consists of enough vectors to span the space, but few enough vectors that they remain linearly independent. It is the same as a minimal spanning set.

Coordinate systems

An array of numbers can be used to represent an element of a vector space.

Vector spaces over ℂ

To complete this section we extend our set of scalars from real numbers to complex numbers.

Linear Transformations

Definition

A linear transformation is a function between vector spaces preserving the structure of the vector spaces.

Matrix representations of transformations

A linear transformation can be represented in terms of multiplication by a matrix.
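As a hedged example (a standard one, not necessarily the text's): in the basis {1, x, x²} of polynomials of degree at most 2, the derivative is represented by a matrix acting on coefficient vectors (a₀, a₁, a₂).

```python
# The derivative on polynomials of degree ≤ 2, represented as a matrix
# acting on coefficient vectors in the basis {1, x, x²}.
def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

D = [[0, 1, 0],
     [0, 0, 2],
     [0, 0, 0]]

# p(x) = 3 + 5x + 4x²  has derivative  p'(x) = 5 + 8x.
print(mat_vec(D, [3, 5, 4]))  # [5, 8, 0]
```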

Change of basis

Determine how the matrix representation depends on a choice of basis.

Vector spaces of linear transformations

The collection of all linear transformations between given vector spaces itself forms a vector space.

Eigenvalues and Eigenvectors

The Determinant

The determinant summarizes how much a linear transformation, from a vector space to itself, “stretches” its input.

Cofactor expansion

One method for computing the determinant is called cofactor expansion.
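A recursive sketch of the method, expanding along the first row (exponential-time, so only suitable for small matrices):

```python
# Cofactor expansion along the first row:
# det A = Σ_j (−1)^j · A[0][j] · det(minor_{0,j}).
def det(A):
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j+1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                   # -2
print(det([[2, 0, 0], [0, 3, 0], [0, 0, 4]]))  # 24
```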

Combinatorial definition

There is also a combinatorial approach to the computation of the determinant.

Properties of the determinant

The determinant is connected to many of the key ideas in linear algebra.

Eigenvalues and Eigenvectors

Definition

A nonzero vector which is scaled by a linear transformation is an eigenvector for that transformation.
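A concrete check (the matrix is hypothetical): v = (1, 1) is an eigenvector of A with eigenvalue 3, since Av = 3v.

```python
# Eigenvector check: A v should be a scalar multiple of v.
def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[2, 1], [1, 2]]
v = [1, 1]
print(mat_vec(A, v))  # [3, 3], i.e. 3·v, so v is an eigenvector with eigenvalue 3
```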

Eigenspaces

The span of the eigenvectors associated with a fixed eigenvalue defines the eigenspace corresponding to that eigenvalue.

The characteristic polynomial

Establish algebraic criteria for determining exactly when a real number can occur as an eigenvalue of a given matrix.
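For a 2×2 matrix the criterion is concrete: the characteristic polynomial is λ² − (tr A)λ + det A, and its real roots are exactly the real eigenvalues. A sketch under that assumption, using the quadratic formula:

```python
# Real eigenvalues of a 2×2 matrix from its characteristic polynomial
# λ² − (tr A)·λ + det A.
import math

def eigenvalues_2x2(A):
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    disc = tr * tr - 4 * det
    if disc < 0:
        return []  # no real eigenvalues
    s = math.sqrt(disc)
    return sorted([(tr - s) / 2, (tr + s) / 2])

print(eigenvalues_2x2([[2, 1], [1, 2]]))  # [1.0, 3.0]
```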

Direct sum decomposition

The subspace spanned by the eigenvectors of a matrix, or a linear transformation, can be expressed as a direct sum of eigenspaces.

Properties of Eigenvalues and Eigenvectors

Similarity and diagonalization

Similarity represents an important equivalence relation on the set of square matrices of a given size.

Complex eigenvalues and eigenvectors

There are advantages to working with complex numbers.

Geometric versus algebraic multiplicity

The geometric multiplicity of an eigenvalue is at most its algebraic multiplicity.

Schur’s Theorem

Every square complex matrix is unitarily similar to an upper triangular matrix.

Normal matrices

A normal matrix commutes with its conjugate transpose; these are exactly the matrices that can be unitarily diagonalized.

Generalized eigenvectors

For an n×n complex matrix A, ℂⁿ does not necessarily have a basis consisting of eigenvectors of A. But it will always have a basis consisting of generalized eigenvectors of A.
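The standard illustration (not taken from the text) is a Jordan block: A = [[2, 1], [0, 2]] has only one independent eigenvector for λ = 2, but v = (0, 1) is a generalized eigenvector, since (A − 2I)v ≠ 0 while (A − 2I)²v = 0.

```python
# Generalized eigenvector check for the Jordan block A = [[2, 1], [0, 2]]:
# with N = A − 2I, we have N v ≠ 0 but N² v = 0.
def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

N = [[0, 1], [0, 0]]   # N = A − 2I
v = [0, 1]
print(mat_vec(N, v))              # [1, 0]  (nonzero: v is not an eigenvector)
print(mat_vec(N, mat_vec(N, v)))  # [0, 0]  (v is a generalized eigenvector)
```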

Inner Product Spaces

Inner Products on ℝⁿ

The dot product in ℝⁿ

Symmetric bilinear pairings on ℝⁿ

Orthogonal vectors and subspaces in ℝⁿ

Orthonormal vectors and orthogonal matrices

Projections and Least-squares Approximations

Projection onto 1-dimensional subspaces

Gram-Schmidt orthogonalization
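A minimal sketch of the process (orthogonalization only, without the final normalization): each vector has its projections onto the previously produced vectors subtracted off.

```python
# Gram–Schmidt orthogonalization: subtract from each vector its projections
# onto the vectors already in the orthogonal basis.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            c = dot(w, u) / dot(u, u)    # projection coefficient onto u
            w = [wi - c * ui for wi, ui in zip(w, u)]
        basis.append(w)
    return basis

u1, u2 = gram_schmidt([[1, 1, 0], [1, 0, 1]])
print(dot(u1, u2))  # 0.0: the resulting vectors are orthogonal
```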

Least-squares approximations

Least-squares solutions and the Fundamental Subspaces theorem

Applications of least-squares solutions

Projection onto a subspace

Polynomial data fitting
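A hedged sketch of the simplest case, fitting a line y = a + bx by least squares through the normal equations AᵀAx = Aᵀb (the data points are made up for illustration):

```python
# Least-squares line fit via the normal equations, in exact arithmetic.
from fractions import Fraction

points = [(0, 1), (1, 2), (2, 4)]            # (x, y) data, chosen arbitrarily
A = [[1, Fraction(x)] for x, _ in points]    # design matrix for y = a + b·x
b = [Fraction(y) for _, y in points]

# Form the 2×2 normal equations AᵀA x = Aᵀ b.
AtA = [[sum(row[i] * row[j] for row in A) for j in range(2)] for i in range(2)]
Atb = [sum(row[i] * yk for row, yk in zip(A, b)) for i in range(2)]

# Solve the 2×2 system by Cramer's rule.
d = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
a_hat = (Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / d
b_hat = (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / d
print(a_hat, b_hat)  # best-fit line y = 5/6 + (3/2)·x
```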

Complex inner product spaces

The complex scalar product in ℂⁿ

Conjugate-symmetric sesquilinear pairings on ℂⁿ, and their representation

Unitary matrices

Singular Values

Singular value decomposition

The singular value decomposition is a generalization of the Schur decomposition for normal matrices.
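One way to see the singular values concretely (a sketch, with a matrix chosen for illustration): they are the square roots of the eigenvalues of AᵀA, computed here for a 2×2 example via the characteristic polynomial of AᵀA.

```python
# Singular values of a 2×2 matrix as square roots of the eigenvalues of AᵀA.
import math

A = [[3, 0], [4, 5]]
AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
tr = AtA[0][0] + AtA[1][1]
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
s = math.sqrt(tr * tr - 4 * det)
eigs = [(tr + s) / 2, (tr - s) / 2]       # eigenvalues of AᵀA (nonnegative)
sigmas = [math.sqrt(e) for e in eigs]     # singular values, largest first
print(sigmas)
```

As a sanity check, the product of the singular values equals |det A| (here 15).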
