Linear Systems

1 Overview on linear systems

Our journey through linear algebra begins with linear systems.

2 Row Reduction

We row reduce a matrix by performing row operations in order to obtain a simpler but equivalent system whose solution set can be read off easily.
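
As a small illustration (SymPy is our choice of tool here, not something the text itself assumes), row reduction of an augmented matrix can be carried out as follows:

    # A minimal sketch, assuming SymPy is available.
    from sympy import Matrix

    # Augmented matrix of the system  x + 2y = 5,  3x + 4y = 6.
    M = Matrix([[1, 2, 5],
                [3, 4, 6]])

    # rref() returns the reduced row echelon form and the pivot columns.
    R, pivots = M.rref()
    print(R)  # Matrix([[1, 0, -4], [0, 1, 9/2]]), so x = -4 and y = 9/2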

2.1 Plan for Row Reduction

The operations used to perform row reduction are called row operations.

2.2 Notation for Row Operations

We summarize the notation to keep track of the precise row operations being used.

2.3 Algorithm for Row Reduction

We summarize the algorithm for performing row reduction.

3 Matrices

A matrix is a rectangular array whose entries are of the same type.

3.1 Matrix Operations and Matrix Algebra

Matrix algebra uses three different types of operations: matrix addition, scalar multiplication, and matrix multiplication.

3.2 Matrix Equations

Matrices and vectors can be used to rewrite systems of equations as a single equation, and there are advantages to doing this.
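
To illustrate, the system x + 2y = 5 and 3x + 4y = 6 becomes the single matrix equation
\[
\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
=
\begin{pmatrix} 5 \\ 6 \end{pmatrix}.
\]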

3.3 The Superposition Principle

Sums of solutions to homogeneous systems are also solutions.
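
Indeed, if Ax₁ = 0 and Ax₂ = 0, then A(c₁x₁ + c₂x₂) = c₁Ax₁ + c₂Ax₂ = 0 for any scalars c₁ and c₂.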

3.4 Elementary Matrices

Row and column operations can be performed using matrix multiplication.
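
For example, left multiplication by the elementary matrix E below performs the row operation R₂ → R₂ − 3R₁:
\[
E = \begin{pmatrix} 1 & 0 \\ -3 & 1 \end{pmatrix},
\qquad
E \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}
= \begin{pmatrix} 1 & 2 \\ 0 & -2 \end{pmatrix}.
\]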

Vector Spaces and Linear Transformations

4 Vector Spaces

4.1 The vector space ℝⁿ

We begin our introduction to vector spaces with the concrete example of ℝⁿ.

4.2 Definition of a vector space

A vector space is a set equipped with two operations, vector addition and scalar multiplication, satisfying certain properties.

4.3 Subspaces

A subset of a vector space is a subspace if it is non-empty and itself satisfies the vector space axioms under the restrictions of the addition and scalar multiplication operations.

4.4 Linear combinations and linear independence

A linear combination is a sum of scalar multiples of vectors.

5 Constructing and Describing Vector Spaces and Subspaces

5.1 Spanning sets, row spaces, and column spaces

A collection of vectors spans a set if every vector in the set can be expressed as a linear combination of vectors in the collection. The rows and columns of a matrix are spanning sets for its row space and column space, respectively.

5.2 Nullspaces

Nullspaces provide an important way of constructing subspaces of ℝⁿ.

5.3 Range

Another subspace associated to a matrix is its range.

5.4 Bases and dimension

A basis is a collection of vectors which consists of enough vectors to span the space, but few enough vectors that they remain linearly independent. It is the same as a minimal spanning set.

5.5 Coordinate systems

An array of numbers can be used to represent an element of a vector space.

5.6 Vector spaces over ℂ

To complete this section we extend our set of scalars from real numbers to complex numbers.

6 Linear Transformations

6.1 Definition

A linear transformation is a function between vector spaces preserving the structure of the vector spaces.

6.2 Matrix representations of transformations

A linear transformation can be represented in terms of multiplication by a matrix.
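
For instance, the linear transformation T(x, y) = (x + y, x) of ℝ² is represented, with respect to the standard basis, by
\[
\begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
= \begin{pmatrix} x + y \\ x \end{pmatrix}.
\]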

6.3 Change of basis

We determine how the matrix representation of a linear transformation depends on the choice of basis.

6.4 Vector spaces of linear transformations

The collection of all linear transformations between given vector spaces itself forms a vector space.

Eigenvalues and Eigenvectors

7 The Determinant

The determinant summarizes how much a linear transformation, from a vector space to itself, “stretches” its input.

7.1 Cofactor expansion

One method for computing the determinant is called cofactor expansion.
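
Expanding along the first row of an n × n matrix A, for instance, gives
\[
\det A = \sum_{j=1}^{n} (-1)^{1+j} a_{1j} \det A_{1j},
\]
where A_{1j} denotes the (n − 1) × (n − 1) matrix obtained by deleting row 1 and column j of A.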

7.2 Combinatorial definition

There is also a combinatorial approach to the computation of the determinant.

7.3 Properties of the determinant

The determinant is connected to many of the key ideas in linear algebra.

8 Eigenvalues and Eigenvectors

8.1 Definition

A nonzero vector which is scaled by a linear transformation is an eigenvector for that transformation.
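
To illustrate, (1, 0) is an eigenvector with eigenvalue 2 of the matrix below, since
\[
\begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}
\begin{pmatrix} 1 \\ 0 \end{pmatrix}
= 2 \begin{pmatrix} 1 \\ 0 \end{pmatrix}.
\]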

8.2 Eigenspaces

The span of the eigenvectors associated with a fixed eigenvalue defines the eigenspace corresponding to that eigenvalue.

8.3 The characteristic polynomial

We establish algebraic criteria for determining exactly when a real number can occur as an eigenvalue of a given matrix.
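
Concretely, a scalar λ is an eigenvalue of a square matrix A exactly when det(A − λI) = 0, and the left-hand side, viewed as a polynomial in λ, is the characteristic polynomial of A.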

8.4 Direct sum decomposition

The subspace spanned by the eigenvectors of a matrix, or a linear transformation, can be expressed as a direct sum of eigenspaces.

9 Properties of Eigenvalues and Eigenvectors

9.1 Similarity and diagonalization

Similarity represents an important equivalence relation on the vector space of square matrices of a given dimension.

9.2 Complex eigenvalues and eigenvectors

There are advantages to working with complex numbers.

9.3 Geometric versus algebraic multiplicity

We compare the geometric multiplicity of an eigenvalue, the dimension of its eigenspace, with its algebraic multiplicity as a root of the characteristic polynomial.

9.4 Schur's Theorem

Schur's Theorem states that every square complex matrix is unitarily similar to an upper triangular matrix.

9.5 Normal matrices

A matrix is normal when it commutes with its conjugate transpose; the normal matrices are exactly those that are unitarily diagonalizable.

9.6 Generalized eigenvectors

For an n × n complex matrix A, ℂⁿ does not necessarily have a basis consisting of eigenvectors of A. But it will always have a basis consisting of generalized eigenvectors of A.

Inner Product Spaces

10 Inner Products on ℝⁿ

10.1 The dot product in ℝⁿ

10.2 Symmetric bilinear pairings on ℝⁿ

10.3 Orthogonal vectors and subspaces in ℝⁿ

10.4 Orthonormal vectors and orthogonal matrices

11 Projections and Least-squares Approximations

11.1 Projection onto 1-dimensional subspaces

11.2 Gram-Schmidt orthogonalization
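
As a sketch only (NumPy is our choice of tool here, not something the text prescribes), classical Gram-Schmidt orthonormalizes the columns of a matrix:

    # A minimal sketch of classical Gram-Schmidt, assuming NumPy.
    import numpy as np

    def gram_schmidt(V):
        """Orthonormalize the columns of V, assumed linearly independent."""
        Q = np.zeros_like(V, dtype=float)
        for j in range(V.shape[1]):
            v = V[:, j].astype(float)
            # Subtract the projections onto the earlier orthonormal vectors.
            for i in range(j):
                v -= (Q[:, i] @ V[:, j]) * Q[:, i]
            Q[:, j] = v / np.linalg.norm(v)
        return Q

    Q = gram_schmidt(np.array([[1., 1.], [0., 1.]]))
    print(Q.T @ Q)  # approximately the 2 x 2 identity matrix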

11.3 Least-squares approximations
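
As an illustration (NumPy assumed, with made-up data), a least-squares solution of Ax = b minimizes the length of Ax − b:

    # A minimal sketch, assuming NumPy; the data points are hypothetical.
    import numpy as np

    # Fit a line c0 + c1*t to the points (1, 1), (2, 2), (3, 2).
    A = np.array([[1., 1.],
                  [1., 2.],
                  [1., 3.]])
    b = np.array([1., 2., 2.])

    # The least-squares solution also satisfies the normal equations
    # A^T A x = A^T b.
    x, residual, rank, sv = np.linalg.lstsq(A, b, rcond=None)
    print(x)  # [2/3, 1/2]: best-fit intercept and slope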

11.4 Least-squares solutions and the Fundamental Subspaces theorem

11.5 Applications of least-squares solutions

11.6 Projection onto a subspace

11.7 Polynomial data fitting

12 Complex inner product spaces

12.1 The complex scalar product in ℂⁿ

12.2 Conjugate-symmetric sesquilinear pairings on ℂⁿ, and their representation

12.3 Unitary matrices

Singular Values

12.4 Singular value decomposition

The singular value decomposition is a generalization of the unitary diagonalization of normal matrices given by Schur's Theorem.
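
As a quick illustration (NumPy assumed), every real matrix factors as A = UΣVᵀ with U and V orthogonal and Σ diagonal with non-negative entries:

    # A minimal sketch, assuming NumPy.
    import numpy as np

    A = np.array([[3., 0.],
                  [4., 5.]])
    U, s, Vt = np.linalg.svd(A)  # s holds the singular values
    print(np.allclose(A, U @ np.diag(s) @ Vt))  # True: A = U Sigma V^T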
