Instructor 
Graham Denham 
Office hours 
Monday, Friday 10:30–11:20am, or by appointment

Class times 
MWF 9:30–10:30am

Class location 
MC 108

Textbook 
Linear Algebra, by Friedberg, Insel, and Spence, 5th edition, available at the bookstore

Prerequisites 
Mathematics 2120A/B, or permission of the Department. 
Midterm exam date

26 February, 9–10:20am, Talbot College 341

Final exam 
April 23rd, 7–9:30pm.

Evaluation 
20% final exam; 50% midterm; 30% assignments (revised from 40% final exam, 35% midterm, 25% assignments)

Synopsis
A week-by-week record of what's going on will appear here. Note that this
is a course where class attendance and participation are generally expected,
but if you miss a day, you can get some idea of what took place here.
I will sometimes include an exercise that's supposed to help you think about
the day's class.
 January 6: introduction, outline of course
objectives. Important advice: attend class!
 January 8: review 1: linear transformations and
change of basis. Assignment 1 due January 20th.
 January 10: review 2: vector spaces: dual spaces and
direct sums. Exercise: show that the evaluation map
\(\Psi\colon V\to (V^*)^*\) is injective.
 January 13: quotient spaces. Eigenvectors and eigenvalues of a
linear transformation. Exercise: if \(W\) is a subspace of \(V\),
show that the map \(V\to V/W\) that sends \(v\) to \(v+W\) is a
linear transformation. What's its kernel?
 January 15: Diagonalization I: a review of what to do with a
square matrix. We defined algebraic and geometric multiplicity of an
eigenvalue, and proved that the first is greater than or equal to the
second.
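If you like to experiment, here is a minimal sketch of that inequality, assuming sympy is available; the matrix is a made-up example, not one from class:

```python
from sympy import Matrix

# A made-up matrix: the eigenvalue 2 has algebraic multiplicity 2,
# but its eigenspace is only one-dimensional.
A = Matrix([[2, 1],
            [0, 2]])

# eigenvects() returns triples (eigenvalue, algebraic multiplicity, eigenspace basis).
for val, alg_mult, basis in A.eigenvects():
    geo_mult = len(basis)            # geometric multiplicity = dim of eigenspace
    print(val, alg_mult, geo_mult)   # prints: 2 2 1
    assert geo_mult <= alg_mult      # the inequality proved in class
```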
 January 17: Diagonalization II: we characterized diagonalizability
by showing that, if the characteristic polynomial splits and the
multiplicities above are equal for each eigenvalue, then there exists
a basis for the whole space that restricts to a basis for each eigenspace.
 January 20: *no class*
 January 22, 24: The eigenvectors of complex multiplication. Invariant
subspaces and cyclic subspaces.
 January 27: We saw that if \(k\) is the least positive integer for which
\(\{v,Tv,T^2v,\ldots,T^kv\}\) is linearly dependent, then the dependency
gives the coefficients of the characteristic polynomial of \(T\), restricted
to the cyclic subspace generated by \(v\). Exercise: let \(R\) be a
reflection in \({\mathbb R}^3\). Find all the \(R\)-invariant subspaces,
and all the \(R\)-cyclic subspaces. What's the characteristic polynomial
of \(R\)?
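The dependency-to-coefficients recipe can be sketched numerically, assuming numpy is available; the matrix and starting vector below are my own made-up example, not one from class:

```python
import numpy as np

# Hypothetical example: T is the companion matrix of t^3 - t - 2,
# and v = e_1 generates the whole space as a cyclic subspace.
T = np.array([[0., 0., 2.],
              [1., 0., 1.],
              [0., 1., 0.]])
v = np.array([1., 0., 0.])

# Build the Krylov vectors v, Tv, T^2 v, ... until they become dependent.
cols = [v]
while np.linalg.matrix_rank(np.column_stack(cols + [T @ cols[-1]])) == len(cols) + 1:
    cols.append(T @ cols[-1])
k = len(cols)   # first dependence occurs at T^k v

# Solve T^k v = c_0 v + c_1 Tv + ... + c_{k-1} T^{k-1} v for the c_i.
c, *_ = np.linalg.lstsq(np.column_stack(cols),
                        np.linalg.matrix_power(T, k) @ v, rcond=None)
# The (monic) characteristic polynomial of T restricted to the cyclic
# subspace is then t^k - c_{k-1} t^{k-1} - ... - c_1 t - c_0.
print(k, np.round(c, 6))   # k = 3 and c = (2, 1, 0):  t^3 - t - 2
```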
 January 29: The Cayley-Hamilton Theorem. A typo from class:
differentiation \(D\colon V_d\to V_d\) on polynomials of degree at most
\(d\) satisfies the relation \[D^{d+1}t^d=0,\] (not whatever I said.)
So \(x^{d+1}\) divides the characteristic polynomial \(f_D(x)\).
Since \(V_d\) has dimension \(d+1\), the polynomial has degree \(d+1\),
and we can conclude that \(f_D(x)=(-x)^{d+1}.\) Exercise: if a linear
operator satisfies \(T^2=I\) and it isn't the identity, what are its
possible characteristic and minimal polynomials? Another exercise:
define \(T\colon {\mathbb R}[t]\to {\mathbb R}[t]\) by letting
\[ T(f(t)) = \int_0^t f(x)\,dx.\] What can you say about the \(T\)-cyclic
subspace generated by a nonzero constant function?
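To see the Cayley-Hamilton Theorem in action on a small made-up matrix (assuming sympy is installed):

```python
from sympy import Matrix, symbols

t = symbols('t')
A = Matrix([[1, 2],
            [3, 4]])

# sympy's charpoly computes det(t*I - A), which is (-1)^n det(A - t*I).
p = A.charpoly(t).as_expr()
print(p)                             # t**2 - 5*t - 2

# Cayley-Hamilton: substituting A for t gives the zero transformation.
zero = A**2 - 5*A - 2*Matrix.eye(2)
assert zero == Matrix.zeros(2, 2)
```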
 January 31: The minimal polynomial. We showed that if \(T\colon V\to
V\) is a linear operator and \(V\) is finite-dimensional, then its
minimal polynomial \(p(t)\) divides any other polynomial \(q(t)\) for which
\(q(T)=T_0\). We also showed that each eigenvalue of \(T\) is a root
of \(p(t)\). So if the characteristic polynomial \(f_T(t)\)
splits into square-free factors, it must be the case that \(f_T(t)=(-1)^n
p(t)\), where \(n=\dim V\).
 February 3: if the characteristic polynomial splits, then the minimal
polynomial \(p(t)\) is square-free if and only if the operator is diagonalizable.
Lots of examples. Review of partitions. Nilpotent operators.
Exercise: find the minimal and
characteristic polynomials of \(\partial/\partial x\) and of
\(\partial/\partial y\), regarded as linear operators on a subspace of
real polynomials in the variables \(x,y\). Some subspaces that make
this question
interesting are given by taking polynomials with bounds on any of the
\(x\)-degree, the \(y\)-degree, or the total degree. Try some small
examples: can you see some patterns? Study
assignment: please review direct sums.
 5 February: let's classify nilpotent operators. Main idea: if
\(T\colon V \to V\) is nilpotent, we can find a basis for \(V\)
consisting of a bunch of \(T\)-cyclic sets.
 7 February: we find that a nilpotent operator on an \(n\)-dimensional
space has a Jordan type, which is a partition of \(n\). The dimension
of the null space is the number of parts, and the minimal polynomial
is \(t^k\), where \(k\) is the size of the largest part.
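The rank bookkeeping behind the Jordan type can be sketched as follows, assuming numpy is available; the nilpotent matrix is a hypothetical example:

```python
import numpy as np

# Hypothetical nilpotent matrix: one Jordan block of size 2 and one of
# size 1, so the Jordan type should be the partition (2, 1) of n = 3.
N = np.array([[0., 1., 0.],
              [0., 0., 0.],
              [0., 0., 0.]])
n = N.shape[0]

# The number of parts of size >= j equals rank(N^{j-1}) - rank(N^j).
ranks = [np.linalg.matrix_rank(np.linalg.matrix_power(N, j)) for j in range(n + 1)]
parts_geq = [ranks[j - 1] - ranks[j] for j in range(1, n + 1)]

# Recover the partition: a part of exact size j appears
# parts_geq[j-1] - parts_geq[j] times (reading parts_geq[n] as 0).
partition = []
for j in range(n, 0, -1):
    count = parts_geq[j - 1] - (parts_geq[j] if j < n else 0)
    partition += [j] * count
print(partition)   # [2, 1]
```

The number of parts (here 2) matches the nullity of `N`, and the largest part (here 2) gives the minimal polynomial \(t^2\), as in the statement above.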
 10 February: the Jordan canonical form I: generalized eigenspaces.
Operators with characteristic polynomial \(f_T(t)=(\lambda-t)^n\) have
a Jordan basis. Examples suggest what happens in the general case.
For fun(?): let \[T(f(x,y)) = f(x,y)+\partial_x f(x,y)+\partial_y f(x,y)\]
be a linear operator on polynomials in two variables of degree at most 2.
What are the generalized eigenspaces?
 12 February: more than one eigenvalue. We showed that, if \(T\) has
eigenvalues \(\lambda_1,\ldots,\lambda_m\), then \[V=K_{\lambda_1}\oplus
K_{\lambda_2}\oplus\cdots\oplus K_{\lambda_m}.\] In particular, this
means that the dimension of the generalized eigenspace for an eigenvalue
\(\lambda\) is equal to \(\lambda\)'s algebraic multiplicity.

 14 February: computing the Jordan Canonical Form in general. Examples.
Two matrices (with split characteristic polynomials) are similar if and
only if they have the same Jordan Canonical Form.
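If you want to check a Jordan Canonical Form computation by machine, sympy can do it symbolically (a made-up example, assuming sympy is installed):

```python
from sympy import Matrix

# Hypothetical example: the characteristic polynomial (t-2)^2 splits,
# but the 2-eigenspace is one-dimensional, so A is not diagonalizable.
A = Matrix([[3, 1],
            [-1, 1]])

# jordan_form() returns P and J with A = P J P^{-1}.
P, J = A.jordan_form()
print(J)                        # a single Jordan block for eigenvalue 2
assert J == Matrix([[2, 1],
                    [0, 2]])
assert A == P * J * P.inv()
```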
 17–21 February: reading week. 24 February: review. 26 February: midterm! 28 February: no class.
 2–4 March: Inner products and norms. Remembering the desirable properties
of the familiar dot product, we define inner products in general.
Exercise: the definition only required linearity on the left-hand side.
Check that, with properties 1 to 3, you can prove that, if
\(\left<\,,\,\right>\) is an inner product on \(V\), then also
\[ \left< u,v+w\right> =\left< u,v\right> +
\left< u,w\right> \]
for all vectors \(u,v,w\in V\), and
\[ \left< u,cv\right> = \overline{c}\left< u,v\right>\]
for all scalars \(c\) and vectors \(u,v\in V\).
 6 March: Orthogonality and the Gram-Schmidt process. Examples:
\(\{f_n=e^{i n t}\colon n\in{\mathbb Z}\}\), the Hermite polynomials, and
an orthonormal basis for \(2\times2\) matrices with respect to the
Frobenius inner product. Fourier coefficients.
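Here is a minimal sketch of the Gram-Schmidt process for the real dot product, assuming numpy is available; the input vectors are a made-up example:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent real vectors (sketch)."""
    basis = []
    for v in vectors:
        # Subtract the Fourier coefficients <v, e> e along the basis so far.
        w = v - sum(np.dot(v, e) * e for e in basis)
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

# Usage: orthonormalize three vectors in R^3.
Q = gram_schmidt([np.array([1., 1., 0.]),
                  np.array([1., 0., 1.]),
                  np.array([0., 1., 1.])])
# The rows of Q are orthonormal, so Q Q^T = I.
print(np.allclose(Q @ Q.T, np.eye(3)))   # True
```

For a complex inner product you would replace `np.dot(v, e)` by a conjugate-linear pairing; this sketch handles only the real case.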
 9 March: Orthogonal complements: if \(W\) is a subspace of a
finite-dimensional inner product space \(V\), then
\[V\cong W\oplus W^{\perp}.\] An application: Bessel's inequality.
Quick exercise: check directly that \(W\cap W^\perp=\{0\}\).
 11 March: the adjoint of a linear transformation. Here's the
high-level summary. If \(T\colon V\to W\) is a linear transformation
between finite-dimensional inner product spaces, there is a unique
transformation \(T^*\colon W\to V\) satisfying
\[\left< T(v),w\right>_W=\left< v,T^*(w)\right>_V\]
for all \(v\in V, w\in W\). To construct it, check that an inner
product gives isomorphisms \(V\cong V^*\) and \(W\cong W^*\): use
them, together with the dual map \(T^\vee\colon W^*\to V^*\).
Watch out for the notation! It seems "\(\cdot^*\)" can mean dual
space, adjoint, or conjugate-transpose, depending on where it
appears. The matrix of \(T^*\) with respect to an orthonormal basis
is just the (conjugate) transpose of that of \(T\).
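A quick numerical sanity check of the defining property, using the standard inner product on \({\mathbb C}^n\) and the conjugate transpose as the adjoint (assuming numpy; the matrices here are random, not from class):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2)) + 1j * rng.standard_normal((3, 2))
v = rng.standard_normal(2) + 1j * rng.standard_normal(2)
w = rng.standard_normal(3) + 1j * rng.standard_normal(3)

def inner(x, y):
    # Standard inner product on C^n, conjugate-linear in the second slot;
    # np.vdot conjugates its *first* argument, hence the swapped order.
    return np.vdot(y, x)

A_star = A.conj().T   # the adjoint = conjugate transpose
# <Av, w> = <v, A* w> for all v, w.
print(np.isclose(inner(A @ v, w), inner(v, A_star @ w)))   # True
```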
 13, 16 March: no class 😢
 18 March: An application: approximate solutions to overdetermined
systems of linear equations. If a matrix equation \(Ax=y\) doesn't have
a solution, consider orthogonal projection of \(y\) onto the column
space of \(A\). This gives you a vector of the form \(y_0=Ax_0\),
minimizing the distance \(\|y_0-y\|\). (Exercise: why is that?)
Once you believe that this is what you want to do, though, it isn't
too hard to calculate that \(x_0=(A^*A)^{-1}A^*y,\) noting that
the square matrix \(A^*A\) is invertible by our rank calculation from
last week. You can obtain the vector \(y_0\) by multiplying both
sides by \(A\). Check out the textbook for the example from class,
fitting a line through points in the plane.
Next up: if a linear operator \(T\colon V\to V\) has \(\lambda\) as an
eigenvalue, then \(T^*\) has \(\overline{\lambda}\) as an eigenvalue.
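The normal-equations recipe above can be sketched on a small made-up data set (assuming numpy; this is not the example from class or the textbook):

```python
import numpy as np

# Fit a line y = a + b x through points that are not collinear.
x = np.array([0., 1., 2., 3.])
y = np.array([1., 2., 2., 4.])

A = np.column_stack([np.ones_like(x), x])   # columns span the "line" model
# Normal equations: x0 = (A^* A)^{-1} A^* y.
x0 = np.linalg.solve(A.T @ A, A.T @ y)
# y0 = A x0 is the orthogonal projection of y onto the column space of A.
y0 = A @ x0

# Same answer from numpy's built-in least-squares solver.
x0_np, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(x0, 6))            # the fitted line is y = 0.9 + 0.9 x
print(np.allclose(x0, x0_np))     # True
```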
 20 March: struggling a bit with the technology, I showed that,
if the characteristic polynomial of \(T\) splits, then there exists
an orthonormal basis \(\beta\) for \(V\) with the property that
\([T]_{\beta}\) is an upper-triangular matrix. We defined an operator
\(T\) to be self-adjoint if \(T=T^*\), and
normal if \(TT^*=T^*T\). Examples: self-adjoint operators are
normal, of course, but not all normal operators are self-adjoint.
 23 March: more examples of normal operators. A real matrix \(A\) is
orthogonal if \(A^t=A^{-1}\). (E.g., the matrices for
rotation in the plane are orthogonal.) To say that \(A^tA=I_n\) just
means the columns (or, equivalently, the rows) of \(A\) form an
orthonormal basis for \(F^n\). The complex version of this: a complex
matrix \(U\) is said to be unitary if \(U^*=U^{-1}.\) An
operator \(T\) is orthogonal (or unitary) if \([T]_{\beta}\) is
an orthogonal (or unitary) matrix with respect to an orthonormal
basis \(\beta\). Now things start to get interesting.
Theorem: If \(T\) is a normal operator and its characteristic polynomial
splits, then \(V\) has an orthonormal basis of eigenvectors of \(T\), and conversely.
We looked at examples in both the real and complex cases. Something
more special happens for real matrices:
Theorem: If \(V\) is a (finite-dimensional) real inner product space,
an operator \(T\colon V\to V\) has an orthonormal basis of
eigenvectors if and only if \(T\) is selfadjoint.
This means that real, symmetric matrices are always diagonalizable,
and you can find orthonormal bases of eigenvectors for them!
See the OWL forums for some key examples.
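You can check this with numpy's `eigh`, which is designed for symmetric and Hermitian matrices (a made-up symmetric matrix, assuming numpy is available):

```python
import numpy as np

# A hypothetical real symmetric matrix.
S = np.array([[2., 1., 0.],
              [1., 2., 1.],
              [0., 1., 2.]])

# eigh returns real eigenvalues and an orthonormal basis of
# eigenvectors (the columns of Q).
vals, Q = np.linalg.eigh(S)
print(np.allclose(Q.T @ Q, np.eye(3)))              # True: orthonormal columns
print(np.allclose(S, Q @ np.diag(vals) @ Q.T))      # True: S = Q D Q^T
```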
 25 March: orthogonal and unitary operators. They are
characterized by preserving lengths and inner products.
Their eigenvalues have modulus 1. Examples.
 27 March: more about orthogonal and unitary operators.
Unitary operators are closed under composition and inverse: that
is, they form a group. On the other hand, Hermitian
(i.e., self-adjoint) operators form a vector space, but they're not
closed under composition.
 29 March: the spectral theorem. First, though, we define
orthogonal projections as operators \(T\colon V\to V\) whose
image and nullspace are mutually orthogonal. Then we show
this is equivalent to having the properties that \(T=T^2\)
and \(T=T^*\). The first property is sometimes called
idempotence. So orthogonal projections are the same
as selfadjoint, idempotent operators.
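A sketch of both characterizing properties for the projection onto a column space (a hypothetical example, assuming numpy is available):

```python
import numpy as np

# Orthogonal projection onto the column space of a full-column-rank A:
# P = A (A^* A)^{-1} A^*.
A = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

# The two characterizing properties from class:
print(np.allclose(P @ P, P))    # True: idempotent, P^2 = P
print(np.allclose(P, P.T))      # True: self-adjoint, P = P^*
```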
 1 April: proof of the spectral theorem. The matrix version
goes like this: suppose \(A\) is an \(n\times n\) normal matrix (if \(F={\mathbb C}\)) or a symmetric matrix (if \(F={\mathbb R}\)). Then there is a
unitary matrix \(Q\) and a diagonal matrix of eigenvalues
\(D\) for which \[ A = QDQ^*.\]
Moreover, if we write \(Q=\big(A_1\;\cdots\;A_k\big)\) where the columns of
\(A_i\) all correspond to an eigenvalue \(\lambda_i\), for
\(1\leq i\leq k\), the matrices \(A_iA_i^*\) are the orthogonal
projections onto the eigenspaces.
Then \[ A = \sum_{i=1}^k \lambda_i A_iA_i^*\quad\text{and}\quad
I_n = \sum_{i=1}^k A_iA_i^*.\]
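The matrix version can be verified numerically on a small symmetric example (assuming numpy; the matrix is made up, with simple eigenvalues so each \(A_i\) is a single column of \(Q\)):

```python
import numpy as np

# Verify the matrix form of the spectral theorem for a real symmetric A.
A = np.array([[2., 1.],
              [1., 2.]])
vals, Q = np.linalg.eigh(A)     # A = Q D Q^*, Q unitary (here orthogonal)

# Each A_i A_i^* is the rank-one orthogonal projection onto an eigenspace.
projections = [np.outer(Q[:, i], Q[:, i]) for i in range(len(vals))]

# A = sum_i lambda_i A_i A_i^*  and  I = sum_i A_i A_i^*.
print(np.allclose(A, sum(lam * Pi for lam, Pi in zip(vals, projections))))  # True
print(np.allclose(np.eye(2), sum(projections)))                             # True
```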
Syllabus
From the academic calendar: "A continuation of the material of Mathematics 2120A/B including properties of complex numbers and the principal axis theorem; singular value decomposition; linear groups; similarity; Jordan canonical form; Cayley-Hamilton theorem; bilinear forms; Sylvester's theorem."
Less formally, Math 2120 sets the stage for linear algebra by introducing
vector spaces, bases, and linear transformations. Math 3121 continues with
a range of fun topics that all build on that foundation. Some of these
are of great practical use in applications, like the singular value
decomposition. Others, like the study of bilinear forms, play a basic role
in geometry and physics.
Assignments
Linear algebra is a skill to develop, and practice is essential.
Homework assignments, approximately
biweekly, will be the most important part of the course. You are encouraged
to take them seriously and budget at least three hours per week for homework.
Assignments are here.
Assignments will be submitted through
Gradescope.
Some of the assignment problems will be routine, and some will take
some
thought. Collaborating with other people can add a lot to the
experience of doing math, and I encourage you to do so.
Just make sure to write
your
own solutions, your own way, and to acknowledge any debts you may
have. Ask me if in doubt, since presenting the work of others as
your own constitutes a serious academic offence.
There will be at most six assignments. If you submit
all of them, I will drop your lowest homework score.
Exams
There was a midterm on February 26th. The final exam will be a take-home
exam, available at noon on April 23rd for 24 hours. Here are some
practice problems to help with your review.
And some
solutions to go with them.
COVID19 update
Classes move online starting on March 18th. The final exam will be a take-home
exam which I'll ask you to complete and return via Gradescope, on or around
April 23rd.
Math 9050b
The MSc version of this course includes slightly
different homework problems, and an additional self-directed
written project, to be chosen at the start of term. In this case, the
evaluation is weighted as 20% final exam; 30% midterm; 30%
assignments; 20% project (revised from 30% final exam, 25% midterm,
25% assignments, 20% project). The
project is due on the first Monday after the last lecture.
Further information
Academic dishonesty:
Scholastic offences are taken seriously and students are directed to read the official policy.
Accessibility Statement:
Please contact the course instructor if you require material in an
alternate format or if you require any other arrangements to make this
course
more accessible to you. You may also wish to contact Services for
Students with Disabilities (SSD) at 661-2111 ext. 82147
for any specific question regarding an accommodation.
Support Services:
Learning-skills counsellors at the Student Development Centre
are ready to help you improve your learning skills.
Students who are in emotional/mental distress should refer to
Mental Health@Western
for a complete list of options about how to obtain help.
Additional student-run support services are offered by the
USC.
The website for Registrarial Services is
http://www.registrar.uwo.ca.
Eligibility:
You are responsible for ensuring that you have successfully completed
all course prerequisites and that you have not taken an antirequisite
course.
Unless you have either the requisites for this course or written special permission
from your Dean to enroll in it, you may be removed from this course and it will be
deleted from your record. This decision may not be appealed. You will receive
no adjustment to your fees in the event that you are dropped from a course for
failing to have the necessary prerequisites.
Common Course Policies and information:
please click here.