Over the past couple of months, I gave weekly lectures on applied linear algebra. The lectures cover a grab-bag of topics which I've needed to know for my own work, but which typically either aren't covered in courses or are covered only briefly in advanced courses which use them (e.g. quantum mechanics). The series is now complete, and recordings of all the lectures are available here.

Be warned: all of the lectures were given with zero review and minimal prep. There are errors. There are poor explanations and too few examples. There are places where I only vaguely gesture at an idea and then say to google it if and when you need it. The flip side is that you will see only things I know off the top of my head - and therefore things which I've found useful enough often enough to remember.

Outline of Topics

Lecture 1

  • Prototypical use cases of linear algebra
    • First-order approximation of systems of equations for solving or stability analysis
    • Second-order approximation of a scalar function in many dimensions for optimization or for characterizing the shape of a peak/bowl
    • First-order approximation of a dynamical system near a steady state
    • Principal components of a covariance matrix
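
As a concrete instance of the last bullet, here's a minimal sketch (mine, not from the lecture; the data and dimensions are made up): principal components are the eigenvectors of the covariance matrix, sorted by decreasing eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up data: 1000 samples of a correlated 5-dimensional random vector.
X = rng.normal(size=(1000, 5)) @ rng.normal(size=(5, 5))

# Covariance matrix of the data (rows are samples, so rowvar=False).
cov = np.cov(X, rowvar=False)

# Principal components are the eigenvectors of the symmetric covariance
# matrix; eigh returns eigenvalues in ascending order, so reverse to put
# the highest-variance directions first.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
variances = eigvals[order]        # variance along each principal direction
components = eigvecs[:, order]    # columns are the principal directions
```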

Lecture 2

  • Working with efficient representations of large matrices
    • Tricks for Jacobian and Hessian matrices (see the sketch after this list)
    • Prototypical API for implicit matrix representations: scipy's LinearOperator
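
A minimal sketch tying the two bullets together (the objective and sizes are made up, and this finite-difference trick is one standard approach, not necessarily the lecture's): a Hessian-vector product is a directional derivative of the gradient, so it costs two gradient evaluations and the n×n Hessian is never formed. Wrapping the matvec in scipy's LinearOperator lets iterative solvers use it directly.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

# Made-up objective: f(x) = sum(x^4)/4 + ||x||^2/2, with analytic gradient.
def grad(x):
    return x**3 + x

n = 10_000
x0 = np.linspace(-1.0, 1.0, n)

# Trick: the Hessian-vector product H @ v is a directional derivative of
# the gradient, so two gradient calls suffice; no n x n matrix is formed.
def hess_vec(v, h=1e-6):
    v = np.ravel(v)
    return (grad(x0 + h * v) - grad(x0 - h * v)) / (2 * h)

H = LinearOperator((n, n), matvec=hess_vec)

# Anything that only needs matvecs accepts this, e.g. conjugate gradients
# for a Newton step: solve H p = -grad(x0).
p, info = cg(H, -grad(x0))
assert info == 0  # converged
```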

Lecture 3

  • Suppose we look at a matrix (e.g. using pyplot.matshow()). What patterns are we most likely to see, and what can we do with them?
    • Recognizing sparse & low-rank structure
    • Interpreting sparse & low-rank structure
    • Leveraging sparse & low-rank structure
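
As one example of "leveraging" (my own sketch, with made-up sizes): if a matrix is diagonal-plus-low-rank, A = D + U Vᵀ, the Woodbury identity turns the n×n solve into a k×k one.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 2_000, 10

# A = D + U V^T: diagonal plus rank k. A dense solve costs O(n^3);
# exploiting the structure costs O(n k^2 + k^3) and never forms A.
d = 1.0 + rng.random(n)            # diagonal of D, kept positive
U = rng.normal(size=(n, k))
V = rng.normal(size=(n, k))
b = rng.normal(size=n)

# Woodbury identity:
#   (D + U V^T)^{-1} b = D^{-1} b - D^{-1} U (I + V^T D^{-1} U)^{-1} V^T D^{-1} b
Dinv_b = b / d
Dinv_U = U / d[:, None]
small = np.eye(k) + V.T @ Dinv_U   # only a k x k system
x = Dinv_b - Dinv_U @ np.linalg.solve(small, V.T @ Dinv_b)

# Check against the dense solve (feasible at this size, not in general).
assert np.allclose(x, np.linalg.solve(np.diag(d) + U @ V.T, b))
```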

Lecture 4

  • Matrix calculus, with a focus on stability of eigendecomposition
    • Basics: tensor notation
    • Differentiating eigendecomposition
    • Instability of eigenvectors of (approximately) repeated eigenvalues
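
The instability in the last bullet is easy to see numerically. A small demo (mine, not from the lecture): perturb a symmetric matrix whose top two eigenvalues differ by less than the perturbation size; the eigenvalues barely move, but the associated eigenvectors rotate wildly within the near-degenerate subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two eigenvalues separated by a gap far smaller than the perturbation.
gap = 1e-12
A = np.diag([1.0, 1.0 + gap, 2.0])

# Tiny symmetric perturbation, but still much larger than the gap.
E = 1e-9 * rng.normal(size=(3, 3))
E = (E + E.T) / 2

w_a, v_a = np.linalg.eigh(A)
w_b, v_b = np.linalg.eigh(A + E)

# Eigenvalues of symmetric matrices are well-conditioned: they move by
# roughly ||E|| ~ 1e-9.
print(np.abs(w_a - w_b).max())

# Eigenvectors of the near-degenerate pair are not: the rotation within
# the subspace is governed by ||E|| / gap >> 1, so the overlap between
# "the same" eigenvector before and after is far from 1.
print(abs(v_a[:, 0] @ v_b[:, 0]))
```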

Lecture 5

  • Leveraging symmetry
    • Suppose my system is invariant under some permutation (e.g. a PDE with wraparound boundary, or exchangeable variables in a covariance matrix). How can I leverage that to more efficiently find an eigendecomposition (and invert the matrix, etc.)? See the circulant sketch after this list.
    • What Fourier transforms have to do with symmetry, and how to compute them quickly
  • How to represent rotations/orthogonal matrices
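
For the symmetry bullets, the classic special case is a circulant matrix (invariant under cyclic shifts, e.g. a wraparound-boundary discretization), which the discrete Fourier transform diagonalizes. A sketch assuming that structure, with made-up data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024

# A circulant matrix C is determined by its first column c and is invariant
# under cyclic permutation of coordinates. The DFT diagonalizes every such
# matrix, so its eigendecomposition costs O(n log n) via the FFT.
c = rng.normal(size=n)
eigenvalues = np.fft.fft(c)   # eigenvalues of C, without ever forming C

# Solve C x = b: transform, divide by the eigenvalues, transform back
# (assumes no eigenvalue is zero, which holds generically here).
b = rng.normal(size=n)
x = np.fft.ifft(np.fft.fft(b) / eigenvalues).real

# Check against the dense matrix (only feasible at modest n).
C = np.stack([np.roll(c, j) for j in range(n)], axis=1)
assert np.allclose(C @ x, b)
```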
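
And for the rotations bullet, one standard representation (a minimal sketch, not necessarily the lecture's): every rotation is the matrix exponential of a skew-symmetric matrix, so the n(n-1)/2 entries above the diagonal parameterize the rotation group smoothly.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Any skew-symmetric S exponentiates to an orthogonal matrix with det +1.
S = rng.normal(size=(3, 3))
S = S - S.T                               # make it skew-symmetric: S^T = -S

R = expm(S)
assert np.allclose(R.T @ R, np.eye(3))    # orthogonal
assert np.isclose(np.linalg.det(R), 1.0)  # a proper rotation
```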

Lecture 6

  • Wedge products: those "dx dy dz" things in integrals
    • How to do coordinate transformations with things like "dx dy", even when embedded in a higher-dimensional space (see the sphere-area sketch after this list)
  • Map between function operations/properties and matrix operations/properties
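
For the wedge-product bullet, a numerical sketch (my own example): the "dx dy"-style area element pulled back through a parameterization is √det(JᵀJ) du dv, where J is the 3×2 Jacobian of the embedding. Integrating it over the unit sphere should give 4π.

```python
import numpy as np

# Parameterize the unit sphere by (u, v) = (longitude, latitude).
def r(u, v):
    return np.array([np.cos(u) * np.cos(v),
                     np.sin(u) * np.cos(v),
                     np.sin(v)])

def area_element(u, v, h=1e-6):
    # Finite-difference 3x2 Jacobian of the embedding R^2 -> R^3; the wedge
    # r_u ∧ r_v has magnitude sqrt(det(J^T J)), the local area scale factor.
    J = np.stack([(r(u + h, v) - r(u - h, v)) / (2 * h),
                  (r(u, v + h) - r(u, v - h)) / (2 * h)], axis=1)
    return np.sqrt(np.linalg.det(J.T @ J))

# Crude midpoint-rule integral over the whole sphere; exact answer is 4*pi.
nu, nv = 200, 100
du, dv = 2 * np.pi / nu, np.pi / nv
us = -np.pi + (np.arange(nu) + 0.5) * du
vs = -np.pi / 2 + (np.arange(nv) + 0.5) * dv
total = sum(area_element(u, v) for u in us for v in vs) * du * dv
print(total, 4 * np.pi)   # both ~12.566
```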
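
And for the function↔matrix dictionary, a tiny illustration (mine): discretize functions on a periodic grid, so d/dx becomes a matrix, pointwise multiplication by a function becomes a diagonal matrix, and properties of the operators carry over to properties of the matrices.

```python
import numpy as np

n = 256
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
dx = x[1] - x[0]

# d/dx as a circulant central-difference matrix on the periodic grid.
D = (np.roll(np.eye(n), 1, axis=1) - np.roll(np.eye(n), -1, axis=1)) / (2 * dx)

# Function property -> matrix property: d/dx is anti-self-adjoint, and
# indeed D is anti-symmetric (so its eigenvalues are purely imaginary).
assert np.allclose(D, -D.T)

# Applying the operator: the derivative of sin is (approximately) cos.
assert np.allclose(D @ np.sin(x), np.cos(x), atol=1e-3)

# Pointwise multiplication by a function g <-> a diagonal matrix.
g = np.sin(x)
M = np.diag(g)
assert np.allclose(M @ np.cos(x), g * np.cos(x))
```
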
Comments

Do you want to get some transcripts for this? I could do it pretty cheaply using whisper and editing the maths. I think I'd charge $30/hour of work.

I already know what I say in the videos. The big question is whether anyone else wants transcripts for this, especially enough to pay for them. Peanut gallery?

Interested, but depends on the cost. If I'm the only one who wants it, I'd be willing to pay $30 to get the whole series, but probably not more. I don't know how long transcriptions usually take, but I'm guessing it'd certainly be >1h. So there'd need to be additional interest to make it worth it.

Thank you for posting this. Been 'levelling up' my maths for machine learning lately and this is just perfect.

Some suggestions:

  • Use a better marker; what you wrote on the whiteboard is almost unreadable for me.
  • Expanding on the previous point, write bigger and make better use of the space on the board.
  • If you have complex graphics, pre-make them accurately, print them out, and put them on the whiteboard with weak tape. (What is the weird bridge at the start?)
  • Use Whisper to make subtitles to help non-native speakers (as another commenter suggested).
  • Invest in a tripod so the camera sits at a natural height instead of shooting upward from below.
  • I did not watch the lectures in full; this feedback is from skimming around in the first one.
  • In lecture 3, visualizing the Fourier (or Discrete Cosine Transform) matrix could be really interesting.
  • Are the lectures ordered by something or following a thread? If so, you should make that clear by posting an overview lecture at the start.
  • Do not wear glasses; they make you look weird and lose your connection with the viewer.
  • Thanks a lot for your effort; we really need to spread mathematical education for the benefit of society as a whole.

I don't think these are necessarily bad suggestions for a future series. But my sense is that John did this for the people in the audience, somebody asked him to record it, so he did, and now he's putting the recordings online in case they're useful to anyone. It's very hard to make good production-quality lectures, and it would have required more effort. But it sounds like John knew this and decided he would rather spend his time elsewhere, which is completely his choice to make. As written, these suggestions feel a bit pushy to me.

Yes, the tone of my comment could be improved. I appreciate him publishing his lessons for the community, and I wanted to give some suggestions to improve possible future ones, if he feels the higher quality is worth the higher effort, with no obligation. "Al caval donato non si guarda in bocca" (don't look a gift horse in the mouth).