Into the Kiln: Insights from Tao's 'Analysis I'

by TurnTrout, 1st Jun 2018


Note: real analysis is not on the MIRI reading list (although I think it should be).


As a young boy, mathematics captivated me.

In elementary school, I'd happily while away entire weekends working through the next grade's math book. I was impatient.

In middle school, I'd lazily estimate angles of incidence that would result if I shot lasers from my eyes, tracing their trajectories within the classroom and out down the hallway. I was restless.

In high school, I'd daydream about what would happen to integrals as I twisted functions in my mind. I was curious.

And now, I get to see how it's all put together. Imagine being fascinated by some thing, continually glimpsing beautiful new facets and sampling exotic flavors, yet being resigned to not truly pursuing this passion. After all, I chose to earn a computer science degree.


Analysis I

As in Linear Algebra Done Right, I completed every single exercise in the book - this time, without looking up any solutions (although I did occasionally ask questions on Discord). Instead, I came back to problems if I couldn't solve them after half an hour of effort.

A sampling of my proofs can be found here.

1: Introduction

2: The Natural Numbers

In which the Peano axioms are introduced, allowing us to define addition and multiplication on the natural numbers $\mathbb{N}$.
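Tao defines addition and multiplication by recursion on the successor operation. As a quick sketch (my own illustration, not from the book — plain integers stand in for successor chains):

```python
# Sketch (not from the book): Peano-style addition and multiplication,
# defined by recursion on the first argument, where n stands for n
# applications of the successor to 0.

def add(n: int, m: int) -> int:
    """add(0, m) = m; add(succ(n), m) = succ(add(n, m))."""
    if n == 0:
        return m
    return add(n - 1, m) + 1  # succ(add(n-1, m))

def mul(n: int, m: int) -> int:
    """mul(0, m) = 0; mul(succ(n), m) = add(mul(n, m), m)."""
    if n == 0:
        return 0
    return add(mul(n - 1, m), m)

print(add(3, 4))  # 7
print(mul(3, 4))  # 12
```

Everything else (commutativity, associativity, distributivity) is then proved by induction on these two recursions.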

3: Set Theory

In which functions and Cartesian products are defined, among other concepts.

Recursive Nesting

How can you apply the axiom of foundation if sets are nested in each other? That is, how can the axiom of foundation "reach into" nested sets like $\{\{\emptyset\}\}$ and $\{\{\{\emptyset\}\}\}$?

Show that if $A$ and $B$ are two sets, then either $A \notin B$ or $B \notin A$ (or both).

Proof. Suppose $A \in B$ and $B \in A$. By the pairing axiom, we know that there exists a set $\{A, B\}$. We see that there does not exist an $x \in \{A, B\}$ such that $x \cap \{A, B\} = \emptyset$. That is, if we choose $x = A$, one of its elements is $B$, which is also an element of $\{A, B\}$ - this violates the axiom of foundation. The same reasoning applies if we choose $x = B$. Then $A \in B$ and $B \in A$ cannot both hold, so either $A$ or $B$ (or both) is not an element of the other.
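To make the axiom concrete, here's a small sketch (my own illustration; `satisfies_foundation` is a hypothetical helper). Python's `frozenset` plays the role of a hereditarily finite set — and, pleasingly, mutual membership can't even be constructed this way, since a frozenset must be built from already-existing sets:

```python
# Sketch (my own illustration): the axiom of foundation says every
# nonempty set S has an element x with x ∩ S = ∅. Modeling hereditarily
# finite sets as frozensets, mutual membership (A ∈ B and B ∈ A) cannot
# even be constructed, and the check below always succeeds.

def satisfies_foundation(s: frozenset) -> bool:
    """True iff s is empty or some element of s is disjoint from s."""
    return (not s) or any(not (x & s) for x in s)

empty = frozenset()
A = frozenset({empty})    # {∅}
B = frozenset({A})        # {{∅}}
pair = frozenset({A, B})  # {A, B}, as given by the pairing axiom

print(satisfies_foundation(pair))  # True: A = {∅} is disjoint from {A, B}
```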

4: Integers and Rationals

In which the titular sets are constructed, allowing the exploration of absolute value, exponentiation, and the incompleteness (colloquially, the "gaps") of the rationals.

Readers newer to mathematics may find it interesting that even though there are (countably) infinitely many rational numbers between any two distinct rationals, the rationals still contain gaps.
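Both facts can be poked at with exact rational arithmetic. A sketch (my own illustration), bisecting toward the "gap" at $\sqrt{2}$:

```python
# Sketch (my own illustration): between any two rationals lies their
# midpoint (density), yet bisecting on "q^2 < 2" closes in on a point
# the rationals miss entirely (a gap).
from fractions import Fraction

def midpoint(p: Fraction, q: Fraction) -> Fraction:
    return (p + q) / 2  # always another rational: there is no "next" rational

lo, hi = Fraction(1), Fraction(2)
for _ in range(50):  # bisect toward sqrt(2)
    mid = midpoint(lo, hi)
    if mid * mid < 2:
        lo = mid
    else:
        hi = mid

print(hi - lo)  # width 2^-50: rationals squeeze in from both sides...
print(lo * lo == 2 or hi * hi == 2)  # ...but neither endpoint is sqrt(2)
```

No matter how many bisections we run, `lo * lo < 2 < hi * hi` holds exactly — the supremum of the left set simply isn't rational.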

5: The Real Numbers

In which Cauchy sequences allow us to formally construct the reals.
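As a sketch of the idea (my own illustration, using decimal truncations): a real number is an equivalence class of Cauchy sequences of rationals, where two sequences are equivalent when their difference tends to zero. Here, approximations of $\sqrt{2}$ from below and from above are distinct sequences naming the same real:

```python
# Sketch (my own illustration): two different Cauchy sequences of
# rationals that represent the same real number, sqrt(2).
from fractions import Fraction

def below(n: int) -> Fraction:
    """Largest k/10^n whose square is < 2: approximates sqrt(2) from below."""
    k = 0
    while Fraction(k + 1, 10**n) ** 2 < 2:
        k += 1
    return Fraction(k, 10**n)

def above(n: int) -> Fraction:
    return below(n) + Fraction(1, 10**n)  # one step up: the square now exceeds 2

# The difference is exactly 10^-n, which tends to 0, so the two
# sequences are equivalent and name the same real:
print([above(n) - below(n) for n in range(1, 5)])
```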

6: Limits of Sequences

In which we meet convergence and its lovely limit laws, extend the reals to cover infinities, experience the delightfully-counterintuitive $\limsup$ and $\liminf$, and complete our definition of real exponentiation.

Upper-Bounded Monotonic Sequence Convergence

I tried to come up with a clever title here - I really did. Apparently even my punmaking abilities are bounded.

Suppose you have a monotonically increasing sequence $(a_n)_{n=m}^\infty$ with an upper bound $M$. Then the sequence converges; this also applies to lower-bounded monotonically decreasing sequences.

Weird, right? I mean, even though the sequence monotonically increases and there's an upper bound, there are still uncountably infinitely many "places" the sequence can "choose" to go. So what gives?

Proof. Let $\epsilon > 0$, and suppose for contradiction that the sequence is not eventually $\epsilon$-close to any real number. Take $c = a_m$ as a first candidate. Let $N$ be such that for all $n \ge N$, either $a_n \ge c + \epsilon$ or $a_n \le c - \epsilon$; we know that $N$ exists because the sequence is monotone increasing. By the Archimedean principle, there exists some $k$ such that $k\epsilon > M - a_m$.

In the second case we would have $a_n \le a_m - \epsilon < a_m$, which contradicts the fact that the sequence is monotone increasing, so the first case must hold. By repeating the above argument $k$ times, each time taking $a_N$ as the new candidate (so the candidate gains at least $\epsilon$ per repetition), we have that $a_n > a_m + k\epsilon > M$ for some $n$, which is contradictory. Then for any $\epsilon > 0$, the sequence must be eventually $\epsilon$-close to some $c_\epsilon$. Intuitively, for any given $\epsilon$, the sequence can only "escape" a limit a finite number of times before it runs out of room and has to be $\epsilon$-close.

Next, we show that the $c_{1/k}$'s form a Cauchy sequence. Let $\epsilon > 0$, and set $K \ge 1$ such that $\frac{2}{K} \le \epsilon$. The sequence is eventually $\frac{1}{j}$-close to $c_{1/j}$, so there exists an $N_j$ such that for all $n \ge N_j$ we have $|a_n - c_{1/j}| \le \frac{1}{j}$. Similar arguments hold for $c_{1/k}$. Set $N = \max(N_j, N_k)$; now, for $j, k \ge K$, $|c_{1/j} - c_{1/k}| \le |c_{1/j} - a_N| + |a_N - c_{1/k}| \le \frac{1}{j} + \frac{1}{k} \le \frac{2}{K} \le \epsilon$. But $\epsilon$ is arbitrary, so we can easily see that the sequence $(c_{1/k})_{k \ge 1}$ is Cauchy.

As the real numbers are complete, this sequence converges to some $c$. Since the main sequence is eventually $\frac{1}{k}$-close to $c_{1/k}$, and $(c_{1/k})$ converges to $c$, by the triangle inequality we have that the main sequence converges to $c$.
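A quick numerical sanity check of the theorem (my own sketch, using $a_n = 2 - 2^{-n}$, which increases toward its supremum $2$):

```python
# Sketch (my own illustration): a monotone increasing sequence bounded
# above by M = 2 is eventually eps-close to its supremum, 2.

def a(n: int) -> float:
    return 2.0 - 2.0 ** -n  # monotone increasing, bounded above by 2

def tail_is_eps_close(limit: float, eps: float, start: int, count: int) -> bool:
    """Check |a_n - limit| <= eps for n = start, ..., start + count - 1."""
    return all(abs(a(n) - limit) <= eps for n in range(start, start + count))

eps = 1e-6
# Once 2^-N <= eps (N = 20 gives 2^-20 ≈ 9.5e-7), every later term is
# eps-close to the limit 2:
print(tail_is_eps_close(2.0, eps, 20, 1000))  # True
```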

7: Series

In which we finally reach concepts from pre-calculus.


The Cauchy condensation test says that for a sequence $(a_n)_{n=1}^\infty$ where $a_n \ge 0$ and $a_{n+1} \le a_n$ for all $n$, the series $\sum_{n=1}^\infty a_n$ converges iff the series $\sum_{k=0}^\infty 2^k a_{2^k}$ converges. Using this result, we have that the harmonic series $\sum_{n=1}^\infty \frac{1}{n}$ diverges: its condensed series is $\sum_{k=0}^\infty 2^k \cdot \frac{1}{2^k} = \sum_{k=0}^\infty 1$, whose partial sums grow without bound.

What was initially counterintuitive is that even though $\lim_{n \to \infty} \frac{1}{n} = 0$, the series doesn't converge. The best intuition I've come up with is that the harmonic series doesn't "deplete" its domain quickly enough, so you can get arbitrarily large partial sums.
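The condensation computation can be sketched numerically (my own illustration): the condensed series adds $1$ per block, while the harmonic partial sums crawl upward like $\ln n$ — slowly, but without bound:

```python
# Sketch (my own illustration): partial sums of the harmonic series vs.
# its Cauchy condensation. The condensed series is 1 + 1 + 1 + ...,
# while H_n grows like ln(n) + 0.577... (the Euler–Mascheroni constant).
import math

def harmonic(n: int) -> float:
    return sum(1.0 / k for k in range(1, n + 1))

def condensed(K: int) -> float:
    return sum(2**k * (1.0 / 2**k) for k in range(K + 1))  # = K + 1

print(condensed(9))                     # 10.0: one per block of terms
print(harmonic(2**9) - math.log(2**9))  # ≈ 0.577..., so H_n ~ ln(n)
```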

If you want proofs, here are twenty!

8: Infinite Sets

In which uncountable sets, the axiom of choice, and ordered sets brighten our lives.

9: Continuous Functions on $\mathbb{R}$

In which continuity, the maximum principle, and the intermediate value theorem make their debut.

Lipschitz Continuity $\Rightarrow$ Uniform Continuity

If a function $f: X \to \mathbb{R}$ (with $X \subseteq \mathbb{R}$) is Lipschitz-continuous for some Lipschitz constant $K > 0$, then by definition we have that for every $x, y \in X$,

$$|f(x) - f(y)| \le K|x - y|.$$

The definition of uniform continuity is

For every $\epsilon > 0$, there exists a $\delta > 0$ such that for all $x, y \in X$ such that $|x - y| < \delta$, we have $|f(x) - f(y)| < \epsilon$.

Lipschitz continuity implies uniform continuity (do you see why?), but the converse is not true. I mean, what kind of twisted person would come up with this kind of function?
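One standard counterexample (my own pick — not necessarily the function linked above) is $f(x) = \sqrt{x}$ on $[0, 1]$: it is uniformly continuous (since $|\sqrt{x} - \sqrt{y}| \le \sqrt{|x - y|}$, the choice $\delta = \epsilon^2$ works everywhere at once), but its difference quotients blow up near $0$, so no Lipschitz constant exists. A sketch:

```python
# Sketch (my own illustration): sqrt is uniformly continuous on [0, 1]
# but not Lipschitz — its difference quotients are unbounded near 0.
import math

def slope(x: float, y: float) -> float:
    """Difference quotient |f(x) - f(y)| / |x - y| for f = sqrt."""
    return abs(math.sqrt(x) - math.sqrt(y)) / abs(x - y)

# Near 0 the quotient behaves like 1/sqrt(x), so no single K bounds it:
print([slope(10.0 ** -k, 0.0) for k in (2, 4, 6)])  # roughly [10, 100, 1000]
```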

10: Differentiation of Functions

In which the basic rules of differential calculus are proven.

You know, I actually thought that I wouldn't have too much to explain in this post - the book went very smoothly up to this point. On the upside, we get to spend even more time together!

Differential Intuitions

Let me simply direct you to this excellent StackExchange answer.


We can understand $(f^{-1})'(y) = \frac{1}{f'(f^{-1}(y))}$ by simply thinking about $\frac{dx}{dy} = \frac{1}{dy/dx}$, which makes sense for the derivative of the inverse!
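A quick numerical check of the idea (my own sketch, using $f = \exp$ with inverse $\ln$ and a central-difference approximation):

```python
# Sketch (my own illustration): checking (f^-1)'(y) = 1 / f'(f^-1(y))
# for f = exp, whose inverse is log. The derivative of log at y should
# equal 1 / exp(log(y)) = 1 / y.
import math

def numeric_derivative(f, x: float, h: float = 1e-6) -> float:
    return (f(x + h) - f(x - h)) / (2 * h)  # central difference

y = 3.0
lhs = numeric_derivative(math.log, y)                  # (f^-1)'(y)
rhs = 1.0 / numeric_derivative(math.exp, math.log(y))  # 1 / f'(f^-1(y))
print(abs(lhs - rhs) < 1e-6)  # True: both are approximately 1/3
```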

L'Hôpital's Rule

Consider $f$ and $g$ differentiable on $(a, b)$ (for real numbers $a < b$). Then if