Prompted by this article from Quanta magazine about physicist Nicolas Gisin's recent work on intuitionism and the flow of time, I have done my best to explicate the philosophical and mathematical background to this subject for a non-technical audience. I find this topic to be a fascinating intersection of philosophical questions in the foundations of mathematics on the one hand, and our best understanding of the nature of physical reality on the other. It's a wonderful example of seemingly useless speculative ideas turning out to be profoundly useful in a future context.

Much of the research for this comes from the Stanford Encyclopedia of Philosophy article on Intuitionism, but I also drew on several other sources to round out my understanding. Full disclosure: I'm a trained philosopher with a serious layperson's interest in the foundations of mathematics, but I'm not an expert in any of the fields involved.


The Intuitionist philosophy of mathematics was created by the Dutch mathematician L. E. J. Brouwer. He first articulated the ideas in his PhD dissertation in 1907 and continued to develop them for the rest of his life. Many other mathematicians and philosophers have since built on Brouwer’s ideas, so there are now intuitionistic branches of mathematics, logic, and the philosophy of mathematics.

Brouwer’s ideas were inspired by Kant’s theory that mathematical knowledge originates in our perception of time.

Brouwer’s first big idea concerns our ability to perceive one moment in time giving way to another: from this we get 1, then 2, and the relationship between 1 and 2. Then in another moment we can relate this 2-moment to yet another moment, which we’ll call 3. The key thing here is that these moments of subjective time perception, and the experienced relationship of temporal flow between them, are the real mathematical objects. Our symbols and language, such as 1, 2, 3, +, -, etc., are just ways to talk about our experiences of temporally structured moments of reflection.

So we can understand the idea of an infinity of natural numbers by thinking of them as a sequence of reflection upon temporal moments that goes on and on without end, 1, 2, 3, 4…. For Intuitionists, there are only potential infinities, not actual infinities. What that means is that we cannot regard an infinity as something that exists as a completed object (it never exists fully at any moment in time). Instead, when we talk about an infinite set, like “the set of all natural numbers” what we should mean is, “the temporal process that generates each natural number with no limit on how long it may continue to run”.

It is quite natural for me to think of this in terms of computer programming. It is easy to write a program that prints 1, 2, 3, 4, and keeps running, printing bigger and bigger numbers with no programmed stopping point. But does this mean I’ve generated an actual infinite set of numbers? Of course not. In finite time, my computer will only ever print out a certain finite subset of the natural numbers. The process is infinite only in the sense that it has no natural stopping point (we are not considering contingent stopping points like my computer crashing or breaking down or losing power).
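Here's a minimal sketch of the kind of program I have in mind (Python; nothing about it is specific to intuitionism, it just makes the point vivid):

```python
# A process with no programmed stopping point. It enumerates the natural
# numbers one at a time, but at any actual moment it has only ever
# printed a finite initial segment of them.
n = 1
while True:
    print(n)
    n += 1
```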

In contrast to Intuitionism, classical mathematics regards infinite sets as well-defined abstract objects. So “the set of all natural numbers” is a valid mathematical object that exists presently and in every moment, just as much as the numbers 1 or 2. Although this is a different philosophy, the actual mathematical implications of these philosophical differences are not very significant when dealing with natural numbers, integers, or rational numbers (fractions). This is because in each of these cases, the basic members of the infinite sets are not themselves infinite.

In the set of naturals, {1, 2, 3, …}, each element can be produced through a finite (terminating) process. Similarly, every rational number can be expressed as a fraction built from integers, e.g., 2 + 23/54 = 131/54. So the list of all rationals is still generated from elements we can construct in a finite amount of time. This means that for the most part, we can treat math that is concerned with whole or fractional numbers pretty much the same way whether we adopt an intuitionistic viewpoint or a classical viewpoint.
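To make the "finite construction" point concrete, here is a rough sketch of my own (again in Python) of a process that generates every positive rational number, each one built in finitely many steps from a pair of integers:

```python
import itertools
from fractions import Fraction
from math import gcd

def rationals():
    """Yield every positive rational exactly once.

    Each individual fraction is a finite construction from two integers;
    only the enumeration as a whole is endless.
    """
    total = 2  # numerator + denominator
    while True:
        for numerator in range(1, total):
            denominator = total - numerator
            if gcd(numerator, denominator) == 1:  # skip duplicates like 2/4
                yield Fraction(numerator, denominator)
        total += 1

# The first few: 1, 1/2, 2, 1/3, 3, 1/4, 2/3, 3/2, 4, ...
print(list(itertools.islice(rationals(), 9)))
```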

However, all that changes when we come to real numbers. In math, real numbers are used to describe things that change continuously (which includes almost everything that physics talks about). In geometry, the length of a line segment would be expressed as a real number. There is a famous legend that Pythagoras believed every number could be expressed as a fraction of whole numbers, and when one of his followers demonstrated that the hypotenuse of a right triangle with legs of length 1 could not be expressed as a fraction, Pythagoras killed him on the spot for heresy (in modern terms, we’d say the length is the square root of 2).

The classical view of real numbers is that each real number is defined by an infinite string of digits. So, for example, you can represent the real number one as 1.0000000… The zeros go on forever. Why? Because suppose we just got lazy and said that for this real number we’re only going to write two zeros. Then we could ask: is 1.00 greater or smaller than 1.001? If we don’t know what comes after the second zero, we can’t answer this question. But it turns out that it is very important for the concept of a continuously changing quantity that there always be an answer to this question. A continuum requires a complete linear ordering of numbers. Therefore, in the classical view, the very nature of a real number is an infinite string of digits plus some scaling factor (i.e., where to put the decimal point).

Okay, so now you might be starting to see why the math starts to diverge more between the classical view and the intuitionistic view. According to the intuitionistic view, a real number is not a thing, not a completed object at a given moment in time. Rather, it is a process, an expanding series of digits. Brouwer called this a choice sequence. He imagined a free agent who picks each successive digit in the sequence. The next digit might be picked in a lawlike way or in a non-lawlike way. For example, I might decide to write down all the digits of pi: 3.14159265358979… This is a purely lawlike choice sequence. However, I might also write down the digits of pi for twenty digits and then just start picking random numbers. That is also a perfectly acceptable choice sequence for generating a real number.
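Here is a toy sketch (in Python) of the kind of choice sequence I just described; the first twenty decimal digits are hard-coded to follow the lawlike rule "agree with pi", and every digit after that is a free, unpredictable choice:

```python
import random

PI_DIGITS = "3.14159265358979323846"  # pi to 20 decimal places, hard-coded

def choice_sequence():
    """Yield the digits of a real number one moment at a time.

    The first twenty decimal digits follow a lawlike rule (they agree
    with pi); after that, each digit is a free, unpredictable choice.
    """
    for ch in PI_DIGITS:
        yield ch
    while True:
        yield str(random.randint(0, 9))

seq = choice_sequence()
# At any moment, only the digits chosen so far exist; the rest are open.
print("".join(next(seq) for _ in range(30)))
```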

However, the crucial thing here is that we never really know what the next digit in the sequence will be until after the point in time when it gets chosen. There is nothing in the definition of a real number that says that all the digits have to be generated according to a mechanical rule or algorithm. There is no way to see into the future of the process of choosing digits to know in advance how things turn out. So, for the intuitionist, mathematical reality itself is time-relative. After a digit has been chosen, the chooser can’t go back on the choice (doing so would just be to define a different real number). But before the next digit has been chosen, the reality of that digit (and all subsequent ones) is fundamentally open to multiple possibilities.

Incidentally, this also accords well with the intuitionist theory of logic, which makes mathematical truth time-relative. An intuitionist says that a statement like “A or not A” is neither true nor false at a point in time where we haven’t proven either A or not A. (In classical logic, the statement “A or not A” is logically true for any statement A, a tautology known as the law of the excluded middle.) For the intuitionist, the law of the excluded middle is invalid. Yet, "A or not A" may become true at a later point in time, once we have produced a proof of one of the two sides.
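To put that in programming terms, here is a toy rendering of my own (not any standard formalism): under the intuitionistic, proof-based reading, "A or not A" can only be asserted once we actually hold a proof of one of the two sides.

```python
from enum import Enum

class Status(Enum):
    PROVED = "proved"        # we hold a proof of A
    REFUTED = "refuted"      # we hold a proof of not A
    UNDECIDED = "undecided"  # no proof of either side yet

def excluded_middle(status: Status) -> str:
    """Intuitionistic reading: 'A or not A' is assertable only once
    we can point to a proof of one of the two disjuncts."""
    if status in (Status.PROVED, Status.REFUTED):
        return "true (we can point to a proof of one side)"
    return "not yet assertable (no proof of either side at this time)"

# Before any proof exists:
print(excluded_middle(Status.UNDECIDED))  # not yet assertable
# Later, once a proof of A (or of not A) has been produced:
print(excluded_middle(Status.PROVED))     # true
```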

Clearly, we cannot prove facts about a real number as a certain infinite string of digits if some of those digits don’t have a determinate value at this point in time. So, the rules of intuitionistic logic and the account of the essentially temporal nature of numbers are mutually consistent. In contrast, classical mathematics (which has been used to develop all our current theories of physics) assumes that every real number possesses an infinite degree of precision at every moment in time, and this is consistent with classical logic's endorsement of the law of the excluded middle.

As a result of these differences in the nature of real numbers (and other "uncountable" elements), many of the classical theorems in the mathematical field of analysis (a.k.a. calculus) are actually false in an intuitionistic system. However, in most of these cases, there are alternative theorems that resemble the original theorems but replace claims such as “there exists such and such a real number” with “there exist arbitrarily precise approximations of such and such a real number.” This replaces the idea of a quantity as an actual infinity with the idea of the same quantity as an endless process that converges toward it.
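As an illustration (again my own sketch, not anything from the intuitionistic literature), the square root of 2 can be handled as a process that delivers a rational approximation to any requested precision, rather than as a completed infinite decimal:

```python
from fractions import Fraction

def sqrt2_approximation(precision: Fraction) -> Fraction:
    """Return a rational q with q**2 <= 2 < (q + precision)**2,
    i.e., an approximation of sqrt(2) good to within `precision`,
    found by bisecting an interval of rationals."""
    low, high = Fraction(1), Fraction(2)   # sqrt(2) lies between 1 and 2
    while high - low > precision:
        mid = (low + high) / 2
        if mid * mid <= 2:
            low = mid
        else:
            high = mid
    return low

# Ever more precise approximations, each produced by a finite computation:
for k in (2, 5, 10):
    q = sqrt2_approximation(Fraction(1, 10**k))
    print(float(q))
```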

That’s about as far as my understanding goes. What I took from the Quanta article is that these new physics papers are trying to replace the (mathematically) classical equations of relativity and quantum mechanics with analogous equations expressed in the language of intuitionism. What this means on a physical level is that quantities like mass, position, charge, spin, etc., are never infinitely precise, and they become more precise as time moves forward. Since the information content of the universe is finite, there cannot be literal infinities for any physical quantity. But as time moves forward, more information might be created at some determinate rate.

This means that the flow of time has a real (potentially measurable) physical effect. Previously, this was not thought to be the case in general relativity. Time was just a direction, no different from the spatial directions. Think of a particle existing at a position in spacetime. You can describe that position with four numbers: X, Y, Z, and T. If any of these numbers changes, the change represents a change in the position of the particle (e.g., change in X = X1 - X2). This implies that the change itself, the movement or flow from one position to another, is not a physical thing above and beyond the facts captured by X, Y, Z, and T. This is why Einstein’s universe is called a “Block Universe”: every number that defines positions past, present, and future could exist eternally as a static mathematical description of the whole spacetime continuum.

However, once the equations are recast using intuitionistic mathematics, the change in time is more than just a change in a position coordinate, because the change in time also increases the precision of every physical quantity, unfurling more digits across the universe. This makes time special and gives a measurable physical meaning to the idea that time flows.

Comments

First, intuitionist mathematics makes perfect sense to me, and has for some time. Not as a replacement for the conventional math, but as an alternative view.

Second, it has nothing to do with physics, so Gisin's musings (published work, not some popular interpretation of it: https://arxiv.org/abs/2002.01653) are guaranteed to be not a step in any progress of the understanding of physics. Sorry.

Third, both are trying to get to an important point, but tend to miss it. The point is that it takes effort, time and resources to build useful models of observations. Thus, whether the gazillionth digit of pi is even or odd is not an absolute question (hence no law of excluded middle). It takes (agent-dependent) effort to calculate it, and until you yourself ascertain which it is, if any, it's neither, not for you. You can assign probabilities to it being even and it being odd, and these probabilities might even add to one, but it is still possible that something would prevent you from ever calculating it, so all you can say is "If I ever get to calculate it, the result will be either even or odd." Note that you do not make a statement about some objective gazillionth digit of pi, only about your result of calculating it. You might make a mistake somewhere, for example. Or you might die before. So, intuitionism doesn't go far enough, because it's still trying to be "objective" while giving up most of the objectivity of the traditional mathematics.

Again, mathematical proofs are as much observations as anything else. Just because they happen in one's head or with a pencil on paper, they are still observations. Repeatable, given the right equipment (including the right brain, such as the one capable of proving some theorems), and so reliable under some conditions.

The above means that some of your examples are better than others. Once you have the tools for manipulating infinite numbers, you don't have to expend a lot of effort doing so. The difficulty of calculating a far-away digit in the decimal expansion of pi has nothing to do with pi itself: you can perfectly well define it as the ratio of circumference to diameter, or as a limit of some series, or as 2*arcsin(1), or something else. You can even build a base-pi system of counting. Then you can complain how hard it is to find the umpteenth digit of the number 1 written in base pi! It doesn't mean that 1 is more complex or simpler than pi; all it means is that certain calculations are harder than others, and the hardness depends on many things, including who is doing the calculation and what tools they are using.

So the point is how hard it is to measure something, and that includes how much time it takes, not any kind of correspondence with counting numbers taking longer.

Fourth, the idea that Einstein's equations are somehow unique in terms of being timeless is utterly false. Electromagnetism is often written in a covariant form as □A = J and dA = 0, where A and J are spacetime quantities. Similarly, the Einstein equation can be cast as an initial value problem, with the time evolution being explicit, and it is done that way in all black hole collision simulations. Similarly, quantum mechanics can be written as a path integral, where time is just one of the variables.

So, Gisin's attempts to use intuitionist math for physics are bound to be forgotten, as they add nothing to our understanding of either math or physics. Sadly, he missed the point.

Thanks for your comment. My replies are below.


"so Gisin's musings... are guaranteed to be not a step in any progress of the understanding of physics."

What is your epistemic justification for asserting such a guarantee of failure? Of course, any new speculative idea in theoretical physics is far from likely to be adopted as part of the core theory, but you are making a much stronger claim by saying that it will not even be "a step in any progress of the understanding of physics". Even ideas that are eventually rejected as false are often useful for developing understanding. Gisin's papers ask physicists to consider their unexamined assumptions about the nature of math itself, which seems at least like a fruitful path of inquiry, even if it won't necessarily lead to any major breakthroughs.


"mathematical proofs are as much observations as anything else. Just because they happen in one's head or with a pencil on paper, they are still observations."

This reminds me of John Locke's view that mathematical truths come from observation of internal states. That is an interesting perspective, but I'm not sure it can hold up to scrutiny. The biggest issue with it seems to be that in order to evaluate the evidence provided by empirical observations we must have a rational framework which includes logic and math. If logic and math themselves were simply observational, then we have no framework for evaluating the evidence provided by those observations. Perhaps you can give an alternative account of how we evaluate evidence without pre-supposing a rational framework.


"The difficulty of calculating a far-away digit in the decimal expansion of pi has nothing to do with pi itself: you can perfectly well define it as the ratio of circumference to diameter, or as a limit of some series"

I agree with this statement. I think, though, that it misses the point I was elaborating about Brouwer's concept of choice sequences. The issue isn't that we can't define a sequence that is equivalent to the infinite expansion of pi; it is rather that for any real quantity we can never be certain that it will continue to obey the lawlike expansion into the future. So the issue isn't the "difficulty of calculating a far-away digit"; the issue is that no matter how many digits we observe following the lawlike pattern, the future digits may still deviate from that pattern. No matter how many digits of pi a real number contains, the next digit might suddenly be something other than the corresponding digit of pi (in which case we would say retrospectively that the real number was never equal to pi in the first place). This is actually what we observe if we were to, say, measure the ratio of a jar lid's circumference to its diameter: the first few digits will match pi, but then as we go to smaller scales it will deviate.


"...the idea that Einstein's equations are somehow unique in terms of being timeless is utterly false"

I made no claim that they are unique in this regard.

"mathematical proofs are as much observations as anything else. Just because they happen in one's head or with a pencil on paper, they are still observations."

I think this is better explained as:

We try to do math, but we can make mistakes.*

If two people evaluate an arithmetic expression the same way, but one makes a mistake, then they might get different answers.


*Other examples:

1. You can try to create a mathematical proof. But if you make a mistake, it might be wrong (even if the premises are right).

2. Is it an incorrect proof, a typo, or just something on your computer screen?

A proof might have a mistake in it and thus "be invalid". But it could also have a typo, which, if corrected, yields a "valid proof".

Or, the proof might not have a mistake in it - you could have misread it, and what it says is different from what you saw. (Someone can also summarize a proof badly.)

If the copy of the proof you have is different from the original, errors (or changes) could have been introduced along the way.

Let me reply to the last one first :)

The Einstein equation was singled out in the Quanta magazine article. I respect the author; she has written a lot of good articles for Quanta, but this one was quite misleading.

I don't understand your second-to-last point. Are you talking about a mathematical algorithm or about a physical measurement? "No matter how many digits we observe following the lawlike pattern, the future digits may still deviate from that pattern" -- what pattern?

The biggest issue with it seems to be that in order to evaluate the evidence provided by empirical observations we must have a rational framework which includes logic and math. If logic and math themselves were simply observational, then we have no framework for evaluating the evidence provided by those observations.

No, we don't. And yes, they are. We start with some innate abilities of the brain, add the culture we are brought up in, then develop models of empirical observations, whatever they are. 1+1=2 is an abstraction of various empirical observations, be it counting sheep or mathematical proofs. Logic and math co-develop with increasingly complex models and increasingly non-trivial observations; there is no "we need logic and math to evaluate evidence". If you look through the history of science, math was developed alongside physics, as one of its tools. In that sense the Noether theorem, for example, is akin to, say, a new kind of telescope.

What is your epistemic justification for asserting such a guarantee of failure?

Because they are of the type that is "not even wrong". The standard math works just fine for both GR and QM, the two main issues are conceptual, not mathematical: How does the (nonlinear) projection postulate emerge from the linear evolution (and no, MWI is not a useful "answer", it has zero predictive power), and how do QM and GR mesh at the mesoscopic scale (i.e. what are the gravitational effects of a spatially separated entangled state?).

That’s about as far as my understanding goes.

I appreciate you explaining intuitionism. I was aware of the ideas, but getting confirmation of what the important/foundational ones were was nice. (Also, that implication about physical laws was really interesting.)


Short:

So we can understand the idea of an infinity of natural numbers by thinking of them as a sequence of reflection upon temporal moments that goes on and on without end, 1, 2, 3, 4…. For Intuitionists, there are only potential infinities, not actual infinities. What that means is that we cannot regard an infinity as something that exists as a completed object (it never exists fully at any moment in time). Instead, when we talk about an infinite set, like “the set of all natural numbers” what we should mean is, “the temporal process that generates each natural number with no limit on how long it may continue to run”.

So "infinite" just means:

  • The process does not terminate itself.
  • The list cannot fully be created, because it is generated by a process without an (inherent) end.

Long:

In finite time, my computer will only ever print out a certain finite subset of the natural numbers. The process is infinite only in the sense that it has no natural stopping point (we are not considering contingent stopping points like my computer crashing or breaking down or losing power).

Assuming your computer is normal, there is a limit to how big a number it can express, much less contain (relative to a given method). (This amount can be calculated from how much "space" there is on your computer, though if you used up all the space, it might not be able to handle displaying the number, or make it particularly easy to access.)

In contrast to Intuitionism, classical mathematics regards infinite sets as well-defined abstract objects.

With some debates concerning their relative sizes?

et, "A or not A" may becometrue at a later point in time, once we have produced a proof of one of the two sides.

Causality seems a better model, but time works well enough. Another way of handling this would be 'the truth value of "A or not A" is not defined until you choose A'.

I see the truth value of "A or not A" as 'defined in the abstract as true' as a consequence of having defined 'truth', "or", and deciding that I will give A a value of true, or a value of false. (Alternatively, I can leave it as a function.)

So, the rules of intuitionistic logic and the account of the essentially temporal nature of numbers are mutually consistent.

If one motivated (or was cherry-picked to justify) the other, this doesn't seem like it should be a surprise?

In contrast, classical mathematics (which has been used to develop all our current theories of physics) assumes that every real number possesses an infinite degree of precision at every moment in time, and this is consistent with classical logic's endorsement of the law of the excluded middle.

In the same sense that a Turing machine must either halt or not halt, whether or not we have any way of knowing which.

“there exist arbitrarily precise approximations of such and such a real number.”

This is equivalent to "we can construct as many digits of this number as we want, though it might have infinitely many digits after the decimal point".