Fundamental physical constants are physical constants whose values seem unexplainable in terms of more basic physics. For example, we have no idea why the relative masses of the various elementary particles are what they are - we see no reason why a muon is 206.768... times more massive than an electron instead of 17.1328... times or 2035.97... times.

My initial reaction to learning about fundamental physical constants is to assume they are either not truly fundamental, and are the outcomes of more fundamental physics we don't know yet, or that they are completely arbitrary random numbers.

If you were to tell me that the fine structure constant was exactly equal to some simple closed-form expression, but not for any particular reason, it just happened to be that way, I would say you were barking mad.


A computable number is a number that can be computed to arbitrary precision by a terminating algorithm. That is, there exists a Turing machine which reads as input the desired precision, and outputs the binary representation of the number up to that number of digits.
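As a concrete illustration (in Python rather than a literal Turing machine, and using √2 as a stand-in for any computable number): the sketch below takes a precision n and halts with the first n decimal digits.

```python
from math import isqrt

def sqrt2_digits(n: int) -> str:
    """Return sqrt(2) truncated to n decimal digits.

    This is a terminating algorithm parameterised by the desired
    precision, which is exactly what makes sqrt(2) computable.
    """
    # Scale up so the integer square root carries n decimal digits.
    scaled = isqrt(2 * 10 ** (2 * n))
    s = str(scaled)
    return s[0] + "." + s[1:1 + n]

print(sqrt2_digits(10))  # → 1.4142135623
```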

There are countably many Turing machines, but uncountably many real numbers. This means almost all reals are uncomputable. In fact, most real numbers aren't even definable! I would give you an example of an undefinable number right now, but unfortunately I can't...
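The countability claim can be made vivid with a sketch: every Turing machine is encoded by some finite string, and the finite strings over a fixed alphabet can be enumerated one by one (the binary encoding here is a hypothetical illustration, not any standard one):

```python
from itertools import count, islice, product

def all_programs():
    """Enumerate every finite binary string: 0, 1, 00, 01, 10, 11, ...

    If each Turing machine is encoded as some finite binary string,
    this enumeration eventually reaches every machine, so there are
    only countably many of them. No such enumeration can cover the
    reals, by Cantor's diagonal argument.
    """
    for n in count(1):
        for bits in product("01", repeat=n):
            yield "".join(bits)

print(list(islice(all_programs(), 6)))  # → ['0', '1', '00', '01', '10', '11']
```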


Kolmogorov complexity is a way of defining the complexity of an object.

It asks: what is the shortest input to a universal Turing machine that would produce the object as output?

It can be used to define the complexity of the universe: how long would the shortest input to a universal Turing machine be that simulated the universe perfectly accurately?
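Kolmogorov complexity itself is uncomputable, but any off-the-shelf compressor gives an upper bound, since "a fixed decompressor plus the compressed data" is itself a program that outputs the object. A rough sketch, using zlib purely as an example compressor:

```python
import os
import zlib

def k_upper_bound(data: bytes) -> int:
    """Crude upper bound (in bytes) on the Kolmogorov complexity of data.

    K(data) is uncomputable, but the compressed form plus a fixed-size
    decompressor is a program that outputs data, so its length bounds
    K(data) from above.
    """
    return len(zlib.compress(data, 9))

patterned = b"ab" * 5000        # highly regular: a tiny program suffices
random_ish = os.urandom(10000)  # incompressible with high probability

print(k_upper_bound(patterned), "<", k_upper_bound(random_ish))
```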


Solomonoff induction is a formalization of Occam's razor. Essentially it says that you should assume the universe is the one with the lowest Kolmogorov complexity that accurately predicts the observations you see.

The Kolmogorov complexity of the universe is equivalent to the shortest program (however inefficient) that can simulate the fundamental physical rules of the universe, given as inputs the fundamental physical constants and the starting state of the universe.

The Kolmogorov complexity of a non-computable number is infinite (by definition). In fact the Kolmogorov complexity of almost any arbitrarily chosen computable number is massive.

If we knew the fundamental physical rules of the universe, and knew that the fundamental physical constants were completely independent of those rules, then the chance that the fine structure constant just happened to be exactly equal to some simple closed-form expression would be hugely greater than the chance it happened to be 1/137.035999046363458... (going on for another hundred digits before recurring), since the Kolmogorov complexity of the first is far smaller than that of the second. And the chance that it's a completely arbitrary number is precisely 0.
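As a toy illustration of the weighting involved (description length in characters is my own crude stand-in for Kolmogorov complexity, and the 8-bits-per-character code is a simplification, not the real Solomonoff prior):

```python
from fractions import Fraction

def prior_weight(description: str) -> Fraction:
    """Weight 2**-(bits), with a naive 8 bits per character.

    A stand-in for the Solomonoff prior 2**-K(hypothesis); true
    Kolmogorov complexity would be smaller but behaves similarly.
    """
    return Fraction(1, 2 ** (8 * len(description)))

simple = "1/137"       # hypothetical short closed-form description
arbitrary = "7" * 100  # stands in for ~100 digits with no pattern

# The short description is favoured by an astronomical factor:
# 2 ** (8 * (100 - 5)) = 2 ** 760.
ratio = prior_weight(simple) / prior_weight(arbitrary)
print(ratio == 2 ** 760)  # → True
```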


Is this reasoning valid?

Well, it depends on why you accept Solomonoff induction. If you believe it's a technique which predicts the universe as well as possible given the number of bits of information it has, then maybe not - a computable real can come arbitrarily close to an uncomputable real, so Solomonoff induction will indeed simulate your uncomputable universe pretty much as well as is possible.

But maybe you believe that Solomonoff induction works because the universe is actually running an algorithm on some sort of computer?

Then I guess the question comes down to whether the computer it's running on is a universal Turing machine, or some more powerful abstraction like a real computer, which can work with infinite precision arithmetic. If the latter, the true version of Solomonoff induction would talk about Kolmogorov complexity as defined in terms of real computers, not universal Turing machines.

In our universe real computers are probably impossible, but I don't think we have any idea what the aliens simulating the universe are capable of. So maybe my intuitions are right and the fundamental physical constants are completely arbitrary after all!

[-]ike

On my metaphysics it's not coherent to talk about "fundamental" constants, for multiple reasons. Try tabooing that and ask about what, if anything, is actually meant.

If you can't measure any of these constants past a hundred significant digits, what does it mean to talk about the constant having any digits beyond that? And what does it mean for a constant to be fundamental?

Fundamental physical constants are easy. Consider the shortest algorithm that simulates the universe perfectly. That algorithm will consist of some rules, and some data. The data are fundamental physical constants.

> If you can't measure any of these constants past a hundred significant digits, what does it mean to talk about the constant having any digits beyond that?

Assuming that the way the universe looks changes continuously with these constants, it seems strange to insist that if the changes are so small you can't notice them they don't exist. The aliens running the universe might well be able to read off all infinity digits of these constants, and measure precisely what difference changing the nth digit will make for all n.

[-]ike

> Consider the shortest algorithm that simulates the universe perfectly.

Meaningless, on my metaphysics. The definition is circular: in order to define "fundamental" you have to already assume that the universe can be simulated "perfectly", but to define a perfect simulation you'll need to rely on concepts like "fundamental" or "external reality".

> Assuming that the way the universe looks changes continuously with these constants, it seems strange to insist that if the changes are so small you can't notice them they don't exist.

The assumption is meaningless. It seems strange to me to insist that something "exists", especially infinities, which are never observable. Our actual known physics puts strict limits on how much information can be contained in a finite amount of space, but even if we didn't know that, you can't define these concepts in a noncircular manner.

Assuming the universe is purely physical and self-contained (there aren't things "outside" causing effects or changes), it seems clear that no subset of the universe can contain all the information in the whole universe.  Ok, the trivial subset "the whole universe" can, but there's no compression or sufficiently-precise knowledge of constants and rules that perfectly describes the universe which fits inside a tiny part of the universe.

Those constants aren't artifacts of the universe, they ARE the universe.  We can learn and encode them to an arbitrary precision, without ever exactly knowing them.

[ETA: it's not actually known whether the universe is bigger than its own description. It could be a quine. It could be a processing substrate that holds "temporary" information, which we're experiencing, that is MUCH bigger than the initial program. I strongly suspect not, but don't know of any evidence that would point one way or the other.]

> but there's no compression or sufficiently-precise knowledge of constants and rules that perfectly describes the universe which fits inside a tiny part of the universe.

Why is that clear?

It's obvious you can't simulate the universe inside the universe. It's not obvious you can't build the simulator and just not have the money to pay for all the RAM you'd need to switch it on.

In string theory, the constants should be computable, but are different for each ground state. The fine structure constant could descend from something which in essence is a simple rational number (e.g. 1 over "the number of fuzzy points" on an "internal four-cycle" of the compact dimensions), but then modified by low-energy effects.