Most people (not all, but most) are reasonably comfortable with infinity as an ultimate (lack of) limit. For example, cosmological theories that suggest the universe is infinitely large and/or infinitely old are not strongly disbelieved a priori.

By contrast, most people are fairly uncomfortable with *manifest* infinity: actual infinite quantities showing up in physical objects. For example, we tend to be skeptical of theories that would allow infinite amounts of matter, energy, or computation in a finite volume of spacetime.

Consider the following thought experiment (I forget where I first heard it):

Aliens in a passing flying saucer offer to sell us a halting oracle. It's a black box of ordinary size and mass; galactic intellectual property law prohibits them from explaining how it works, and it's far beyond our ability to reverse-engineer. Nonetheless, if you feed it the description of a Turing machine, it will calculate for a millisecond and then indicate whether that Turing machine halts or not. Obviously we're skeptical, and exhaustive testing is impossible, but the device passes every test we can throw at it.
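The asymmetry in what such testing can establish is worth spelling out. Here is a minimal sketch (the oracle's verdict and the toy "machines", modelled as Python generators, are hypothetical illustrations, not anything from the story): a "halts" verdict can sometimes be confirmed, a "doesn't halt" verdict can only ever be refuted or left hanging.

```python
def run_for(program, steps):
    """Step a toy machine (a generator) at most `steps` times.
    Returns True if it halted, None if the result is still unknown."""
    it = program()
    for _ in range(steps):
        try:
            next(it)
        except StopIteration:
            return True
    return None

def check_oracle(oracle_says_halts, program, budget):
    """Return True if the oracle is confirmed on this case, False if it
    is refuted, None if the test was inconclusive.  Note the asymmetry:
    within any finite budget, a 'does not halt' verdict can be refuted
    but never confirmed."""
    observed = run_for(program, budget)
    if observed is True:
        return oracle_says_halts  # it halted: oracle right iff it said so
    return None                   # no halt within budget: inconclusive

def halts_quickly():
    for _ in range(5):
        yield

def loops_forever():
    while True:
        yield
```

For instance, `check_oracle(True, halts_quickly, 100)` confirms the oracle on that case, `check_oracle(False, halts_quickly, 100)` refutes it, and any verdict about `loops_forever` comes back `None` no matter how large the budget: that is exactly why exhaustive testing is impossible.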

In practice, willingness to pay for it might be based on a belief that it will probably work for every case we are going to be in a position to care about in the near future, but do we believe the sales pitch that it is a true halting oracle, i.e. a device that performs *infinite* computation in finite spacetime? Some people would give more than 50% credence to this proposition, and some people less, but almost everyone would give it a subjective probability greater than zero.

It is worth noting that Solomonoff induction (SI) would do otherwise. SI is based on the assumption that the universe is computable; it assigns a halting oracle a prior probability (and therefore a posterior probability after any finite amount of evidence) of zero. In other words, while human intuition is finitely skeptical of manifest infinity, SI is infinitely skeptical.
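The "after any finite amount of evidence" clause can be made concrete with Bayes' rule in odds form (the specific numbers below are illustrative assumptions, not anything from the post): a prior of exactly zero is a fixed point that no stream of favorable observations can move, whereas any nonzero prior, however tiny, eventually gets dragged toward certainty.

```python
def update_odds(odds, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return odds * likelihood_ratio

odds_si = 0.0      # SI's prior odds that the box is a true halting oracle
odds_human = 1e-9  # a human skeptic: tiny but nonzero prior odds

# Suppose each passed test favors 'true oracle' over 'clever fake' by 1000:1.
for _ in range(100):
    odds_si = update_odds(odds_si, 1000.0)
    odds_human = update_odds(odds_human, 1000.0)

# odds_si is still exactly 0.0; odds_human has grown astronomically large.
```

Multiplication by any finite likelihood ratio leaves zero at zero, which is the precise sense in which SI's skepticism is infinite.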

This has been used as a reductio ad absurdum of SI, but is that correct? If a halting oracle really is absolutely impossible or at least infinitely improbable across the Tegmark multiverse by a correct weighting, then SI is right and human intuition is wrong. If not, then vice versa. At this time, I don't know which is the case, or even whether there is a fact of the matter regarding which is the case.

In the absence of aliens offering unlikely bargains, this would appear to be of little concern, but consider a much more familiar object: the humble electron.

When we measure the spin of an electron, how much information can we get? One bit: up or down.

But how much computation is the universe doing behind the scenes? According to quantum mechanics, the spin of an electron is represented by a complex number, which, if taken at face value, would mean the universe is actually doing infinite computation to provide us with one bit. Nor is this entirely unobservable: by repeated measurements we can verify that quantum mechanics seems to work; the probability distribution we get is the one that would arise if the spin really were represented as a complex number.

On a larger scale, current theory strongly conjectures that the maximum information contained in a volume is given by the Bekenstein bound, one bit per Planck area give or take a small constant factor. Leaving aside the surprising holographic theory that gives a limit in terms of area rather than volume as we would intuitively expect, this sounds perfectly reasonable - except does "information contained in a volume" refer, like the one bit obtained from measuring the spin of an electron, only to the information we can extract? Or is it an indicator of some final limit to the computation the universe performs behind the scenes?
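To give a sense of the numbers involved, here is the bound worked out for a one-meter sphere (a sketch: I use the common form of the holographic bound, entropy at most one quarter of the area in Planck units, converted from nats to bits; the choice of a one-meter sphere is arbitrary):

```python
import math

l_p = 1.616e-35            # Planck length in meters (approximate CODATA value)
r = 1.0                    # radius of an illustrative one-meter sphere
area = 4 * math.pi * r**2  # bounding surface area in square meters

# Holographic bound: S <= A / (4 * l_p^2) nats = A / (4 * l_p^2 * ln 2) bits.
max_bits = area / (4 * l_p**2 * math.log(2))
# Roughly 1.7e70 bits -- scaling with the surface area, not the volume.
```

Note how the bound depends only on `area`: doubling the radius quadruples the information capacity rather than multiplying it by eight, which is exactly the surprising area-versus-volume scaling mentioned above.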

Put another way, is space *really* granular at the Planck scale of 1e-35 meters? Or does the universe go ahead and implement infinitely subdivisible space, with the Planck limit only being on what use we can make of it? How skeptical should we be of manifest infinity?

For what it's worth, my intuitive preference is for the finite answer, to an even greater extent than with the halting oracle; notwithstanding that I know very well the universe is not constrained by my ideas of efficiency, it *still* strikes me as grossly inefficient to the point of inelegance for the universe to perform infinite computation of which only a finite fraction can be used even in principle.

Which was why I was distinctly disconcerted when I read this result: http://www.cosmosmagazine.com/node/4472.

(In a nutshell, somebody calculated that if space is really granular at 1e-35 meters, that should actually affect the propagation of polarized gamma rays from a GRB 300 million light years distant, in a measurable way. Measurement found no such effect, apparently showing that the universe calculates space down to at least 1e-48 meters.)
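The gap between those two scales is worth making explicit; a one-line calculation (using the round-number scales quoted above):

```python
import math

planck_scale = 1e-35   # hypothesized granularity of space, in meters
probed_scale = 1e-48   # scale the GRB polarization result reaches, per the article
orders_finer = math.log10(planck_scale / probed_scale)
# The null result pushes any granularity 13 orders of magnitude below
# the Planck length -- not a marginal miss but a dramatic one.
```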

Must we, contrary to Solomonoff induction, accept the likelihood of manifest infinity after all? Or is there another interpretation of these results?