[Cross-post]The theoretical computational limit of the Solar System is 1.47x10^49 bits per second.

17th Oct 2023


I forgot to mention another source of difficulty in getting the energy efficiency of the computation down to Landauer's limit at the CMB temperature.

Recall that the Stefan–Boltzmann equation states that the power being emitted from an object by thermal radiation is equal to P = A·ε·σ·T^4. Here, P stands for power, A is the surface area of the object, ε is the emissivity of the object (ε is a real number with 0 ≤ ε ≤ 1), T is the temperature, and σ is the Stefan–Boltzmann constant. Here, σ ≈ 5.67×10^-8 W·m^-2·K^-4.

Suppose therefore that we want a Dyson sphere with radius R that maintains a temperature of 4 K, which is slightly above the CMB temperature. To simplify the calculations, I am going to ignore the energy that the Dyson sphere receives from the CMB so that I obtain a lower bound for the size of our Dyson sphere. Let us assume that our Dyson sphere is a perfect emitter of thermal radiation so that ε = 1.

Earth's surface has a temperature of about 300 K. In order to have a temperature of 4 K, our Dyson sphere needs to receive about (4/300)^4 ≈ 3×10^-8 times as much energy per unit of area as Earth does. Since the received flux falls off as the inverse square of the distance from the Sun, the Dyson sphere needs to have a radius of about (300/4)^2 ≈ 5600 astronomical units (recall that the distance from Earth to the Sun is 1 astronomical unit).
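The scaling argument can be checked numerically (a sketch, assuming a round 300 K for Earth's surface temperature):

```python
# Rough scaling estimate: received flux falls off as 1/r^2, and the
# equilibrium temperature sets the required flux via T^4.
T_earth = 300.0  # K, approximate surface temperature of Earth (assumed)
T_sphere = 4.0   # K, target Dyson sphere temperature

# Flux needed relative to Earth's, and the radius (in AU) at which
# sunlight is diluted by exactly that factor.
flux_ratio = (T_sphere / T_earth) ** 4
radius_au = (T_earth / T_sphere) ** 2

print(f"flux ratio: {flux_ratio:.2e}")  # ~3.2e-08
print(f"radius: {radius_au:.0f} AU")    # ~5625 AU
```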

Let us do more precise calculations to get a more exact radius of our Dyson sphere.

Setting P = 4πR²σT_c^4 with T_c = 4 K and solving for R, we get R = sqrt(P/(4πσT_c^4)) = sqrt(3.83×10^26/(4π · 5.67×10^-8 · 4^4)) ≈ 1.45×10^15 m, which is about 15 percent of a light-year. Since the nearest star is about 4 light-years away, by the time that we are able to construct a Dyson sphere with a radius that is about 15 percent of a light-year, I think that we will be able to harness energy from other stars such as Alpha Centauri.
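The more precise radius calculation can be reproduced directly from radiative equilibrium (a sketch using the solar power and Stefan–Boltzmann constant quoted above):

```python
import math

P = 3.83e26      # W, power output of the Sun
sigma = 5.67e-8  # W m^-2 K^-4, Stefan-Boltzmann constant
T_c = 4.0        # K, target temperature of the Dyson sphere

# Radiative equilibrium: P = 4*pi*R^2 * sigma * T_c^4, solved for R.
R = math.sqrt(P / (4 * math.pi * sigma * T_c**4))

AU = 1.496e11    # m per astronomical unit
LY = 9.461e15    # m per light-year
print(f"R = {R:.2e} m = {R/AU:.0f} AU = {R/LY:.2f} light-years")
```

The result lands around 1.45×10^15 m, i.e. roughly 0.15 light-years, matching the figure in the comment.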

The fourth power in the Stefan Boltzmann equation makes it hard for cold objects to radiate heat.

If you wait for cosmic background radiation to cool down^{[1]}, you get much more total computation out of the same matter. The *rate* of computation doesn't seem particularly important. The amount of stuff in a Hubble volume might be reducing over time, in which case computing earlier allows more communication with distant galaxies. But given the guess about the effect size of waiting on total compute, computing locally in distant future still buys more total compute than making use of distant galaxies earlier.

I don't buy the Fermi paradox angle in the paper, obviously the first thing you do is grab all the lightcone you can get your von Neumann probes on, and prepare the matter for storage in a way that's less wasteful than the random stuff that's happening in the wild. ↩︎

That paper is wrong. There are other systems which are not near maximal entropy states, and computer-generated entropy can be transferred to them adiabatically at a rate of 1 bit of negentropy per bit of error erased.

As to the post we're commenting on, the sun probably isn't the best configuration of matter to use as a power source. But this calculation seems like a reasonable lower bound.

The critique just says that you can get the same advantage even without waiting, while the relevant surprising part of the original claim is that there is a large advantage to be obtained at all, compared to Landauer limit at modern background radiation temperature, so that actually this application of the Landauer limit doesn't bound available compute.

The part of the paper I appealed to is exploratory engineering, a design that is theoretically possible but not trying to be something worthwhile when it becomes feasible in practice. This gives lower bounds on what's possible, by sketching particular ways of getting it, not predictions of what's likely to actually happen. The critique doesn't seem to take issue with this aspect of the paper.

Yeah, you could, but you would be waiting a while. Your reply and two others have made me aware that this post's limit is too low.

[EDIT: spelling]

This post uses the highly questionable assumption that we will be able to produce a Dyson sphere that can maintain a temperature at the level of the cosmic microwave background before we will be able to use energy-efficient reversible computation to perform operations that cost much less than k_B·T·ln(2) energy. And this post also makes the assumption that we will achieve computation at the level of about k_B·T·ln(2) per bit deletion before we will be able to achieve reversible computation. And it gets difficult to overcome thermal noise at an energy level well above k_B·T·ln(2) regardless of the type of hardware that one uses. At best, this post is an approximation for the computational power of a Dyson sphere that may be off by some orders of magnitude.

This post makes a range of assumptions, and looks at what is possible rather than what is feasible. You are correct that this post is attempting to approximate the computational power of a Dyson sphere and compare this to the approximation of the computational power of all humans alive. After posting this, the author has been made aware that there are multiple ways to break the Landauer Limit. I agree that these calculations may be off by an order of magnitude, but this being true doesn't break the conclusion that "the limit of computation, and therefore intelligence, is far above all humans combined".

I just read the abstract. Storing information in momentum makes a lot of sense as we know it is a conserved quantity. Practically challenging. But yes, this does move the theoretical limit even further away from all humans combined.

Cross-posted from the EA Forum. Link: The theoretical computational limit of the Solar System is 1.47x10^49 bits per second. — EA Forum (effectivealtruism.org)

Part 1

The limit is based on a computer operating at the Landauer Limit, at the temperature of the cosmic microwave background, powered by a Dyson sphere operating at the efficiency of a Carnot engine. [EDIT: this proposed limit is too low, as the Landauer Limit can be broken, it is now just a lower bound.]

Relevant equations

Carnot efficiency: η_I = 1 − (T_c/T_h)
Landauer limit: E = k_B·T·ln(2)
Bit rate: R = P·η_I/E

Relevant values

Boltzmann constant [k_B] (J K^-1): 1.38E-23
Power output of the sun [P] (W): 3.83E+26
Temperature of the surface of the sun [T_h] (K): 5.78E+03
Temperature of cosmic microwave background [T_c] (K): 2.73

Calculations

Carnot efficiency: η_I = 1 − (T_c/T_h) = 1 − (2.73/5.78E+03) ≈ 1.00
Landauer limit: E = k_B·T_c·ln(2) = 1.38E-23 × 2.73 × 0.693 = 2.61E-23 joules per bit
Bit rate: R = P·η_I/E = 3.83E+26 × 1.00 / 2.61E-23 = 1.47E+49 bits per second

Notes

Numbers are shown rounded to 3 significant figures, full values were used in calculations.
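The Part 1 chain of calculations can be reproduced without intermediate rounding (a sketch using the tabulated values above):

```python
import math

K_B = 1.38e-23  # J/K, Boltzmann constant
P = 3.83e26     # W, power output of the Sun
T_H = 5.78e3    # K, temperature of the surface of the Sun
T_C = 2.73      # K, temperature of the cosmic microwave background

eta = 1 - T_C / T_H            # Carnot efficiency (very close to 1)
E = K_B * T_C * math.log(2)    # Landauer limit, joules per bit erased
R = P * eta / E                # bits per second

print(f"eta = {eta:.6f}")
print(f"E = {E:.3e} J/bit")
print(f"R = {R:.3e} bits/s")  # ~1.47e+49
```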

Part 2

The theoretical computational limit of the solar system is 22 orders of magnitude above the estimated computational ability of all alive humans. This is based on estimates of the number of synapses in the human brain, the update rate of those synapses, and the number of humans alive. This estimate is only an approximation and should be used with caution.

The purpose of this post was to show the limit of computation, and therefore intelligence, is far above all humans combined.

Relevant equations

Bit rate of all humans: R_humans = N_syn·R_syn·N_humans
Comparative rate: R_c = R_max/R_humans

Relevant values

Number of synapses in the human brain [N_syn]: 2.50E+14
Synaptic update rate [R_syn] (Hz): 500
Number of humans alive [N_humans]: 8.07E+09
Theoretical computational limit [R_max] (bit s^-1): 1.47E+49

Calculation

Bit rate of all humans: R_humans = N_syn·R_syn·N_humans = 2.50E+14 × 500 × 8.07E+09 = 1.01E+27
Comparative rate: R_c = R_max/R_humans = 1.47E+49 / 1.01E+27 ≈ 1E+22

Notes

Numbers are shown rounded to 3 significant figures, full values were used in calculations, final result rounded to one significant figure due to low confidence in synaptic update rate.

Synaptic update rate estimated based on a 2 millisecond refractory time of a neuron.