I might write a top level post or shortform about this at some point. I find it baffling how casually people talk about dismantling the Sun around here. I recognize that this post makes no normative claim that we should do it, but it doesn't say that it would be bad either, and expects that we will do it even if humanity remains in power. I think we probably won't do it if humanity remains in power, we shouldn't do it, and if humanity disassembles the Sun, it will probably happen for some very bad reason, like a fanatical dictatorship getting in power.
If we get some even vaguely democratic system that respects human rights at least a little, then many people (probably the vast majority) will want to live on Earth in their physical bodies, many will want to have children, and many of those children will also want to live on Earth and have children of their own. I find it unlikely that all subcultures that want this will die out on Earth in 10,000 years, especially considering the selection effects: the subcultures that prefer to have natural children on Earth are the ones that natural selection favors on Earth. So the scenarios in which humanity dismantles the Sun probably inv...
You are putting words in people's mouths to accuse lots of people of wanting to round up the Amish and haul them to extermination camps, and I am disappointed that you would resort to such accusations.
So, I'm with you on "hey guys, uh, this is pretty horrifying, right? Uh, what's with the missing mood about that?".
The issue is that not-eating-the-sun is also horrifying. i.e. see also All Possible Views About Humanity's Future Are Wild. To not eat the sun is to throw away orders of magnitude more resources than anyone has ever thrown away before. Is it percentage-wise "a small fraction of the cosmos"? Sure. But, (quickly checks Claude, which wrote up a Fermi code snippet before answering; I can share the work if you want to double-check yourself), a two-year delay would be... 0.00000004% of the universe lost beyond the lightcone horizon, which doesn't sound like much except that's 200 galaxies lost.
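For reference, a minimal sketch of the kind of Fermi estimate described here (not Claude's original snippet; the constants are rough, commonly cited values, and the result is sensitive to whether you use the cosmic event horizon or the particle horizon):

```python
import math

# Rough, commonly cited cosmological figures (my assumptions, not from the comment above)
GALAXIES_OBSERVABLE = 2e12     # ~2 trillion galaxies in the observable universe
R_OBSERVABLE_LY = 46.5e9       # comoving radius of the observable universe, light-years
R_EVENT_HORIZON_LY = 16.5e9    # comoving distance to the cosmic event horizon, light-years

# Comoving galaxy number density (galaxies per cubic light-year)
volume_observable = (4 / 3) * math.pi * R_OBSERVABLE_LY ** 3
density = GALAXIES_OBSERVABLE / volume_observable

def galaxies_lost(delay_years: float) -> float:
    """Galaxies slipping permanently beyond reach during a delay.

    Each year, the comoving event horizon shrinks by roughly one light-year,
    so a thin shell at the horizon becomes unreachable.
    """
    shell_volume = 4 * math.pi * R_EVENT_HORIZON_LY ** 2 * delay_years
    return shell_volume * density

lost = galaxies_lost(2)
print(f"~{lost:.0f} galaxies lost for a 2-year delay "
      f"({lost / GALAXIES_OBSERVABLE:.1e} of the observable universe)")
# With these inputs you get a few tens of galaxies; using the larger
# particle-horizon radius instead gives a few hundred, which is roughly
# where the "200 galaxies" figure lands.
```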
When the alternative is "the Amish get a Sun Replica that doesn't change their experience", the question is "Is it worth throwing away 80 trillion stars for the Amish to have the real thing?" It does not seem obviously worth it.
IMO there isn't an option that isn't at least a bit horrifying in some sense that one could have a missing mood about. And while I still feel unsettled about it, I think if I have to grieve something, it makes more sense to grieve in the direction of "don't throw away 80 ...
To not eat the sun is to throw away orders of magnitude more resources than anyone has ever thrown away before. Is it percentage-wise "a small fraction of the cosmos"? Sure. But, (quickly checks Claude, which wrote up a Fermi code snippet before answering; I can share the work if you want to double-check yourself), a two-year delay would be… 0.00000004% of the universe lost beyond the lightcone horizon, which doesn't sound like much except that's 200 galaxies lost.
Why is this horrifying? Are we doing anything with those galaxies right now? What is this talk of “throwing away”, “lost”, etc.?
You speak as if we could be exploiting those galaxies at the extreme edge of the observable universe, like… tomorrow, or next week… if only we don’t carelessly lose them. Like we have these “resources” sitting around, at our disposal, as we speak. But of course nothing remotely like this is true. How long would it even take to reach any of these places? Billions of years, right? So the question is:
“Should we do something that might possibly somehow affect something that ‘we’, in some broad sense (because who even knows whether humanity will be around at the time, or in what form), will be do...
The argument is (I assume):
Without making any normative arguments: if you're in a position (industrially and technologically) to disassemble the sun at all, or build something like a Dyson swarm, then it's probably not too difficult to build an artificial system to light the Earth in such a way as to mimic the sun, and make it look and feel nearly identical to biological humans living on the surface, using less than a billionth of the sun's normal total light output. The details of tides might be tricky, but probably not out of reach.
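A quick geometric sanity check of the "less than a billionth" figure (a rough sketch under the assumption that the replica only needs to supply the light that currently reaches Earth):

```python
import math

R_EARTH_M = 6.371e6   # Earth's mean radius, meters
AU_M = 1.496e11       # Earth-Sun distance, meters

# The Sun radiates in all directions; Earth intercepts only the fraction of
# that sphere covered by its own cross-sectional disk.
fraction = (math.pi * R_EARTH_M ** 2) / (4 * math.pi * AU_M ** 2)
print(f"Fraction of solar output hitting Earth: {fraction:.2e}")
# Roughly 4.5e-10, i.e. about half a billionth of the Sun's total output.
```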
This point doesn't make sense to me. It sounds similar to saying "Most people don't like it when companies develop more dense housing in cities, therefore a good democracy should not have it" or "Most people don't like it when their horse-drawn carriages are replaced by cars, therefore a good democracy should not have it".
The cost-benefit calculations on these things work out and it's good if most uninformed people who haven't spent much time on it are not able to get in the way of companies that are building goods and services in this regard.
There are many, many examples (e.g. GMOs, nuclear power, coal power, privatized toll roads, fracking, etc.), and I expect if I researched for a few hours I would find even clearer examples for which it is now the consensus that it was a good idea, but at the time the majority disliked it.
More generally:
This (the Sun is the only important local source of ... anything) has been an obvious conclusion for decades. Freeman Dyson described one avenue to capturing all the energy in 1960. The recent change that makes it more salient (and frames it as "eating or inhabiting the sun" rather than "capturing the sun's output") is the recent progress in AI, which does two things:
That said, I don't know how to make beliefs on this scale pay any rent. "within 10,000 years" and "that's just science fiction" are identical labels to me.
I think that it's likely to take longer than 10,000 years, simply because of the logistics (not the technology development, which the AI could do fast).
The gravitational binding energy of the sun is something on the order of 20 million years worth of its energy output. OK, half of the needed energy is already present as thermal energy, and you don't need to move every atom to infinity, but you still need a substantial fraction of that. And while you could perhaps generate many times more energy than the solar output by various means, I'd guess you'd have to deal with inefficiencies and lots of waste heat if you try to do it really fast. Maybe if you're smart enough you can make going fast work well enough to be worth it though?
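For a rough check of that "order of 20 million years" figure, here is a sketch using the uniform-sphere approximation for the binding energy (the real, centrally condensed Sun has a binding energy a few times larger):

```python
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # solar mass, kg
R_SUN = 6.96e8           # solar radius, m
L_SUN = 3.828e26         # solar luminosity, W
SECONDS_PER_YEAR = 3.156e7

# Uniform-sphere approximation: U = 3 G M^2 / (5 R)
binding_energy = 3 * G * M_SUN ** 2 / (5 * R_SUN)
years = binding_energy / (L_SUN * SECONDS_PER_YEAR)
print(f"Binding energy ~{binding_energy:.1e} J, "
      f"about {years / 1e6:.0f} million years of solar output")
```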
I'm not sure what the details would look like, but I'm pretty sure ASI would have enough new technologies to figure something out within 10,000 years.
I feel like this is the main load-bearing claim underlying the post, but it's barely argued for.
In some sense the sun is already "eating itself" by doing a fusion reaction, which will last for billions more years. So you're claiming that AI could eat the sun (at least) six orders of magnitude faster, which is not obvious to me.
I don't think my priors on that are very different from yours but the thing that would have made this post valuable for me is some object-level reason to upgrade my confidence in that.
If our story goes well, we might want to preserve our Sun for sentimental reasons.
We might even want to eat some other stars just to prevent the Sun from expanding and dying.
I would maybe want my kids to look up at a night sky somewhere far away and see a constellation with the little dot humanity came from still being up there.
Concrete existence, they point out, is less resource-efficient than dreams of the machine. It's hard to tell how much value is tied up in physical form rather than in computation, or whether humans, on reflection, would agree on this either way.
[..] requires eating the Sun, and will be feasible at some technology level [..]
Do we have some basic physical-feasibility insights on this, or is it just speculation?
It's a pretty straightforward modification of the Caplan thruster. You scoop up bits of sun with very strong magnetic fields, but rather than fusing it and using it to move a star, you cool most of it (firing some back with very high velocity to balance things momentum wise) and keep the matter you extract (or fuse some if you need quick energy). There's even a video on it! Skip to 4:20 for the relevant bit.
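As a back-of-the-envelope energetics check (my own sketch, not from the video): lifting material off the solar surface costs far less energy per kilogram than fusing a small fraction of that material returns, which is why the scheme can be self-powering.

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
R_SUN = 6.96e8     # solar radius, m
C = 3e8            # speed of light, m/s

# Energy to lift 1 kg from the solar surface to infinity
# (material dredged from deeper in the well costs somewhat more)
lift_per_kg = G * M_SUN / R_SUN          # ~1.9e11 J/kg

# Hydrogen fusion converts about 0.7% of rest mass to energy
fusion_per_kg = 0.007 * C ** 2           # ~6.3e14 J/kg

print(f"Lift cost: {lift_per_kg:.1e} J/kg; fusion yield: {fusion_per_kg:.1e} J/kg")
print(f"Fusing ~{lift_per_kg / fusion_per_kg:.2%} of the lifted mass pays for the lift")
```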
I wonder how hard it would be to make the Sun stop shining? Maybe the fusion reaction could be made subcritical by adding some "control rod" type stuff.
Edit: I see other commenters also mentioned spinning up the Sun, which would lower the density and stop the fusion. Not sure which approach is easier.
I'm not sure eating the sun is such a great idea:
If the sun goes out suddenly, it's a pretty clear tipoff that something major is happening over here. Anyone with preferences who sees that might worry about having to compete with whoever ate the sun. They could do something drastic.
Our offspring might conclude that anyone willing to do drastic things to strangers would already be going hard on spreading and eating suns, so it would only signal meaningfully to relatively peaceful types. But I'm not sure we could be sure. Someone might be hiding and doing dr...
I think this shades into dark forest theory. Broadly my theory about aliens in general is that they're not effectively hiding themselves, and we don't see them because any that exist are too far away.
Partially it's a matter of, if aliens wanted to hide, could they? Sure, eating a star would show up in terms of light patterns, but also, so would being a civilization at the scale of 2025-earth. And my argument is that these aren't that far-off in cosmological terms (<10K years).
So, I really think alien encounters are in no way an urgent problem: we won't encounter them for a long time, and if they get light from 2025-Earth, they'll already have some idea that something big is likely to happen soon on Earth.
This does not matter for AI benchmarking because by the time the Sun has gone out, either somebody succeeded at intentionally engineering and deploying [what they knew was going to be] an aligned superintelligence, or we are all dead.
This seemed like a nice explainer post, though it's somewhat confusing who the post is for – if I imagine being someone who didn't really understand any arguments about superintelligence, I think I might bounce off the opening paragraph or title because I'm like "why would I care about eating the sun."
There is something nice and straightforward about the current phrasing, but I suspect there's an opening paragraph that would do a better job explaining why you might care about this.
(But I'd be curious to hear from people who weren't really sold on any singularity stuff who read it and can describe how it was for them)
I agree with Richard Ngo and Simon that any dismantling of the sun is going to be a long-term project, and this matters.
The Sun is the most nutritious thing that's reasonably close. It's only 8 light-minutes away, yet contains the vast majority of mass within 4 light-years of the Earth. The next-nearest star, Proxima Centauri, is about 4.25 light-years away.
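To put a number on "the vast majority" (rounded standard figures; Proxima Centauri itself sits just outside a 4-light-year radius from Earth):

```python
M_SUN = 1.989e30      # kg
M_PLANETS = 2.7e27    # kg, all eight planets combined (Jupiter dominates)

print(f"Sun's share of solar-system mass: {M_SUN / (M_SUN + M_PLANETS):.2%}")
# About 99.86%, and nothing else of stellar mass lies within 4 light-years.
```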
By "nutritious", I mean it has a lot of what's needed for making computers: mass-energy. In "Ultimate physical limits to computation", Seth Lloyd imagines an "ultimate laptop" which is the theoretically best computer that is 1 kilogram of mass, contained in 1 liter. He notes a limit to calculations per second that is proportional to the energy of the computer, which is mostly locked in its mass (E = mc²). Such an energy-proportional limit applies to memory too. Energy need not be expended quickly in the course of calculation, due to reversible computing.
So, you need energy to make computers out of (much more than you need energy to power them). And, within 4 light-years, the Sun is where almost all of that energy is. Of course, we don't have the technology to eat the Sun, so it isn't really our decision to make. But, when will someone or something be making this decision?
Artificial intelligence that is sufficiently advanced could do everything a human could do, better and faster. If humans could eventually design machines that eat the Sun, then sufficiently advanced AI could do so faster. There is some disagreement about "takeoff speeds", that is, the time between when AI is about as intelligent as humans and when it is far, far more intelligent.
My argument is that, when AI is far, far more intelligent than humans, it will understand the Sun as the most nutritious entity within 4 light-years, and eat it within a short time frame. It really is convergently instrumental to eat the Sun, in the sense of repurposing at least 50% of its mass-energy to make machines including computers and their supporting infrastructure ("computronium"), fuel and energy sources, and so on.
I acknowledge that some readers may think the Sun will never be eaten. Perhaps it sounds like sci-fi to them. Here, I will argue that Sun-eating is probable within the next 10,000 years.
Technological development has a ratchet effect: good technologies get invented, but usually don't get lost, unless they weren't very important/valuable (compared to other available technologies). Empirically, the rate of discovery seems to be increasing. To the extent pre-humans even had technology, it was developed a lot more slowly. Technology seems to be advancing a lot faster in the last 1000 years than it was from 5000 BC to 4000 BC. Part of the reason for the change in rate is that technologies build on other technologies; for example, the technology of computers allows discovery of other technologies through computational modeling.
So, we are probably approaching a stage where technology develops very quickly. Eventually, the rate of technology development will go down, due to depletion of low-hanging fruit. But before then, in the regime where technology is developing very rapidly, it will be both feasible and instrumentally important to run more computations, quickly. Computation is needed to research technologies, among other uses. Running sufficiently difficult computations requires eating the Sun, and will be feasible at some technology level, which itself probably doesn't require eating the Sun (eating the Earth probably provides more than enough energy to have enough computational power to figure out the technology to eat the Sun).
Let's further examine the motive for creating many machines, including computers, quickly. Roughly, we can consider two different regimes of fast technology development: coordinated and uncoordinated.
A coordinated regime will act like a single agent (or "singleton"), even if it's composed of multiple agents. This regime would do some kind of long-termist optimization (in this setting, even a few years is pretty long-term). Of course, it would want to discover technology quickly, all else being equal (due to astronomical waste considerations). But it might be somewhat "environmentalist" in terms of avoiding making hard-to-reverse decisions, like expending a lot of energy. I still think it would eat the Sun, on the basis that it can later convert these machines to other machines, if desired (it has access to many technologies, after all).
In an uncoordinated regime, multiple agents compete for resources and control. Broadly, having more machines (including computers) and more technology grants a competitive advantage. That is a strong incentive to turn the Sun into machines and develop technologies quickly. Perhaps an uncoordinated regime can transition to a coordinated one, as either there is a single victor, or the most competitive players start coordinating.
This concludes the argument that the Sun will be largely eaten in the next 10,000 years. It really will be a major event in the history of the solar system. Usually, not much happens to the Sun in 10,000 years. And I really think I'm being conservative in saying 10,000. This would in typical discourse be considered "very long ASI timelines", under the assumption that ASI eats the Sun within a few years.
Thinking about the timing of Sun-eating seems more well-defined, and potentially precise, than thinking about the timeline of "human-level AGI" or "ASI". These days, it's hard to know what people mean by AGI. Does "AGI" mean a system that can answer math questions better than the average human? We already have that. Does it mean a system that can generate $100 billion in profit? That's an obvious legal fiction.
Sun-eating tracks a certain stage in AGI capability. Perhaps there are other concrete, material thresholds corresponding to convergent instrumental goals, which track earlier stages. These could provide more specific definitions for AGI-related forecasting.