Robin Hanson's writing on Grabby Aliens is interesting to me, since it seems to be one of the sounder attempts to apply mathematical reasoning to the Fermi Paradox. Unfortunately, it still relies on anthropics, so I wouldn't be the slightest bit surprised if it were off by an order of magnitude (or two) in either direction.
What I would like to know (preferably from someone with a strong astronomy background) is: how confident are we that there are no (herein defined) Extremely Obvious Aliens?
Extremely Obvious Aliens
Define Extremely Obvious Aliens in the following way:
- They colonize every single star that they encounter by building a Dyson swarm around it that reduces visible radiation by at least 50%
- They expand in every direction at a speed of at least 0.5c
- They have existed for at least 1 billion years
If such aliens existed, it should be really easy to detect them: just look for a cluster of galaxies, at least 0.5 billion light years across, that is 50% dimmer than it should be.
How confident are we that there are no Extremely Obvious Aliens?
As with Grabby Aliens, it is safe to say there are no Extremely Obvious Aliens in the Solar System. Nor, for that matter, are there any Extremely Obvious Aliens within 0.5 BLY of the Milky Way Galaxy.
So, for my astronomy friends: what is the largest radius for which we can confidently say there are 0 Extremely Obvious Aliens? The best answer I can come up with is the Sloan Digital Sky Survey (SDSS), which surveyed galaxies out to a redshift of z = 0.1, which I think corresponds to a distance of roughly 1.5 BLY.
Is this accurate? Namely, is it safe to say (with high confidence) there are no Extremely Obvious Aliens within 1.5BLY of Earth?
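As a rough sanity check of that redshift-to-distance conversion, here is a sketch using the low-redshift approximation d ≈ cz/H0 (the assumed H0 = 70 km/s/Mpc and the approximation itself each shift the answer by a few percent at z = 0.1):

```python
# Low-redshift distance approximation d ≈ c·z / H0.
# Assumption: H0 = 70 km/s/Mpc; good only to a few percent at z = 0.1.
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s/Mpc
MLY_PER_MPC = 3.2616   # million light-years per megaparsec

def approx_distance_bly(z: float) -> float:
    """Approximate comoving distance in billions of light-years."""
    d_mpc = C_KM_S * z / H0
    return d_mpc * MLY_PER_MPC / 1000.0

print(round(approx_distance_bly(0.1), 1))  # ≈ 1.4 BLY
```

So ~1.4 BLY, which is in the same ballpark as the ~1.5 BLY quoted above.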
Is there another survey that would let us raise this number even higher?
What is the theoretical limit (using something like JWST)?
Has someone written a good paper answering questions like these already?
A civilization somehow constraining itself to merely using Dyson swarms that block 50% of the light is implausible; it's much better to just create a small black hole (say, by concentrating very powerful lasers into a small region of space) and throw it into the star. That way you store up all the star's mass-energy for as long as possible, until you use the Penrose process to extract however much of the energy you want from it. In fact, you could even launch these small black holes at close to the speed of light and aim them at stars, preserving their energy for the time it takes the slower-moving parts of the civilization to show up.
And if this civilization converts every star they encounter, then this should show up as a completely dark sphere in the universe, which ought to be extremely obvious.
Which, notably, we do see (https://en.m.wikipedia.org/wiki/Boötes_void). Though voids like this don't conflict with our models of how the universe would end up naturally.
Unlike what you would expect if the stars had been converted into black holes, we can see that the Boötes void contains very little mass, by looking for gravitational lensing and at the motion of surrounding galaxies.
In our simulations, we find it overwhelmingly likely that any such spherical volume claimed by an alien civ would appear much larger than the full moon in the sky. So there's no need to study distant galaxies in fine detail; just look for huge dark spheres in the sky.
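A back-of-envelope version of that claim (a sketch that ignores cosmological corrections, which do matter at these distances, so treat it as order-of-magnitude only): a dark sphere 1 BLY across at the ~1.5 BLY range discussed above would subtend tens of degrees, versus roughly half a degree for the full moon.

```python
import math

# Flat-space angular diameter of a sphere seen from a given distance.
# Assumption: cosmological corrections ignored; order-of-magnitude only.
def angular_diameter_deg(diameter_bly: float, distance_bly: float) -> float:
    return math.degrees(2 * math.atan(diameter_bly / (2 * distance_bly)))

moon_deg = 0.5                                   # approx. full-moon angular diameter
sphere_deg = angular_diameter_deg(1.0, 1.5)      # ≈ 37 degrees
print(sphere_deg > moon_deg)                     # the sphere dwarfs the moon
```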
I have not seen anyone do something like this, but it sounds like something Anders Sandberg (FHI) would do. If you want a lead, or want to find someone who might be interested in researching it, he might be it.
How sure are we that dark matter isn't computronium?
1:10^12 odds against the notion, easily. About as likely as the earth being flat.
I can imagine no situation where something that is a required part of computational processes could ever present itself to us as dark matter, and no mistake in physics thorough enough to allow it.
How did you get this figure? Two one-in-a-million implausibilities?
Quantum computers are close to reversible. Each halo could be a big quantum coherent structure, with e.g. neutrinos as ancillary qubits. The baryonic world might be where the waste information gets dumped. :-)
Before learning that reversible computation only requires work when bits are deleted, I would have treated each of my points as roughly independent, with about 10^1.5, 10^4, 10^4, and 10^2.5 odds against respectively. The last point is now down to 10^1.5.
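The arithmetic behind these figures: independent odds multiply, so the base-10 exponents just add, which recovers the 1:10^12 figure quoted earlier (and 1:10^11 after the revision to the last point).

```python
# Independent odds multiply, i.e. their base-10 exponents add.
original = [1.5, 4, 4, 2.5]
revised  = [1.5, 4, 4, 1.5]   # last point lowered after learning about reversible computation

print(sum(original))  # 12.0 → combined odds ≈ 1 : 10^12
print(sum(revised))   # 11.0 → combined odds ≈ 1 : 10^11
```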
Dumping waste information in the baryonic world would be visible.
Not if the rate is low enough and/or astronomically localized enough.
It would be interesting to make a model in which fuzzy dark matter is coupled to neutrinos, in a way that maximizes rate of quantum information transfer, while remaining within empirical bounds.
Contra #1: Imagine you order a huge stack of computers for massively multiplayer gaming. Would you expect it might collapse under its own weight, or would you expect the builders to be cautious enough that it won't collapse like passive dust in free fall?
Contra #4: Nope. Landauer's principle implies that reversible computation costs nothing (until you want to read the result, which then costs next to nothing times the size of the result you want to read, irrespective of the size of the computation proper). Present-day computers are obviously very far from this limit, but you can't assume "computronium" is too.
#2 and #3 sound stronger, imo. Could you provide a glimpse of the confidence intervals and how they vary from one survey to the next?
#1 - Caution doesn't solve problems, it finds solutions if they exist. You can't use caution to ignore air resistance when building a rocket. (Though collapse is not necessarily expected - there's plenty of interstellar dust).
#4 - I didn't know about Landauer's principle, though going by what I'm reading, you're mistaken about its interpretation - it takes 'next to nothing' times the part of the computation you throw out, not the part you read out, and the part you throw out increases in proportion to the negentropy you're getting. Still no free lunch, but one whose price is deferrable to the moment you run out of storage space.
That would make it possible for dark matter to be part of a computation that hasn't been read out yet, though not necessarily a major part. I'm not sure the following reasoning is correct, but: the Landauer limit with the current 2.7 K universe as heat bath is 0.16 meV per bit. This means the 'free' computational cycle you get from only having to pay at the end would, for a maximally efficient builder, reward them with 0.16 meV extra for every piece of matter that can hold one bit. We don't yet have a lower bound for the neutrino mass, but the upper bound is 120 meV. If the upper bound is correct, you would have to cram on the order of 10^3 bits into a neutrino before using it as storage nets you more than burning it for energy (by chucking it into an evaporating black hole).
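Both numbers in this reasoning can be checked directly. A sketch, assuming the 120 meV cosmological-style upper bound on the neutrino mass used above:

```python
import math

K_B = 1.380649e-23        # Boltzmann constant, J/K
J_PER_EV = 1.602176634e-19  # joules per electronvolt
T_CMB = 2.725             # temperature of the CMB heat bath, K

# Landauer limit: minimum energy k·T·ln(2) to erase one bit, in meV.
landauer_mev = K_B * T_CMB * math.log(2) / J_PER_EV * 1e3
print(round(landauer_mev, 2))   # ≈ 0.16 meV per bit

# Bits you'd need to store in one neutrino (mass bound ~120 meV) before
# storage beats burning its rest mass for energy.
bits_breakeven = 120 / landauer_mev
print(round(bits_breakeven))    # ≈ 737, i.e. on the order of 10^3
```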
I don't have data for #2 and #3 at hand. It's the scientific consensus, for what that's worth.
1-3: You are certainly right that cold and homogeneous dark matter is the scientific consensus right now (at least if by consensus we mean "most experts would either think that's true or admit there is no data strong enough to convince most experts it's wrong").
The point I'm trying to make is: as soon as we say "computronium", we are outside of normal science. In normal science, you don't suppose matter can choose to deploy itself like a solar sail and use that to progressively reach the outer regions of the galaxy where dangerous supernovae are less frequent. You suppose that if it exists it has no aim, then find the best non-weird model that fits the data.
In other words, I don't think we can assume the scientific consensus is automatically 10^4- or 10^8-strength evidence on "how sure are we that dark matter is not a kind of matter that astrophysicists usually don't bother to consider?", especially when the scientific consensus also includes "we need to keep spending resources on figuring out what dark matter is". You do agree that's also the scientific consensus, right? (And not just to keep labs open, but really to add data and to visit and revisit new and old models, because we're still not sure what it is.)
4: In the theory of purely reversible computation, the size of what you read dictates the size you must throw out. Your computation is, however, more sound than the theory of pure reversible computation, because pure reversible computation may well be as impossible as perfectly analog computation. Now, suppose all dark matter emits 0.16 meV/bit. How much computation per second and per kilogram would keep the thermal radiation well below our ability to detect it?
Reading the results isn't the only time you erase bits. Any time you use an "IF" statement, you have to either erase the branch that you don't care about or double the size of your program in memory.
Any time you use an "IF" statement: 1) you're not performing a reversible computation (i.e. your tech is not what minimises energy consumption); 2) the minimal cost is one bit, irrespective of the size of your program. Using MWI you could interpret this single bit as representing 'half the branches', but not half the size in memory.
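To illustrate point 2: a reversible conditional doesn't overwrite anything; it keeps the control bit around, so the whole step can be undone at no thermodynamic cost. A toy classical sketch using the Fredkin (controlled-swap) gate:

```python
# Toy reversible "IF": the Fredkin (controlled-swap) gate on classical bits.
# Nothing is erased - the control bit survives, so the gate is its own inverse.
def fredkin(c: int, a: int, b: int) -> tuple[int, int, int]:
    """If c == 1, swap a and b; otherwise pass through. Self-inverse."""
    return (c, b, a) if c else (c, a, b)

state = fredkin(1, 0, 1)     # branch taken: returns (1, 1, 0)
undone = fredkin(*state)     # applying the gate again restores (1, 0, 1)
print(undone == (1, 0, 1))   # True - no information was destroyed
```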
Unless it's computronium made of non-interacting matter, fairly sure. It's not just distant galaxies; there's plenty of dark matter in the Milky Way too.
Non-interaction is exactly what you want if shielding things from causal interaction lets you slip down the entropy gradient more slowly, or something even more exotic.
This is an interesting question. Thank you for asking it.
Why do we think aliens would do things with stars? How can we be sure that our reasoning isn't similar to that of a medieval nobleman trying to gauge the power of the US today?
"How many castles are in their realm? No castles? What, they can field hundreds of thousands of men-at-arms but no horse? What sort of backwards land is this? Is this another realm like the Aztec empire I heard rumours about? Enormous gold reserves and huge armies but none of the refinements of the civilized world! Let's invade!"
You can see how they would make incorrect assumptions if they got to ask the questions!
95% of the universe is 'dark'. What if there's some special method that makes Dyson Spheres obsolete?
The medieval nobleman would be confused by all our fields left fallow for the grass to grow, and all the woodland right next to cities. It must be a desolate land, laid waste by war. People need to cultivate food; people need to burn wood for fuel. Maybe there was a plague?
I think aliens are supremely obvious and outweigh the visible universe. I think they have some superior method of manipulating matter and energy. Maybe they are using pre-existing stocks of dark matter rather than tapping stars; we would see stars disappearing if they were tapping them. Or maybe they have some way to generate matter ex nihilo (which does admittedly go against thermodynamics).
Anyway, we must be many tiers below them. Maybe if you increase the energy of your particle accelerator 100,000,000x you find something exciting that breaks all the rules! Maybe you have to be a superintelligence to get a grasp of what's really going on (quantum gravity) and do some unfathomably complicated technical feat with the resources of a Type 1.5 civilization to unlock the next tier of technologies.
Either there isn't any life, or nanotech and Dyson Spheres aren't the final level of achievement. We can't even unite the fundamental forces. So I think the latter is the case.
A medieval lord would be very confused by our current world, yes.
He wouldn't fail to notice it. No, we don't have castles. But a medieval lord who saw New York City would be very unlikely to say 'hah, no castles, what weak primitives these people are', and even less likely to say 'what? No grain fields? Clearly there is no civilization here!'
It's possible that aliens are more advanced in ways we don't understand. However, they would have to be advanced in ways that result in them entirely ignoring thermodynamics to have no use for the huge amounts of energy in stars.
The medieval lord doesn't get to see New York. He's asking about things he knows well: troops, castles, woodland, farmland. Towns and cities are small and less significant, remember? All societies are agrarian! He doesn't get to see what we want to show him; he's asking us questions and we're answering, wishing we could say 'yes, but you should be asking about our arsenal of nuclear submarines that fire 12 missiles each, with 8 warheads that can incinerate an entire army anywhere in the world within 30 minutes'.
We're looking at stars, the things we know well. Stars, black holes, planets, and dust are 5% of the universe. The entire visible universe is not huge, nor does it have much energy; it is dwarfed by the dark stuff we don't understand.
The entire universe is being torn apart by two mysterious forces that we cannot identify! We are staring at something enormously powerful. If we can't find life in the 5% we understand well, life is very likely in the other 95%.
I'm partial to this idea. People think of a Dyson swarm as a pinnacle technology when it's really more of an engineering problem than a scientific problem for boring modern people. If there are highly advanced aliens out there, I can see it being very possible that their expansion is completely invisible to us because they're operating on a level we can't perceive.