Jordan Stone
Interstellar travel will probably doom the long-term future
Jordan Stone · 3mo · 12

Yeah, there are definitely a lot of footnotes - the post was originally a lot longer, but I ended up focusing on the galactic-scale stuff.

I think it's okay that they don't quite line up - it's clear what's happened, and people can still access that extra info if they want to read it.

Interstellar travel will probably doom the long-term future
Jordan Stone · 3mo · 20

By the time there is an interstellar civilization, humanity would almost certainly already have caused a superintelligence to be created

This is definitely correct, but I think this scenario hops over the decision point that I'm referring to (i.e., the point at which the governance system I describe becomes necessary). It isn't the time of the creation of interstellar civilisation that I care about, but the point at which interstellar civilisation becomes inevitable. This may be the first interstellar colonisation mission, which begins a self-propagating expansion of human civilisation throughout the galaxy. If that propagation begins without an extremely forward-thinking governance system embedded within it (seemingly one that would have to be superintelligent), then the eventual creation of a galactic x-risk seems inevitable.

I think it's very plausible that superintelligence comes after the first interstellar mission. One reason is that transformative AI, which would initiate explosive technological growth, would presumably come before superintelligence. So we would probably gain the ability to launch an interstellar colonisation mission before superintelligence is developed.

To be fair, we would almost certainly be able to catch up with an interstellar colonisation mission post-superintelligence, which might follow it quite quickly. Though I'm not sure that's the best way to go about ensuring long-term existential security, as the colonists may be resistant to being governed by an AI. So I would prefer to play it safe and avoid interstellar travel completely until the problems I outlined are solved - that's more of a personal preference than a strong opinion though.

 

And any external dangers likely can't have a meaningful technological advantage, since all technologically mature civilizations have a similar level of knowledge about the physical world and ability to produce any relevant physical technology.

I'm not sure I understand this. Are you saying that if an external civilisation (i.e., an alien civilisation) initiates a galactic x-risk, then another civilisation governed by a superintelligent AI would be able to protect itself because it would have similarly mature technology? I'm not sure there is anything we could do to defend ourselves if an alien civilisation initiated vacuum decay.

 

Also, distributed backups make any local physical damage somewhat ephemeral, as long as the rest of the universe under the same governance (expanding with colonization) has enough matter to work with, to restore anything of value that got its original physical substrate taken away.

Yeah, definitely. All of the galactic x-risks I listed, though, are either self-propagating or infinite in range, except perhaps the societal collapse and loss-of-value scenarios.
 

Interstellar travel will probably doom the long-term future
Jordan Stone · 3mo · 20

Crossposted to the EA Forum: https://forum.effectivealtruism.org/posts/x7YXxDAwqAQJckdkr/interstellar-travel-will-probably-doom-the-long-term-future
