
I've read so many posts highlighting the dangers of AGI that I often feel terribly anxious about it. I'm pretty young, and the idea that a possible utopia is waiting for us, yet slipping through our fingers, kills me. But even more than that, I worry that I won't have the chance to enjoy much of my life. That the work I've put in now won't amount to much, and that the relationships I've cultivated will never really get the chance to grow for the decades that should be every human's right.

Even just earlier today, I was reading an article when my cat came up to me and started rolling around next to my leg, purring and playing with me. She's pretty old, certainly not young enough for any chance at biological immortality. I was struck by the sense that I should put down my laptop and play with her, because the finite life she has here deserves to be filled with joy and love. Even if there's no chance for her to live forever, what she has should be, and has been, made better by me. A long, full life of satisfaction is enough for her.

I don't necessarily mind missing out on utopia. I'd obviously like it to happen, but it's inconceivable to me. So if a billion years of technologically-enhanced superhumanity isn't in the cards for me? I'll be okay.

But there's no one there to make sure that I get the full allotment of life I've got left. I feel overwhelmed by the sense that a few decades from now, if this problem isn't solved, the flame of my life will be snuffed out by a system I don't understand and could never defeat. I'll never have that long-term marriage, that professional career, or the chance to finally reach the expert level in my hobbies. I'll just be gone, along with everything else that could possibly matter to me.

If I can't have immortality, I at least want a long, peaceful life. But the threat of AGI robs me of even that possibility, if it's as certain a disaster as I've come to believe.

I think the general point he's making still stands. You can always choose to remove the Werewolf Contract of your own volition, then force any sort of fever dream or nightmare onto yourself.

Moreover, The Golden Age also makes a point about the dangers of remaining unchanged. Orpheus, the wealthiest man in history, has modified his brain so that his values and worldview can never shift. This puts him in sharp contrast to Phaethon, the protagonist, whose whole arc is about shifting the strict moral equilibrium of the public to make important change happen. Orpheus, trapped in his morals, is as out of touch in Phaethon's era as a Catholic crusader would be in modern Rome.

Answer by Felix C. · Mar 06, 2023

I think the analogy comparing human militaries to ants is a bit flawed, for two main reasons.

1. Ants don't build AGI - Humans don't care about ants because ants are so uncoordinated in comparison and can't pose much of a threat. Humans, on the other hand, can pose a significant threat to an ASI: building another ASI.

2. Ants don't collect gold - Humans, unlike ants, control a lot of important resources. If every ant nest were built on a pile of gold, you can be sure humans would actively seek out and kill ants. Not because we hate ants, but because we want their gold. An unaligned ASI will want our chip factories, our supply chains, our bandwidth, and so on, all of which we would be much better off keeping.