Pausing for what?

by MountainPath
21st Oct 2024
1 min read
1 comment
MountainPath:

The strongest reason for pausing and AI safety I can think of: in order to build a truth-seeking superintelligence, one that does not merely maximise paperclips but also tries to understand the nature of the universe, you need to align it to that goal. We have not yet accomplished this, or figured out how to do so. Hence, regardless of whether you believe in the inherent value of humanity, AI safety is still important, and pausing probably is too. Otherwise we won't be able to create a truth-seeking ASI.


I care about AI safety because I believe that preserving humanity's potential is one of the most important things to work on. The Precipice lays out a future for humanity that I find very inspiring.

Preserving humanity's potential seems so important to me because it will enable us to build the institutions and technology that will allow us to understand the nature of the universe. What is the truth? Is there such a thing as the truth? Is there some destiny that we ought to realise?

As uncontrollable AI seems like the biggest reason why we (humanity) won't be able to figure these questions out, a big part of my mission in life is to make sure that AI does not kill everyone.

This has also made me worry that we don't have enough time to solve the alignment problem, and that we need to pause to buy ourselves enough time.

Today I encountered a crux that might change my entire perspective, and I feel a bit intimidated by it.

Crux: what if more intelligent species are simply better at answering the questions above?

Just because we can feel pleasant and unpleasant emotions, and we are conscious, does that make us more valuable? Are these the things that should be prioritised over truth-seeking? I genuinely don't know.

People who support a pause posit that it is our only hope to survive, and I think that is likely true. But what if the ultimate goal is not the survival of the human species, but to understand the truth of the universe and to act on it?

What if humans are not in the best position to actually do that? What if a superintelligence could realise this goal in the most effective way?