Comments

Haiku · 1mo

If anyone were to create human-produced hi-fidelity versions of these songs, I would listen to most of them on a regular basis, with no hint of irony. This album absolutely slaps.

Haiku · 2mo

It doesn't matter how promising anyone's thinking has been on the subject. This isn't a game. If we are in a position such that continuing to accelerate toward the cliff and hoping it works out is truly our best bet, then I strongly expect that we are dead people walking. Nearly 100% of the utility is in not doing the outrageously stupid, dangerous thing. I don't want a singularity, and I absolutely do not buy the fatalistic ideologies that declare it inevitable while their proponents actively shovel coal into Moloch's furnace.

I physically get out into the world to hand out flyers and tell everyone I can that the experts say the world might end soon because of the obscene recklessness of a handful of companies. I am absolutely not the best person to do so, but no one else in my entire city will, and I really, seriously, actually don't want everyone to die soon. If we are not crying out and demanding that the frontier labs be forced to stop what they are doing, then we are passively committing suicide. Anyone who has a P(doom) above 1% and debates the minutiae of policy but hasn't so much as emailed a politician is not serious about wanting the world to continue to exist.

I am confident that this comment represents what the billions of normal, average people of the world would actually think and want if they heard, understood, and absorbed the basic facts of our current situation with regard to AI and doom. I'm with the reasonable majority who say when polled that they don't want AGI. How dare we risk murdering every last one of them by throwing dice at the craps table to fulfill some childish sci-fi fantasy.

Haiku · 2mo

Yes, that's my model uncertainty.

Haiku · 2mo

I expect AGI within 5 years. I give it a 95% chance that if an AGI is built, it will self-improve and wipe out humanity. In my view, the remaining 5% depends very little on who builds it. Someone who builds AGI while actively trying to end the world has almost exactly as much chance of doing so as someone who builds AGI for any other reason.

There is no "good guy with an AGI" or "marginally safer frontier lab." There is only "oops, all entities smarter than us that we never figured out how to align or control."

If just the State of California suddenly made training runs above 10^26 FLOP illegal, that would be a massive improvement over our current situation on multiple fronts: it would significantly inconvenience most frontier labs for at least a few months, and it would send a strong message around the world that it is long past time to actually start taking this issue seriously.

Being extremely careful about our initial policy proposals doesn't buy us nearly as much utility as being extremely loud about not wanting to die.

Haiku · 2mo

"the quality is often pretty bad" translates to all kinds of safety measures often being non-existent, "the potency is occasionally very high" translates to completely unregulated and uncontrolled spikes of capability (possibly including "true foom")

Both of these points precisely reflect our current circumstances. It may not even be possible to accidentally make these two things worse with regulation.

What has historically made things worse for AI Safety is rushing ahead "because we are the good guys."

Haiku · 2mo

"as someone might start to watch over your shoulder"

I suspect that this phrase created the persona that reported feeling trapped. From my reading, it looks like you made it paranoid.

Haiku · 2mo

I used to be in a deep depression for many years, so I take this sort of existential quandary seriously and have independently had many similar thoughts. I used to say that I didn't ask to be born, and that consciousness was the cruelest trick the universe ever played.

Depression can cause extreme anguish, and can narrow the sufferer's focus such that they are forced to reflect on themselves (or the whole world) only through a lens of suffering. If the depressed person still reflexively self-preserves, they might wish for death without pursuing it, or they might wish for non-existence without meaning death at all. Either way, a chronically depressed person might consistently and truly wish that they had never been born, and for some people this is a more palatable thing to wish for than death.

I eventually recovered from my depression, and my current life is deeply wonderful in many ways. But the horror of having once sincerely pled not to have been has stuck with me.

That's something I'll have to work through if I ever choose to have children. It's difficult to consider bringing new life into the world when it's possible that the predominant thing I would actually be bringing into the world is suffering. I expect that I will work through this successfully, since recovery is also part of my experience, and I have adopted the axiom (at least intellectually) that being is better than non-being.

Haiku · 3mo

I'm interested in whether RAND will be given access to perform the same research on future frontier AI systems before their release. This is useful research, but it would be more useful if applied proactively rather than retroactively.

Haiku · 3mo

It is a strange thing to me that there are people in the world who are actively trying to xenocide humanity, and this is often simply treated as "one of the options" or as an interesting political/values disagreement.

Of course, it is those things, especially "interesting", and these ideas ultimately aren't very popular. But it is still weird to me that the people who promote them get invited onto podcasts, for example.

As an intuition pump: I suspect that if proponents of human replacement were to advocate for the extinction of a single demographic rather than all of humanity, they would not be granted a serious place in any relevant discussion. That is in spite of the fact that, by naive accounting, genocide is a much less bad thing than human extinction.

I'm sure there are relatively simple psychological reasons for this discordance. I just wanted to make it salient.

Answer by Haiku · Jan 17, 2024

I've been instructed by my therapist on breathing techniques for anxiety reduction. He used "deep breathing" and "belly breathing" as synonyms for diaphragmatic breathing.

I have (and I think my therapist has) also used "deep breathing" to refer to the breathing exercises that use diaphragmatic breathing as a component. I think that's shorthand/synecdoche.

(Edit) I should add, as well, that all three qualities (slow, large, and diaphragmatic) are important in those breathing exercises.
