Aren't these just the classic stages of grief? There's denial (thinking it can't be true), anger and bargaining (various amounts of trying to do something, to lash back, to find ways in which maybe we can get out of it, save ourselves from the Permanent Underclass or whatever), depression (doom and gloom) and finally acceptance.
It's also hard to say where the appropriate point to stop is. Maybe it really is all overinflated and we're stressing over nothing. Maybe there really are things we can do to stop the very specific human culprits of this whole mess, and not doing them is cowardly and lazy, not wise. And no matter how much one wants to be positive and whimsical, the death of all does sound like a somewhat depressing prospect. WWPD (What Would Pollyanna Do)?
The classic stages of grief don't necessarily happen in order, or happen at all. They're just things grief can cause.
You can live whimsically and work to save the world. Choose your own balance.
You don't have to be technical to do important work. Public opinion creates governance and spreads the meme that this stuff is dangerous and everyone building it should be very cautious.
Can't do reactions on top level posts for some reason, but if I could I would "heart" this. It's a weird time to be alive but (upon reflection) there is a lot to be grateful for.
As someone whose worldview was upended over the last few years because of AI progress, this post resonates. Sometimes the situation we are in just seems absurd – like, all I can do is laugh, shrug, and then go back to what I was doing. And I think this is an emotionally healthy response. Sometimes the best reaction to reality is an unbothered acknowledgement of its absurdity.
But I do worry that speaking of doom like this – as if it is nearly-certain[1] – is counterproductive.
I think everyone should invest time in mentally preparing for an uncertain future. Finding a way to be okay with the possibility of disaster, while staying motivated to work hard to avoid it.[2]
This is important whether or not you think doom is likely. In all worlds, those who can be effective regardless of their perceived odds of success are the ones most capable of succeeding.
I have personally managed to find a healthier relationship with the world; retaining some whimsy and optimism despite a sober reckoning with reality. And I feel like I can actually achieve things now, as a result.
I want others to experience this as well. The ideas have already been discussed, but emotionally internalizing them takes a while, and it certainly did for me.
I hope we find better ways of communicating this so that all of us could get better at achieving our goals 🙂
When the emotional advice you give is downstream of that prediction. For example, Dying with Dignity, or "Life may be very short. So make the next few years the best ones."
I've been thinking about this comment a lot, although I can't attest to any of the specific recommendations.
To me it feels pretty emotionally clear we are nearing the end-times with AI. That in 1-4 years[1] things will be radically transformed, that at least one of the big AI labs will become an autonomous research organization working on developing the next version of its AI, perhaps with some narrow human guidance for oversight or for acquiring more resources, until robotics is solved too.
And I believe there will be some nice benefits at first, with the AI organizations providing many goods and services in exchange for money, to raise capital so that the self-improvement and resource-acquisition loop can continue.
But I’m not sure how it will ultimately turn out. Declaring the risk of extinction-level events less than 10% seems overconfident. Yet declaring the risks to be >90% also seems overconfident. I generally remain quite uncertain about which factors will dominate. Maybe AIs will remain friendly and, for decision theory reasons, continue to put some fraction of their resources toward looking after us to some extent, as a signal that future entities should do the same for them. Maybe the loop of capital acquisition is so brutal and molochian that doom wins at least once. And people have been confidently wrong about doom in the past. So I remain unsure. I just say I'm 50:50 on it.
But it also feels like, as an individual who does not have any particular position of influence or power, things are mostly out of my control. There are actions I can take that can maybe push things one way or another. I should seriously take these actions, and the bottleneck feels mostly like not exploring the option space enough.
But how should one feel about it all?
If one seriously believes that one has ~2 years left where either you die, or actions will become insignificant, what should one do?
One emotion one can feel at first is often a sense of doom and despair. That there is “nothing one can do”. That one should wallow in self-pity: “woe is me, I wish I could have a longer life”. I get it.
But also, just get over it.
Maybe I find it easier since I have emotionally grappled with the conclusions of nihilism a lot before. But really the only way out is to not care that you will eventually die (whether soon or at the heat-death of the universe), and to try to live a good life anyway.
But it is possible to just choose a better reaction and not worry about it.[2]
Another emotion one can feel is a frantic “I must do something, I must do something”. I think this is a pretty reasonable emotion to feel. You should probably follow it. The phases for this feel something like: 1) Thinking of one’s own ideas for a while and feeling you can do things. 2) Realizing most of the ideas you thought of are already being tried by others, and feeling hopeless. 3) Realizing that despite this, there is work to be done that could plausibly be useful, and maybe it feels marginal, but you should do it anyway.
If doing research, I think it is worth having some loops of [exploring which thing seems actually most useful to do right now] and [spending time exploiting that thing to the point you’ve made some substantial progress]. These days with Claude Code, the latter seems particularly easy to do, and will probably keep getting easier. Sometimes this may mean that the value is tilted towards [slightly improving the kind of work AI will do in the future] rather than [making something directly useful now]. I think both kinds of work seem valuable, but it’s worth having this in your mind explicitly.
There is little enough time that you should do any work you can, but enough time that you need to be deliberate about how you spend it, and not get burnt out[3]. If you are the type of person who can spend 100% of their days working on something with deep focus for years, go do that, but you’re probably not reading this if you are.
But I think with these short timelines, there is another thing you can feel, which is that the life you have left may be pretty short. Maybe you will live, maybe not, but even if we live, your life will be very different and transformed.
I used to be very emotionally bought into some things: delayed gratification. FIRE. The marshmallow test. Being Stoic. Don’t burn bridges. Don’t make anyone dislike you. Don’t stand out for the wrong reasons. Alter your thoughts and behaviors so as not to be too cringe. Delay things until after the singularity.
I think even in normal circumstances, erring too much towards these is not great.[4]
But with short timelines, it feels like an extreme waste of what little valuable time we have left to be exclusively worrying about these things[5].
You should have fun.
You should do things that you think are weird.
You should spend your money on things that will improve your life. Yes even that too. I know it’s painful.
You should notice the subtle things that make you sad, and not just brush them off, but fix them.
Don’t compromise on your morals or other things that don’t need to be compromised.
You should get past the awkward roadblock in your head and do that thing.
You should get someone to hold you to account for doing the things you really want to do.
Go on that trip you want to go on.
Be cringe.
Sing that karaoke. Do those dance moves. Write that blog post.
Ignore the people who might think you are cringe. They don’t really care that much.
Put on the cat ears you always wanted to wear.
And maybe you will befriend the people who are cringe in just the same ways as you.
Life may be very short. So make the next few years the best ones.
Live your life with whimsy.
I tend to err too much towards low confidence, but I would say this timeline is something like a 50% confidence interval. If I think about it, I could see it taking ~10 years longer, depending on what threshold you want to use, for more like 90% confidence, conditioned on no AI pause/moratorium. Emotionally the 1-4 year period feels most correct.
I don’t provide evidence for timelines here. I may describe what feels salient to me at some other time, but other people have put much more effort into describing short timelines.
yeah skill issue ngl
I think noticing you are burnt out can be quite difficult if you’re not sure what it’s like. I felt real guilt at the possibility I could be burnt out, on top of guilt over how few of my hours per week felt like actual work. If you are even holding the hypothesis, you should probably spend some time seriously considering it. It’s not that bad if you need to take a real, actual break from what you are doing. Things might not feel as pressing from a distance. Other people are doing their own work too.
The law of equal and opposite advice applies: https://slatestarcodex.com/2014/03/24/should-you-reverse-any-advice-you-hear/
Exceptions apply for “I work in a field such as politics/law where reputation with normal people is extremely important”. If you’re not sure and just want to keep option value open, then this exception probably doesn't apply to you. And you still might be erring too much towards reputation.
Additionally, I think things might still turn out fine, so don’t do things that are reckless and put your life at risk in the short term. Avoid physically dangerous activities. Get your cryonics plan sorted. Etc.