I was actually taking allergy pills because they help with my asthma symptoms, and nasal sprays don't seem to help me. Partway through a stretch of taking Zyrtec every day, I started Singulair (montelukast), and it seems to be more effective without the sleep-related issues (although the combination was still more effective for my asthma than either alone).
But yeah, for ordinary allergy symptoms, delivering the medicine to your nose instead of your entire bloodstream is also a good idea. There's even an antihistamine nasal spray now (Astepro), although I haven't tried it.
I don't see anything about Buddhist monks or stop lights in the original article? I think you might be doing the thing where there's an argument for this inside of your head but you're not providing it to us.
It seems like your meanings are just based on Western culture. I would have expected a post about universal meanings to show how cultures pre-Western-influence used the same colors in the same ways.
It also seems like you haven't provided evidence for most of your meanings?
You're placing a lot of weight on the "likely" here:

> Superintelligence is likely to arrive within the next 20 years
What if it doesn't? Reaching retirement age with no savings sucks pretty badly (especially if you're used to spending your entire income). Short AI timelines might push you in the direction of smaller retirement savings (accepting a small loss in the timelines you think are unlikely), but probably not all the way to "don't save for retirement". You should also put at least some weight on AGI arriving and the world somehow still existing in a recognizable form (since that's what has happened every other time a world-changing technology was created).
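The trade-off above can be sketched as a toy expected-value calculation. Every number below is a made-up assumption for illustration, not anyone's real estimate or a recommendation:

```python
# Toy expected-value sketch of the save-vs-don't-save decision.
# Probability and utility values are hypothetical placeholders.
p_agi_soon = 0.5  # assumed chance of transformative AI before you retire

def expected_utility(saved: bool) -> float:
    # Illustrative utilities: extra spending now is a small win if AGI
    # arrives; reaching retirement with no savings is a large loss if not.
    u_if_agi = 0.0 if saved else 1.0
    u_if_no_agi = 10.0 if saved else -50.0
    return p_agi_soon * u_if_agi + (1 - p_agi_soon) * u_if_no_agi
```

Under these particular numbers, saving comes out ahead even at a 50% chance of short timelines, because the downside of being wrong is so asymmetric.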
Something else to consider is that retirement isn't the only reason to have savings. If you think AI timelines are short, you might want to have a giant pile of money you can strategically deploy to do things like quit your job and work on alignment for a few years (or pay someone else to).
Could you give more info about how and when you use this? Do you just use the sleep setting and the wakeup settings, or do you also use it during the day?
What intensity did you find worked well for you starting and what did that go down to now?
I want to try this out, but I'm not sure how to replicate what you're doing.
I already emailed you about this, but it might be useful to share here too for feedback.
I think it would be bad to use a technology if:
I don't think (1) is currently the case with ChatGPT, and I think people like Eliezer agree.
I think using ChatGPT, even the paid version, won't increase the resources going into AI capabilities, because the big AI companies aren't funding-constrained. If OpenAI needed another billion dollars, they'd just sell a billion dollars' worth of stock. My $20 per month probably increases the price people would pay for that stock, but that just reduces the dilution existing shareholders face and has no real effect on their ability to raise money if they need it.
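The dilution point can be made concrete with some toy arithmetic. The dollar figures here are hypothetical assumptions, not OpenAI's actual numbers:

```python
# Toy dilution arithmetic; all dollar figures are made-up assumptions.
valuation = 80e9      # assumed pre-money valuation
raise_amount = 1e9    # the "another billion dollars" from the comment

# Fraction of the company sold to raise that amount.
dilution = raise_amount / (valuation + raise_amount)

# A marginally higher valuation (e.g. from subscription revenue) only
# shrinks that fraction slightly; it doesn't gate access to funding.
dilution_at_higher_valuation = raise_amount / (90e9 + raise_amount)
```

Either way the company gets its billion dollars; subscription revenue just nudges how much of the company that costs.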
I might feel differently if my usage increased other people's usage of ChatGPT (although I also think hype is so high it would be very difficult to meaningfully increase it), if I was part of a big enough coalition that our boycott was meaningful and noticeable, or if I was using it at a scale where the money paid to OpenAI was significant (I would consider free-riding with open source models to avoid funding capabilities).
I am trying to think of something that's been banned purely for posing existential risk and coming up blank.
Weren't CFCs banned for existential reasons (although only after an alternative was found, because apparently it would be better to die than to go without refrigerators...)?
Isn't doing well with technological unemployment really easy if you have a good job now, through the magic of investing? The hard part would be figuring out what to do if you're currently in a low-paying job, but I doubt that's common here.
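The "magic of investing" point rests on compounding, which a minimal sketch can show. The savings rate, return, and horizon below are assumptions for illustration only:

```python
# Minimal compounding sketch; all numbers are illustrative assumptions.
savings_per_year = 30_000   # assumed annual savings from a good job
real_return = 0.05          # assumed 5% real annual return
years = 20

balance = 0.0
for _ in range(years):
    # Deposit at the start of each year, then let the year's return accrue.
    balance = (balance + savings_per_year) * (1 + real_return)
```

Under these assumptions the final balance is well above the $600k actually deposited, which is the whole case for front-loading savings while the good job lasts.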
> Accidental Terraforming is neutral, an effect that human civilization has had upon the planet of our birth. It isn’t necessarily good or bad, but it does require careful thought and inquiry to determine if this is the course we want to set.
I don't think this is neutral, though. I agree that the Earth's feelings about the matter are irrelevant, but terraforming means making a planet more like Earth and more habitable for humans, while climate change is making the planet less like the Earth we're used to and plausibly less habitable for us. Something like "Venusforming" makes it clearer that we're pushing the climate in a direction we may not want to go.
Isn't 99.9% confidence in this pretty extreme? If a thousand randomly selected similar cases played out, 999 of them would be lab leaks and only one would involve the investigators missing something?
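For concreteness, here is the arithmetic behind that frequency reading of 99.9%; this is just what the number means, not a claim about the actual case:

```python
# What 99.9% confidence means in odds and frequency terms.
p = 0.999
odds = p / (1 - p)                 # roughly 999:1 in favor
cases = 1000
expected_misses = cases * (1 - p)  # about one case in a thousand
```

999:1 odds is the kind of confidence usually reserved for claims with direct, overwhelming evidence, which is the point of the question above.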