Elizabeth

Follow-up: if you would disagree-vote with a react but not karma-downvote, you can use the opposite react.

While we're at it, can it be >99% to match <1%?

As a follow-up to my previous poll: if you've worked closely with someone who used stimulants sometimes but not always, how did stimulants affect their ability to update? Please reply with emoji reacts: <1% for "completely trashed", 50% for "neutral", >99% for "huge improvement".

Comments with additional details are welcome. 


Sorry, I missed this too. The first-pass transcript was indeed done by AI. I went over it probably dozens of times, but I guess not enough.

Can you share data on the size of PauseAI protests over time?

 

https://www.youtube.com/shorts/eUxJZk6niBI


Note that at the time of the donation, Altman was co-chair of the board and two years away from becoming CEO.


Reasoning through a new example:

There's no Google Maps and no internet to help with finding a hotel. You haven't chosen a destination city yet.

You could work out how to choose hotels, or facilitate the group identifying the kind of hotel it wants. Both are robustly useful.

You could also start picking out hotels in cities at random. My intuition is that doing this before you know the city is still marginally useful: you might end up choosing that city, and it's obviously more useful the smaller the set of possible cities is. But even so, it's nonzero useful.

OTOH, one of the best ways to build hotel-identifying skills is to identify a hotel, even if you never use it. A few practice runs choosing hotels in random cities probably do help you make a reservation in a different city later.

My shoulder John says "dry-running hotels is a fine thing to do as long as you're doing it as part of a plan to get good at a generalizable skill." I agree that's ideal, but not everyone has that skill, and one of the ways to get it is to gradient-ascend on gradient ascending. I worry that rhetoric like this, and related messaging I see in EA and rationality encouraging people to do the most important thing, ends up paralyzing people when what they actually need is to do anything at all so they can start iterating on it.

One possible reason: bouncing off early > putting in a lot of effort and then realizing you'll still never get traction > being kicked out. Giving people false hope hurts them.

I don't think you should never help out a new person, but I reserve it for people with very specific flaws in otherwise great posts.
