TLDR: Some billionaires support the EA/rationality communities. But I largely trust those communities based on my gut.

In my last post, I said that because effective altruist / rationalist writers seemed smart, I cautiously trusted many of their opinions. And I totally wasn’t brainwashed by them! But is my intuition reasonable evidence to justify my trust?[1] After all, if I cautiously trusted QAnon writers, that wouldn’t make it more likely that Pizzagate happened. 

I’ve trusted the wrong people before too. The only thing my last mentor / “spiritual advisor” (his words) pushed me to accomplish was drinking 11 shots in 4 minutes.[2] And I found him because I liked his blog post!

So who am I to assume it makes sense to trust the EAs/rationalists? Shouldn’t I trust “successful people”?

Where Are All The Successful Rationalists And Effective Altruists?

Where are All the Successful Rationalists?, by Applied Divinity Studies (ADS), points out that it’s been 13 years since Eliezer Yudkowsky wrote the sequences (aka Rationality: From AI to Zombies).[3] It also notes that Yudkowsky claimed rationalists should “win!”

ADS doesn’t feel that rationalists have been successful. They claim the most successful rationalists they can think of are Oxford philosophy professor Nick Bostrom, GiveWell and Open Philanthropy founder Holden Karnofsky, Karnofsky’s GiveWell co-founder Elie Hassenfeld, and effective altruism founder / Oxford philosophy professor Will MacAskill. And ADS says it feels like cheating to count them because their success is based on spreading rationality. 

(Every successful rationalist listed above is someone I would consider an effective altruist. So I’m commenting on the article as if it’s aimed at both communities.)

So does that indicate that rationalist advice doesn’t actually lead people to succeed? 

ADS thinks so. And I mostly agree. I wouldn’t expect any book, even a 1,600-page series of books like the sequences, to reliably lead people to fame or fortune.[4] I wouldn’t expect lurking on LessWrong or reading Astral Codex Ten to do that either.

However, I still think what I read is extremely important. Blog posts led me to move to San Francisco and got me into EA/rationality.[5]

So I think it’s worth thinking about whether reading EA/rationalist content provides me enough value.[6]

Finding The Successful EAs/Rationalists

ADS’s article was published on September 5, 2020. At the time, the name Sam Bankman-Fried had only been mentioned three times on the EA Forum. And it had never been mentioned on LessWrong. I’d never heard of him either. But I imagine ADS and many more people know his name now. At only 30 years old, he’s quickly become the wealthiest person in the EA community. As of June 25, 2022, Forbes estimates his fortune at $20.4 billion. 

I don’t know how much reading about rationality has helped him. But the EA community seems to have influenced his decisions when he was in college. And many successful people appear to take EA/rationalist ideas seriously. I don’t know if Elon Musk counts as an effective altruist. (If so, he’s the richest person in the EA community, and in every other community too.) But he did speak at an EA Global conference in 2015. And he partnered with effective altruism community member Igor Kurganov to donate Tesla stock valued at $5.7 billion in November 2021. (Never mind.[7])

Likewise, Peter Thiel has spoken at EA events, and his foundation has donated to a rationalist organization. Dustin Moskovitz has historically been the biggest funder of EA organizations. Other billionaires who have donated to EA organizations include Vitalik Buterin, Jed McCaleb, and maybe Reid Hoffman[8].[9] And, as ADS mentions, Patrick Collison and Paul Graham are both fans of rationalist blogger Scott Alexander.

I could list many more successful people who seem to support the EA/rationalist communities.[10] And my impression is that while EAs/rationalists aren’t ruling the world, most of them have higher IQs and/or incomes for their age than the average person in their country.[11]

All the people I’ve mentioned seem to have good financial judgment. They think it’s worth donating money to EA organizations and/or reading rationalist writers. That’s an indicator that EAs/rationalists should be trusted.

How Successful Are The EAs/Rationalists?

I may have massively understated the case for the success of the EA/rationalist communities. 

ADS’s article mentioned that one reason there should be more wealthy EAs/rationalists is that they promoted “earning to give” (i.e., making as much money as possible to donate to effective charities). But my impression is that the EA community started to deemphasize earning to give by the end of 2015.[12] And Ben Todd estimates there were only about 1,000 people making their career decisions based on effective altruism then. One of those 1,000 people, Sam Bankman-Fried, has since earned more money than anyone else in the world aged 30 or younger!

Because Bankman-Fried, Moskovitz, and other wealthy donors seem able and willing to fund “impactful” organizations, my impression is that most prominent figures in the EA community, all else being equal, encourage people to work directly on solving the most important problems in the world rather than earn to give. They’d generally say the most important cause is AI alignment.

The people most invested in EA seem to have listened to that message. I’ve met more people in the EA/rationalist communities aiming to become AI researchers than aiming to found for-profit companies.

That would be great if AI alignment truly is the biggest problem in the world and people are making progress on solving it. That would make those people “successful” to me. 

Conclusion

The support of the “successful people” I highlighted is a minor reason I trust the EAs/rationalists. There are plenty of “successful people” I don’t trust. If Elon Musk didn’t care about AI alignment, I could’ve dismissed him as someone who’s not an AI expert. I could’ve pointed to this list of reasons to distrust Elon Musk. 

And if I didn’t want to value Peter Thiel’s opinion, I could’ve said I don’t support all of his donations. I could probably find a reason not to trust any “successful person” I mentioned.

I didn’t provide great evidence that these “successful people” should count as EAs/rationalists either. Peter Thiel hasn’t donated to MIRI since at least 2015. Patrick Collison reads Scott Alexander, but he’s critical of effective altruism.[13]

Plus, I didn’t ask myself why more successful people aren’t interested in EA/rationality, or whether there are other communities with more “successful people.”

That didn’t matter much to me. I was into EA before I knew any of those billionaires had any association with EA/rationality. And I could’ve found a way to defend EA/rationality even if I couldn’t find any successful people supporting them. I could’ve said the community is still young. I could’ve said most billionaires don’t have “EA values.” And, as I said, I don’t think EAs/rationalists have emphasized “earning to give”[14] since 2015.[15]

I would update my opinion about the value of EA/rationalist content if I could somehow find out that their advice generally led people to be more irrational. But I’d need to thoroughly analyze multiple studies making that claim or gather a ton of anecdotal evidence before believing that.

Why do I trust myself? Because I was a software engineer in Silicon Valley? Because I got a 32 when I took the ACT in 2008? Those would be minor reasons.

Ultimately, I trust my gut. I hope it rationally gathers and analyzes evidence.

(cross-posted from my blog: https://utilitymonster.substack.com/p/not-brainwashed-but-stupid)

  1. ^

I’m referring to my trust that EAs/rationalists do a good job trying to make accurate statements (i.e., epistemic trust). I confidently trust that EAs/rationalists share many of my values.

  2. ^

    And there’s a tiny chance he married me to my StreetWars assassin.

  3. ^

    Yudkowsky wrote “the sequences,” a series of blog posts, from 2006-2009. These posts are essentially the closest thing to the rationalist bible.

  4. ^

    Yudkowsky acknowledges that he didn’t try to make the sequences especially accessible. I’ve listened to about ten posts from the sequences. I struggled to understand them thoroughly. The podcast I think I listened to seems to have been removed, but there’s another audio version available. I’d probably try reading them if I decide to give them another shot.

  5. ^

    Specifically, this article about the movie Her led me to find Medium. Medium led me to this article, which I liked enough to follow its author. That presumably led me to this article which led me to move to San Francisco and meet my friend Matt Kim. Matt sent me this article, and that site led me to learn about effective altruism.

  6. ^

    Granted, I think some of the value I get from EA/rationalist writing is entertainment value. But I mainly feel entertained because reading makes me feel smart. So if I didn’t think these posts were well-reasoned, I doubt I’d still be entertained by them enough to continue reading them.

  7. ^

    Musk recently tweeted that Will MacAskill’s book, What We Owe The Future, is a close match for his philosophy. So maybe he’ll donate to EA causes in the future.

    I’m adding this footnote on August 17, 2022. I may not continue to update this post to add news related to Musk and EA.

  8. ^

    I tried looking for 5-10 minutes to find another record of Hoffman’s donation.

  9. ^

    Jaan Tallinn may not be a billionaire, but he’s a notable EA donor too. And Moskovitz’s wife, Cari Tuna, and Bankman-Fried’s FTX cofounder Gary Wang also donate to EA orgs. Supposedly, there’s a secret EA billionaire too.

    I may not continue to update this post to add information about which billionaires donate to EA/rationalist orgs. It contains the info I have as of August 26, 2022.

  10. ^

My impression is that Lincoln Quirk and Ben Kuhn from Wave, Emerson Spartz, Jason Matheny, Liv Boeree, Dan Smith, Martin Crowley, Tom Crowley, and Robin Hanson consider themselves part of the EA and/or rationalist communities. Conor White-Sullivan and Matt Yglesias have posted on the EA Forum or LessWrong. Joseph Gordon-Levitt has spoken at an EA conference. Bill Gates, Andrew Yang, and Bryan Caplan have referred to effective altruism positively. Dominic Cummings, Tyler Cowen, Ezra Klein, Steven Pinker, and Arram Sabeti read rationalist content. And Nate Silver, Ben Thompson, and Matt Levine read Scott Alexander.

  11. ^

That impression is based on my interpretation of the 2020 EA Survey and the 2019 Slate Star Codex reader survey. The EA Survey chart may be misleading since most EAs are in the U.S. and UK. And as of 2020, the median EA age is 27 and the mean is 29.

  12. ^

    That belief is based on this 80,000 Hours article and this Vox article about EA Global 2015. 80,000 Hours seems to play a huge role in getting people involved in effective altruism.

  13. ^

    I disagree with his criticism. There are plenty of EA efforts to work on things that are hard to quantify, such as progress on AI alignment.

  14. ^

    If Moskovitz hadn’t started donating money to EA, I’d bet earning to give would’ve been encouraged by the EA community for a longer period of time. It would’ve been interesting to see if that led to more “successes” like Sam Bankman-Fried.

  15. ^

    ADS wrote a post saying that “the rationalist community is young” isn’t a good excuse for their lack of success. But I’d rebut that by citing the points about values and earning to give that I just mentioned. And, as ADS said, the successful professor they cited reads rationalist blogger Alexey Guzey. So that’s evidence for the premise that it’s worth reading EA/rationalist content.

Plus, ADS asks: if the average rationalist is 30, does that mean the average person reading Eliezer Yudkowsky in 2007 was a teenager? I’d bet no. Most likely, younger people have joined the rationalist community since 2007. (EA seems to do a lot of outreach at colleges. I’d presume many people find the rationalist community through EA.) I imagine it takes time to go from reading rationality advice to consistently applying it in practice.

Comments

There are plenty of “successful people” I don’t trust.

Being successful is a necessary condition for trusting them, not a sufficient condition.

I had a problem understanding your thesis, and I am still not sure that I do. It feels like... uhm, let me express it with a modified version of the Litany of Tarski:

If rich people approve of rationality, I desire to be rational.

If rich people disapprove of rationality, I desire to be irrational.

Let me not become attached to epistemic strategies that may be uncool.

I meant to convey that I was evaluating my trust in the rationalist community, not rationality itself. 

And I concluded that the opinions of really successful people are only a minor factor affecting my trust in the rationalist community.

How much gain do you think is actually available for someone who is still limited by human tissue brain performance and just uses the best available consistently winning method?

The consistently winning method with the best chance of success isn't very interesting. It just means you go to the highest-rated college you can get accepted to, you take the highest-paying job from the highest-tier company that you can get in at, you take low-interest loans, you buy index funds as investments and stay away from lower-gain assets like real estate, and so on.

Over a lifetime, most people who do this will accumulate a few million dollars in net worth and have jobs that pay ~10 times the median (~$400k for an ML engineer with ~10 years of experience, or a specialist doctor).

I would argue that this, for most humans, is the most rational approach. Better to have 75-90 percent of your futures be good ones than to have only 5% of your futures be good ones, even though your average income is far higher in the second case (because the few futures where you become a billionaire bring up the average).

Extraordinary success (billions in net worth) generally requires someone to either be born with millions in seed capital (non-rational), or to gamble it all on a long shot and get lucky (non-rational), or both.

Billionaire-level success is non-replicable. It requires taking specific actions that in most futures would have failed, during narrow and limited windows of opportunity, and having had the option to take those actions at all. This is essentially the definition of luck.

I'd say it's rational to maximize expected utility. The small probability of an enormous success could outweigh the larger probability of a failure that won't ruin your life.
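
To make that concrete, here is a toy expected-value comparison of the two strategies described above. All of the payoffs and probabilities are hypothetical, chosen only to illustrate the point; they aren't estimates from the post or this thread:

```python
# Toy comparison of the "safe path" vs. the "long-shot gamble" discussed above.
# All numbers are made up purely for illustration.

def expected_value(outcomes):
    """Sum of payoff * probability over (payoff, probability) pairs."""
    return sum(payoff * prob for payoff, prob in outcomes)

# Safe path: very likely to end up with a few million dollars in net worth.
safe = [(3_000_000, 0.85),   # comfortable outcome
        (500_000, 0.15)]     # below-average outcome

# Long-shot path: a small chance of an enormous payoff, a large chance of little.
gamble = [(200_000_000, 0.05),  # the startup works out
          (100_000, 0.95)]      # it doesn't

print(expected_value(safe))    # 2,625,000
print(expected_value(gamble))  # 10,095,000: higher average, but 95% of futures are worse
```

Of course, a real comparison would use a utility function over money rather than raw dollars (with roughly logarithmic utility, the safe path can still come out ahead), which is exactly where the two views in this thread diverge.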

How much gain do you think is actually available for someone who is still limited by human tissue brain performance and just uses the best available consistently winning method?

Quite a bit.

https://www.gwern.net/on-really-trying

I said that because effective altruist / rationalist writers seemed smart, I cautiously trusted many of their opinions.

Just generally smart? Why not trust people with qualifications and experience in specific fields?

I was paraphrasing. I agree it makes sense to trust people when they're talking about things they seem to know more about.