aarongertler

Comments

For Better Commenting, Avoid PONDS

Would you be interested in crossposting this to the EA Forum? I think your points are equally relevant for those discussions, and I'd be interested to see how posters there would react.

As a mod, I could also save you some time by crossposting it under your account. Let me know if that would be helpful!

aarongertler's Shortform

Epistemic status: Neither unique nor surprising, but something I felt like idly cataloguing.

An interesting example of statistical illiteracy in the field: This complaint thread about the shuffling algorithm on Magic: the Gathering Arena, a digital version of the card game. Thousands of unique players seem to be represented here.

MTG players who want to win games have a strong incentive to understand basic statistics. Players like Frank Karsten have been working for years to explain the math behind good deckbuilding. And yet, the "rigged shuffler" is a persistent belief even among reasonably engaged players; I've seen quite a few people try to promote it on my stream, which is not at all aimed at beginners.

(The shuffler is, of course, appropriately random, save for some "hand smoothing" in best-of-one matches to increase the chance of a "normal" draw.)

A few quotes from the thread:

How is that no matter how many people are playing the game, or how strong your deck is, or how great your skill level, I bet your winning percentage is 30% or less. This defies the laws of probability.

(No one ever seems to think the shuffler is rigged in their favor.)

As I mentioned in a prior post you never see these problems when they broadcast a live tournament.

(People who play in live tournaments are much better at deckbuilding, leading to fewer bad draws. Still, one recent major tournament was infamously decided by a player's atrocious draw in the last game of the finals.)

In the real world, land draw will not happens as frequent as every turns for 3 times or more. Or less than 2 to 3 turns, not drawing a land

(Many people coming to Arena have only played MTG as a paper game. In paper, it's very common for players to "cheat" when shuffling by sorting their initial deck in a particular way, even with innocuous intent. When people are exposed to true randomness, they often can't tolerate it.)
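
For anyone who wants to check their intuitions here, the quick simulation below (my own sketch in Python, not anything based on Arena's code) shuffles a typical 60-card, 24-land deck many times and estimates how often a truly random draw produces the streaks the quoted poster treats as impossible: several lands in a row, or several turns without a land.

```python
import random

# Rough illustration only -- this is not Arena's shuffler. We shuffle a
# 60-card deck with 24 lands, set aside a 7-card opening hand, and check the
# next 10 draw steps for "suspicious" streaks.
DECK = [True] * 24 + [False] * 36  # True = land, False = non-land
TRIALS = 100_000


def has_streak(cards, value, length):
    """Return True if `cards` contains `length` consecutive entries equal to `value`."""
    run = 0
    for card in cards:
        run = run + 1 if card == value else 0
        if run >= length:
            return True
    return False


streaky_games = 0
for _ in range(TRIALS):
    deck = DECK[:]
    random.shuffle(deck)
    draws = deck[7:17]  # the 10 draw steps after the opening hand
    if has_streak(draws, True, 3) or has_streak(draws, False, 4):
        streaky_games += 1

print(f"Games with 3+ lands in a row or 4+ non-lands in a row: "
      f"{streaky_games / TRIALS:.0%}")
```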

Other common conspiracy theories about Arena:

  • "Rigged matchmaking" (the idea that the developers somehow know which decks will be good against your deck, and ensure that you are matched up against it; again, I never see this theory in reverse)
  • "Poker hands" (the idea that people get multiple copies of a card more often than would be expected)
  • "50% bias" (the idea that the game arranges good/bad draws to keep players at a 50% win rate; admirably, these players recognize that they do draw well sometimes, but they don't understand what it means to be in the middle of a binomial distribution)
How do you evaluate whether a $500 donation to a project that you know well is a good idea?

  1. Consider cross-posting this question to the EA Forum; discussion there is more focused on giving, so you might get a broader set of answers.
  2. Another frame for this question: "How can one go about evaluating the impact of a year's worth of ~$500 donations?" If you're trying to get leverage with small donations, you might expect a VC-like set of returns, where you can't detect much impact from most donations but occasionally see a case of really obvious impact. If you spend an entire year making, say, a dozen such donations, and none of them makes a really obvious impact, that's a sign that you either aren't having much impact or don't have a good way to measure it (in either case, it's worth rethinking your giving strategy; see the rough sketch after this list).
  3. You could also try making predictions -- "I predict that X will happen if I give/don't give" -- and then following up a few months later. What you learn will depend on what you predict, but you'll at least be able to learn more about whether your donations are doing what you expect them to do.
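
A quick back-of-the-envelope version of point 2, with made-up per-donation "hit rates" purely for illustration: if each donation independently has some probability of producing really obvious impact, you can ask how surprising a year with zero obvious hits would be.

```python
# Back-of-the-envelope for point 2. The hit rates are hypothetical values
# chosen only for illustration, not estimates for any real project.
DONATIONS = 12  # roughly one ~$500 donation per month

for hit_rate in (0.05, 0.10, 0.20, 0.30):
    prob_zero_hits = (1 - hit_rate) ** DONATIONS
    print(f"per-donation hit rate {hit_rate:.0%}: "
          f"chance of zero obvious hits in a year = {prob_zero_hits:.1%}")
```

How much a quiet year should update you depends on the hit rate you expected going in.
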
EA Kansas City planning meetup, discussion & open questions

Hey there!

You might want to post this over on the Effective Altruism Forum, which is built with the same structure as LessWrong but is focused entirely on EA questions (both about ways to do good and about community-building work like that of EA KC). I'm a moderator on that forum, and I think folks over there will be happy to help with your questions about organizing a group.

Tips on how to promote effective altruism effectively? Less talk, more action.

Edit: I see that you also asked this question on r/EffectiveAltruism. I like all the links people shared on that post!

How best to grow the EA movement is a complex question that many people have been working on for a long time. There's also a lot of research on various aspects of social movement growth (though less that's EA-specific).

I don't have the bandwidth to send a lot of relevant materials now, but I'd recommend you post your question on the EA Forum (which is built for questions like this), where you're more likely to get answers from people involved in community work.

To give a brief summary of one important factor: While the basic principles of EA aren't difficult to convey persuasively, there's a big gap between "being persuaded that EA sounds like a good thing" and "making large donations to effective charities" or "changing one's career". As part of my job at the Centre for Effective Altruism, I track mentions of EA on Twitter and Reddit, and I very frequently see people cite "effective altruism" as the reason they give to (for example) their local animal shelter. EA is already something of a buzzword in the business and charitable communities, and trying to promote it to broad audiences runs the risk of the term drifting even further from its intended meaning.

...but of course, this is far from the full story.

(If you do post this to the Forum, I'll write an answer with more detail and more ideas, but I'd prefer to wait until I think my response will be seen by more people focused on EA work, so that they can correct me/add to my thoughts.)

FB/Discord Style Reacts

I don't think I've seen this point made in the discussion so far, so I'll note it here: Anonymous downvotes (without explanation) are frustrating, and I suspect that anonymous negative reacts would be even worse. It's one thing if someone downvotes a post I thought was great with no explanation -- trolls exist, maybe they just disagreed, whatever, nothing I can do but ignore it. If they leave an "unclear" react, I can't ignore that nearly as easily -- wait, which point was unclear? What are other people potentially missing that I meant to convey? Come back, anon!

(This doesn't overshadow the value of reacts, which I think would be positive on the whole, but I'd love to see Slashdot-style encouragement for people to share their reasoning.)

The Forces of Blandness and the Disagreeable Majority

The growth of lots and lots of outlets for more “unofficial” or “raw” self-expression — blogs, yes, but before that cable TV and satellite radio, and long before that, the culture of “journalism” in 18th century America where every guy with a printing press could publish a “newspaper” full of opinions and scurrilous insults  — tends to go along with more rudeness, more cursing, more sexual explicitness, more political extremism in all directions, more “trashy” or “lowest common denominator” media, more misinformation and “dumbing down”, but also some innovative/intellectual “niche” media.
Chaos is a centrifugal force; it increases the chance of any unexpected outcome. Good things, bad things, existential threats, brilliant ideas, and a lot of weird, gross, and disturbing stuff.

The idea of an "anti-chaos elite" sounds fairly accurate to me, and it shows up a lot in the work of Thaddeus Russell, who wrote a book about American elites' history of stamping out rude/chaotic behavior and runs a podcast where he interviews a wide range of people on the fringes of polite society (including libertarians, sex workers, anarchists, and weird people with no particular political affiliation). It's not perfect from an epistemic standpoint, but it's still worth a listen for anyone interested in this topic.

Does the EA community do "basic science" grants? How do I get one?

Looks like you already posted on the EA Forum, but in case anyone else spots this post and has the same question:

I'm an EA Forum moderator, and we welcome half-baked queries! Just like LessWrong, we have a "Questions" feature people can use when they want feedback/ideas from other people.
