You might want to post this over on the Effective Altruism Forum, which is built on the same platform as LessWrong but is focused entirely on EA questions (both about ways to do good and about community-building work like that of EA KC). I'm a moderator on that forum, and I think folks over there will be happy to help with your questions about organizing a group.
Edit: I see that you also asked this question on r/EffectiveAltruism. I like all the links people shared on that post!
How best to grow the EA movement is a complex question that many people have been working on for a long time. There's also a lot of research on various aspects of social movement growth (though less that's EA-specific).
I don't have the bandwidth to share a lot of relevant material right now, but I'd recommend you post your question on the EA Forum (which is built for questions like this), where you're more likely to get answers from people involved in community work.
To give a brief summary of one important factor: while the basic principles of EA aren't difficult to convey persuasively, there's a big gap between "being persuaded that EA sounds like a good thing" and "making large donations to effective charities" or "changing one's career". As part of my job at the Centre for Effective Altruism, I track mentions of EA on Twitter and Reddit, and I frequently see people cite "effective altruism" as the reason they give to (for example) their local animal shelter. EA is already something of a buzzword in the business and charitable communities, and promoting it to broad audiences risks letting the term drift even further from its intended meaning.
...but of course, this is far from the full story.
(If you do post this to the Forum, I'll write an answer with more detail and more ideas, but I'd prefer to wait until I think my response will be seen by more people focused on EA work, so that they can correct me/add to my thoughts.)
I don't think I've seen this point made in the discussion so far, so I'll note it here: Anonymous downvotes (without explanation) are frustrating, and I suspect that anonymous negative reacts would be even worse. It's one thing if someone downvotes a post I thought was great with no explanation -- trolls exist, maybe they just disagreed, whatever, nothing I can do but ignore it. If they leave an "unclear" react, I can't ignore that nearly as easily -- wait, which point was unclear? What are other people potentially missing that I meant to convey? Come back, anon!
(This doesn't overshadow the value of reacts, which I think would be positive on the whole, but I'd love to see Slashdot-style encouragement for people to share their reasoning.)
The growth of lots and lots of outlets for more "unofficial" or "raw" self-expression (blogs, yes, but before that cable TV and satellite radio, and long before that, the culture of "journalism" in 18th-century America, where every guy with a printing press could publish a "newspaper" full of opinions and scurrilous insults) tends to go along with more rudeness, more cursing, more sexual explicitness, and more political extremism in all directions. It also brings more "trashy" or "lowest common denominator" media and more misinformation and "dumbing down", but also some innovative, intellectual "niche" media.
Chaos is a centrifugal force; it increases the chance of any unexpected outcome. Good things, bad things, existential threats, brilliant ideas, and a lot of weird, gross, and disturbing stuff.
The idea of an "anti-chaos elite" sounds fairly accurate to me, and it shows up a lot in the work of Thaddeus Russell, who wrote a book about American elites' history of stamping out rude/chaotic behavior and runs a podcast where he interviews a wide range of people on the fringes of polite society (including libertarians, sex workers, anarchists, and weird people with no particular political affiliation). It's not perfect from an epistemic standpoint, but it's still worth a listen for anyone interested in this topic.
Looks like you already posted on the EA Forum, but in case anyone else spots this post and has the same question:
I'm an EA Forum moderator, and we welcome half-baked queries! Just like LessWrong, we have a "Questions" feature people can use when they want feedback/ideas from other people.
Testing comment retraction and editing.
I have taken the survey.
Comment: "90% of humanity" seems a little high for "minimum viable existential risk". I'd think that 75% or so would likely be enough to stop us from getting back out of the hole (though the nature of the destruction could make a major difference here).
I took part in the Good Judgment Project, a giant forecasting study from Philip Tetlock (of "fox vs. hedgehog" fame). I also blogged about my results, and the heuristics I used to make bets:
I thought it might be of interest to a few people -- I originally learned that I could join the GJP from someone I met at CFAR.
I wrote a pair of essays (and a shorter summary of both) on heroic responsibility, and how it could serve as a strong counterpart to empathy as a one-two punch for making good moral decisions:
Seemed LessWrong-ish, though my "heroic responsibility" is written for a different audience than Eliezer's, and is a bit less harsh/powerful as a result.
This is the best article on EA and religion that I've seen so far; it uses selective Bible quotes to make its points:
Of course, you can use selective Bible quotes to make nearly any point, so this probably won't work if framed as a counterargument. Perhaps you can just show it to your cofounders and ask what they think, as the beginning of a discussion about what God might want or what Christians owe to non-Christians.
But I second MattG's advice that leaving is probably advisable, particularly if the above goes nowhere.