Eliezer Yudkowsky




My experience at and around MIRI and CFAR (inspired by Zoe Curzi's writeup of experiences at Leverage)

My autobiographical episodic memory is nowhere near good enough to answer this question, alas.


I could be very wrong, but the story I currently have about this myself is that Vassar himself was a different and saner person before he used too many psychedelics. :( :( :(


Okay, sure.  If what Scott says is true - and it matches my recollections of things I heard earlier, though I can attest to very little of it from direct observation - then it seems like this post was written with knowledge of things that would make the overall story arc it showed look very different, and those things were deliberately omitted.  This is more manipulation than I myself would personally consider okay to use in a situation like this one, though I am ever mindful of Automatic Norms and the privilege of being more verbally facile than others in choosing which facts I can include while still making my own points.


By way of narrowing down this sense, which I think I share, if it's the same sense: leaving out the information from Scott's comment about a MIRI-opposed person who is advocating psychedelic use and causing psychotic breaks in people - in particular, this person describes MIRI's attempts to have any internal info compartments as a terrible dark symptom of greater social control that you need to jailbreak away from using psychedelics, and then those people have psychotic breaks - leaving out this info seems to be not something you'd do in a neutrally intended post written from a place of grave concern about community dynamics.  It's taking the Leverage affair and trying to use it to make a point, and only including the info that would make that point, and leaving out info that would distract from that point.  And I'm not going to posture like that's terribly bad inhuman behavior, but we can see it and it's okay to admit to ourselves that we see it.

And it's also okay for somebody to think that the original Leverage affair needed to be discussed on its own terms, and not be carefully reframed in exactly the right way to make a point about a higher-profile group the author wanted to discuss instead; or to think that Leverage did a clearly bad thing, and we need to have norms against that clearly bad thing and finish up on making those norms before it's proper for anyone to reframe the issue as really being about a less clear bad thing somewhere higher-profile; and then this post is going against that and it's okay for them to be unhappy about that part.


I'm about ready to propose a group norm against having any subgroups or leaders who tell other people they should take psychedelics.  Maybe psychedelics have individually motivated uses - though I get the impression that this is, at best, a high-variance bet with significantly negative expectation.  But the track record of "rationalist-adjacent" subgroups that push the practice internally, and of would-be leaders who suggest to other people that they take them, seems just way too bad.

I'm also about ready to propose a similar no-such-group policy on 'woo': tarot-reading, supernaturalism ("only oh no, it's not really supernaturalism, I'm just doing tarot readings as a way to help myself think"), etc.  I still think it's not our community's business to try to socially prohibit things like that on an individual level by exiling individuals like that from parties; I don't think we have, or should have, that kind of power over individual behaviors that neither pick pockets nor break legs.  But I think that when there's anything like a subgroup or a leader with those properties we need to be ready to say, "Yeah, that's not a group in good standing with the rest of us, don't go there."  This proposal is not mainly based on the advance theories by which you might suspect or guess that subgroups like that would end badly; it is motivated mainly by my sense of what the actual outcomes have been.

Since implicit subtext can also sometimes be bad for us in social situations, I should be explicit that concern about outcomes of psychedelic advocacy includes Michael Vassar, and concern on woo includes the alleged/reported events at Leverage.


I affirm the correctness of Ben Pace's anecdote about what he recently heard someone tell me.

"How dare you think that you're better at meta-rationality than Eliezer Yudkowsky, do you think you're special" - is somebody trolling?  Have they never read anything I've written in my entire life?  Do they have no sense, even, of irony?  Yeah, sure, it's harder to be better at some things than me, sure, somebody might be skeptical about that, but then you ask for evidence or say "Good luck proving that to us all eventually!"  You don't be like, "Do you think you're special?"  What kind of bystander-killing argumentative superweapon is that?  What else would it prove?

I really don't know how I could make this any clearer.  I wrote a small book whose second half was about not doing exactly this.  I am left with a sense that I really went to some lengths to prevent this - I did what society demands of a person plus over 10,000% (most people never write any extended arguments against bad epistemology at all, and society doesn't hold that against them); I was not subtle.  At some point I have to acknowledge that other human beings are their own people and I cannot control everything they do - and I hope that others will also acknowledge that I cannot avert all the wrong thoughts that other people think, even if I try, because I sure did try.  A lot.  Over many years.  Aimed at that specific exact way of thinking.  People have their own wills; they are not my puppets, and they are still not my puppets even if they have read some blog posts of mine or heard summaries from somebody else who once did.  I have put in at least one hundred times the amount of effort that would be required, if any effort were required at all, to wash my hands of this way of thinking.


Secrecy is not about good trustworthy people who get to have all the secrets versus bad untrustworthy people who don't get any.  This frame may itself be part of the problem; a frame like that makes it incredibly socially difficult to implement standard practices.

Covid 10/7: Steady as She Goes

Well, huh.  I wonder if that makes it time to go look at RadVac some more.
