All of Eric Raymond's Comments + Replies

Against "Context-Free Integrity"

Terminological point: I don't think you can properly describe your hypothetical rationalist in Stalinist Russia as "paranoid".  His belief that he is surrounded by what amounts to a conspiracy out to subjugate and destroy him is neither fixated nor delusional; it is quite correct, even if many of the conspiracy's members would choose to defect from it if they believed they could do so without endangering themselves.

I also note that my experience of living in the US since around 2014 has been quite similar in kind, if not yet in degree.  I pick ou... (read more)

2Ben Pace2dI had an interesting conversation with Zvi about in which societies it was easiest to figure out whether the major societal narratives were false. It seemed like there were only a few major global narratives back then, whereas today I feel like there are a lot more narratives flying around me.
2Ben Pace2dThough, my point is that, just like Moody, a person who is (correctly) constantly looking out for power-plays and traps will end up seeing many that aren’t there, because it’s a genuinely hard problem to figure out whether specific people are plotting against you.
Specializing in Problems We Don't Understand

Endorsed.  A lot of this article is strongly similar to an unfinished draft of mine about how to achieve breakthroughs on unsolved problems.

I'm not ready to publish the entire draft yet, but I will add one effective heuristic.  When tackling an unsolved problem, try to model how other people are likely to have attacked it and then avoid those approaches.  If they worked, someone else would probably have achieved success with them before you came along. 

Rationalism before the Sequences

To be fair, I haven't followed Less Wrong all that closely over the years. It's more that I've known some of the key people for a while, notably Eliezer himself and Scott Alexander.

Rationalism before the Sequences

It seems to me that you've been taking your model of predictivism from people who need to read some Kripke. In Peirce's predictivism,  to assert that a statement is meaningful is precisely to assert that you have a truth condition for it, but that doesn't mean you necessarily have the capability to test the condition.

Consider Russell's teapot.  "A teapot orbits between Earth and Mars" is a truth claim that must unambiguously have a true or false value.  There is a truth condition on it; if you build sufficiently powerful telescopes and pe... (read more)

5Eliezer Yudkowsky2dJust jaunt superquantumly to another quantum world instead of superluminally to an unobservable galaxy. What about these two physically impossible counterfactuals is less than perfectly isomorphic? Except for some mere ease of false-to-fact visualization inside a human imagination that finds it easier to track nonexistent imaginary Newtonian billiard balls than existent quantum clouds of amplitude, with the latter case, in reality, covering both unobservable galaxies distant in space and unobservable galaxies distant in phase space.
Rationalism before the Sequences

Eliezer was more influenced by probability theory, I by analytic philosophy, yes.  These variations are to be expected.  I'm reading Jaynes now and finding him quite wonderful.  I was a mathematician at one time, so that book is almost comfort food for me - part of the fun is running across old friends expressed in his slightly eccentric language.

I already had a pretty firm grasp on Feynman's "first-principles approach to reasoning" by the time I read his autobiographical stuff.  So I enjoyed the books a lot, but more along the lines of... (read more)

Rationalism before the Sequences

I have run across Bucky Fuller of course.  Often brilliant, occasionally cranky, geodesic domes turned out to suck because you can't seal all those joints well enough.  We could use more like him.

Rationalism before the Sequences

Great Mambo Chicken and Engines of Creation were in my reference list for a while, until I decided to cull the list for more direct relevance to systems of training for rationality.  It was threatening to get unmanageably long otherwise. 

I didn't know there was a biography of Korzybski.  Thanks!

Rationalism before the Sequences

 "Galaxies continue to exist after the expanding universe carries them over the horizon of observation from us" trivially unpacks to  "If we had methods to make observations outside our light cone, we would pick up the signatures of galaxies after the expanding universe has carried them over the horizon of observation defined by c."

You say "Any meaningful belief has a truth-condition".  This is exactly Peirce's 1878 insight about the meaning of truth claims, expressed in slightly different language - after all, your "truth-conditio... (read more)

I reiterate the galaxy example; saying that you could counterfactually make an observation by violating physical law is not the same as saying that something's meaning cashes out to anticipated experiences.  Consider the (exact) analogy between believing that galaxies exist after they go over the horizon, and that other quantum worlds go on existing after we decohere them away from us by observing ourselves being inside only one of them.  Predictivism is exactly the sort of ground on which some people have tried to claim that MWI isn't meaningful... (read more)

Mythic Mode

The reference to the Book of the Law was intentional.  The reference to chaos magic was not, as that concept had yet to be formulated when I wrote the essay - at least, not out where I could see it.

I myself do not use psychoactives for magical purposes; I've never found it necessary and consider them a rather blunt and chancy instrument.  I do occasionally take armodafinil for the nootropic effect, but that is very recent and long postdates the essay.

Rationalism before the Sequences

Probably, but there is something else more subtle.

Both the cultures you're pointing at are, essentially, engines to support achieving right mindset. It's not quite the same right mindset, but in either case you have to detach from "normal" thinking and its unquestioned assumptions in order to be efficient at the task around which the culture is focused.

Thus, in both cultures there's a kind of implicit mysticism.  If you recoil from that word because you associate it with anti-rationality I can't really blame you, but I ask you to consider the idea of m... (read more)

Rationalism before the Sequences

I think a collection of examples and analysis would be a post in itself.

But I can give you one suggestive example from Twelve Virtues itself: "If you speak overmuch of the Way you will not attain it."

It is a Zen idea that the essence of enlightenment cannot be discovered by talking about enlightenment; rather one must put one's mind in the state where enlightenment is.  Moreover, talk and chatter - even about Zen itself - drives that state away.

Eliezer is trying to say here that the center of rationalist practice is not in what you know about rati... (read more)

Rationalism before the Sequences

I actually wouldn't call Zen a "central theme".  More "a recurring rhetorical device".  It's not Zen Buddhist content that the Sequences use, it's the emulation of Zen rhetoric as a device to subtly shift the reader's mental stance. 

5gilch17dNot being an expert in Zen, I'm not sure what "Zen rhetoric" means. Could you provide examples quoted from the Sequences of what you are talking about and what makes it "Zen"?
2adamzerner17dI see. Thanks for clarifying.
Eric Raymond's Shortform

I described myself as a subject-matter expert in epistemology.  That means I'm familiar with the branch of philosophy that considers the maintenance and justification of knowledge, and considers different theories of same.

Since you're using the name 'metatroll', I think I'll leave it at that. 

Eric Raymond's Shortform

I know who Deutsch is, and I'd never even heard that he had a movement around him.

Which is relevant.  I've had my ear to the ground for interesting rationality training since, oh, 1975 or so, and I definitely run in the right circles to pick up rumors of stuff like this.  The fact that your report is my first sign of that crew is from my POV pretty good evidence that its impact was very, very low.

I also question some of your other premises.  Speaking as a person who approaches the Yudkowskian reform from a perspective formed by a previous r... (read more)

1metatroll17dHow do you know that you're good at knowing?
Rationalism before the Sequences

There's a technical problem.  My blog is currently frozen due to a stuck database server; I'm trying to rehost it.  But I agree to your plan in principle and will discuss it with you when the blog is back up.

3Ben Pace18dSounds good.
Rationalism before the Sequences

Heh. Come to think of it from that angle, "a bit true, but not really" would have been exactly my assessment if I were in your shoes. Thanks, I appreciate the nuanced judgment. 

Eric Raymond's Shortform

Since you've mentioned Rootless Root, I will say that there is another essay I am now thinking of writing about the playful use of Zen tropes.  The rationalist community and the hacker culture both have strong traditions of this sort of play...but, the functional reasons for the tradition are not the same!  And the way they differ is interesting.

That's enough of a teaser for now. :-)

Rationalism before the Sequences

I don't really have an interesting answer, I'm afraid. Busy life, lots of other things to pay attention to, never got around to it before.

Now that I've got the idea, I may re-post some rationality-adjacent stuff from my personal blog here so the LW crowd can know it exists.

The way I have set this up for writers in the past has been to set up crossposting from an RSS feed under a tag (e.g. crossposting all posts tagged 'lesswrong').

I spent a minute trying and failed to figure out how to make an RSS feed from your blog under a single category. But if you have such an RSS feed, and you make a category like 'lesswrong', then I'll set up a simple crosspost, and hopefully save you a little time in expectation. This will work if you add the category to old posts as well as new ones.

4David Hornbein18dPlease do.
Mythic Mode

Author of "Dancing with the Gods" checks in.

First, to confirm that you have correctly understood the points I was trying to make. I intended "Dancing with the Gods" to be a rationalist essay, in the strictest Yudkowskian-reformation sense of the term "rationalist", even though the beginnings of the reformation were seven years in the future when I wrote it. 

<insert timeless-decision-theory joke here>

Second, that I 100% agree with your analysis of why "Meditations on Moloch" was important.

Third and most importantly, to say that I like your use of... (read more)

Rationalism before the Sequences

You have an outside view of my writing, so I'm curious. On a scale of 0 = "But of course" to 5 = "Wow, that was out of left field", how surprising did you find it that I would write this essay?

If you can find anything more specific to say along these lines (why it's surprising/unsurprising) I would find that interesting.

3madasario9d0 or 1. I saw this post and thought "finally!" I've been a fan since the early 90's. I'm most surprised that it took you this long, and excited that you finally got around to it. :) The ratsphere is ripe for some of the same treatment you gave the fossphere back in the day. (It's under attack by forces of darkness; its adherents tend to be timid and poorly funded while its attackers are loud, charismatic, and throw a lot of money around; it revolves around a few centers of gravity ("projects") that are fundamental building blocks of the future - the Big Problems; etc.) I haven't thought this through a ton, but if I squint a bit I can see Jaynes &etc filling the role of, like, Knuth and K&R and etc - hard engineering; and The Sequences/LW/SSC filling the role of, say, GNU and Lions and etc - a way for the masses to participate and contribute and absorb knowledge and gel into a tribe and a movement. I paint that vague hand-wavy picture for you, hoping you'll understand when I say that this post feels like it should be expanded and become TAOUP but for the ratsphere.
3Zian12d3 My knowledge before reading the article and comments could be summarized as : * These are some really great articles by ESR. I wonder why no one had taken them super seriously yet... * somewhat of an outsider perspective as FeepingCreature described * I wonder why some people have such strong opinions about this person
2philh15dFor me, like 1 maybe 2? (That you would write it; it's a little more surprising that you did.) I knew you'd read at least some of the sequences because I think I first found them through you, and I think you've called yourself a "fellow traveler". Oh, and I remember you liked HPMOR. But I didn't know if you were particularly aware of the community here.
6Alexei18d5 for me. I read Dancing with Gods a long time ago and it’s very memorable. But had no idea about anything else.
2lincolnquirk18dHmm, maybe a 2. I didn’t know you had read the Sequences, but it seems like the sort of thing that would appeal to you based on the writing in Dancing, etc.

I was slightly surprised, mostly because I would have thought that if you'd known about LW for a while, you'd end up contributing either early or not at all. Curious what caused it to happen in 2021 in particular.

Rationalism before the Sequences

Ironically, I disagree a bit with lukeprog here - one of the few flaws I think I detect in the Sequences is due to Eliezer not having read enough philosophy.  He does arrive at a predictivist theory of confirmation eventually, but it takes more effort and gear-grinding than it would have if he had understood Peirce's 1878  demonstration and expressed it in clearer language.

Ah well.  It's a minor flaw.

5Dagon18dJust a note of thanks, for the essay (which I skimmed, and will read more thoroughly when I have more time), but more for all of your writing (and direct activity) regarding hacker culture. I hadn't really made the connection in my mind between the different domains of rational/skeptical/hacker thought until this - I'm between you and Eliezer in age, and have considered myself a hacker since the mid-80s, having read a different subset of historical thought - light on philosophy, very heavy on the SF that everyone references, but also Knuth and Hofstadter and Dijkstra which mixed philosophy of thinking with rigor of procedural execution. Anyway, thanks for this! And for any other readers who aren't familiar with your work, check out the Rootless Root at http://www.catb.org/~esr/writings/unix-koans/ [http://www.catb.org/~esr/writings/unix-koans/] .
Eric Raymond's Shortform

Alas, I can't give you a sweeping history of a bunch of movements and factions.  The last group really comparable to today's rationalist movement was the community around Alfred Korzybski's General Semantics.  My essay will talk about them.

What is now being mulled over by my beta readers is somewhat more personal and depends on the premise that my experience was representative of a lot of 20th-century proto-rationalists, including in particular Eliezer.  Fortunately I don't have to handwave this; there's reasonably good evidence that it's true, some of which is indicated in the essay itself.

1TAG20dIt depends on how you define This Sort of Thing, or rationalist/sceptic movements in general. If you use a definition along the lines of:- • Being science-orientated, but having much more specific claims than "science good" • Being largely outside of mainstream academia etc • Being an insular group that mostly talk to each other • Having difficulty in communicating with outsiders, in any case, because their own theories are expressed in a novel jargon. • Centering on a charismatic leader, with a set of mandatory writings • Having an immodest epistemology..which claims to be able to solve just about any problem.. • ..which is based on a small number of Weird Tricks. ...then you would need to include David Deutsch's followers as well..the Fabric of Reality was published in the 90s. And this iteration of rationality borrows Deutsch's argument for the many-worlds interpretation, just as it borrows Korzybski's map-territory distinction.
Eric Raymond's Shortform

I have a draft I'm fairly pleased with.  Has gone out to some beta readers. 

Eric Raymond's Shortform

Now I'm laughing, because looking through those explicit lists I am finding pretty much all of the two dozen or so sources I expected to find based on various hints and callbacks. Almost all of them books very familiar to me as well.

Yes, this essay is going to be fun to write.

Eric Raymond's Shortform

Broader history, focusing on certain important developments in the 20th century.

Eric Raymond's Shortform

I actually have not seen such a bibliography, though I could infer a lot from his language choices in essays like Twelve Virtues.  Can you share a pointer to his list of forerunners?  

I don't expect there is much on it that will surprise me, but I would very much like to read it nevertheless.

9Ben Pace22dHere’s a few quick links where Eliezer talks about books that influenced him, or makes book recommendations. This isn’t a precise response to the initial prompt of ‘forerunners’, just what turned up when I searched for Eliezer talking about books he’s read and recommends. * For rationality reading, he lists some books in a comment here [https://www.lesswrong.com/posts/RiQYixgCdvd8eWsjg/recommended-rationalist-reading?commentId=x2RBYhYC3PZKKAJXD] . Standout books were E.T. Jaynes’ “Probability Theory: The Logic of Science”, Judea Pearl’s “Probabilistic Reasoning in Intelligent Systems”, and the collected papers of Kahneman & Tversky (and colleagues). He further mentions his feelings on Jaynes here [https://www.lesswrong.com/posts/kXSETKZ3X9oidMozA/the-level-above-mine]. * Eliezer has a (pretty old) bookshelf here [http://web.archive.org/web/20200217172048/https://yudkowsky.net/obsolete/bookshelf.html] of books that had a fair bit of impact on him (which he wrote when he was 20, so a whole decade before LW). He also lists “Good Idealistic Books [https://www.lesswrong.com/posts/YdXMZX5HbZTvvNy84/good-idealistic-books-are-rare] ” that he read in his youth. * Eliezer has talked about rationalist fiction he has read in the past here [https://www.lesswrong.com/posts/q79vYjHAE9KHcAjSs/rationalist-fiction]. In this post [https://www.lesswrong.com/posts/YicoiQurNBxSp7a65/is-clickbait-destroying-our-general-intelligence] he mentions that he was heavily influenced by his parents’ collection of old science fiction. Recently on Twitter he listed more [https://twitter.com/EpistemicHope/status/1367352081095348227].
Open, Free, Safe: Choose Two

I agree that the distinction you pose is important. Or should be.  I remember when we could rely on it more than we can today.

Unfortunately, one of the tactics of people gaming against freedom is to deliberately expand the definition of "interpersonal attack" in order to suppress ideas they dislike. We have reached the point where, for example: 

  1. The use/mention distinction with respect to certain taboo words is deliberately ignored, so that a mention is deliberately conflated with use and use is deliberately conflated with attack.
  2. Posting a link to
... (read more)
3FireStormOOO8dI've noticed this in consistently good moderation that resists this kind of trolling/power game: Making drama for the sake of it, even with a pretense, is usually regarded as a more severe infraction than any rudeness or personal attack in the first place. Creating extra work for the moderation team is frowned upon (don't feed the trolls). Punish every escalation and provocation, not just the first in the thread. Escalating conflicts and starting flamewars is seen as more toxic than any specific mildly/moderately offensive post. Starting stuff repeatedly, especially with multiple different people, is a fast ticket to a permaban. Anyone consistently and obviously lowering the quality of discussions needs to be removed ASAP.
1Archimedes19dAs long as people are dishonestly gaming the system, there will always be problems and there is no silver bullet solution. It's a fundamentally hard problem of balancing competing values. Any model proposed will have failings. The best we can do is to try to balance competing values appropriately for each individual platform. Each one will have different tilts but I doubt rejecting safety entirely is likely to be a good idea in most cases. It's often tempting to idealize one particular value or another but when any particular value is taken to an extreme, the others suffer greatly. If you can back away from a pure ideal in any dimension, then the overall result tends to be more functional and robust, though never perfect.
Eric Raymond's Shortform

I'm considering writing, as a first post, a reflection on "Rationality Before The Sequences": some history on what the public project of less-wrongness looked like before Eliezer's heroic attempt at systematization.

This is a probe to discover if there would be significant interest in such an essay.

4Viliam21dThis has a potential to be extremely cool. Remembering that the ideology is not the movement [https://slatestarcodex.com/2016/04/04/the-ideology-is-not-the-movement/], I would be interested in reading a history of both the ideas and the groups of people that formed around them. Especially, where are those parallel branches today, what have they achieved, what are they currently working on. What is the reference group for the rationalist community, and what is their typical outcome? (My current guess is: "a strong personality becomes popular, gets many fans and publishes many books, coins one or two generally known idioms... then dies, and the fans keep talking about how awesome those books were, and that's all.")

I remember David Stove's What Is Wrong With Our Thoughts (1991) being discussed on early-LW, and being influenced by it. I don't really know whether this was an outlier-unusually-good essay, or the tip of an iceberg of good pre-LW less-wrongness.

6Ben Pace22dI like getting a better understanding of our historical context, and I sort of expect you’d point me to some pretty interesting things I’ve not heard of before and could go read. And could convey to me a sense of times you know about better than I. So I’d be interested. I don’t know if you mean describing how the world looked in 2006, or if you’re thinking of a broader history of the effort. I personally like to think of us as continuing the work laid out by Francis Bacon [https://www.lesswrong.com/posts/MYNNvgxSYmK7f4JBt/novum-organum-preface-1] and the Royal Society from 1660, and taking much of the spirit from great scientists along the way like Feynman and such.
2philh22dThis is an indication of interest in such.
1antanaclasis22dIt would definitely be neat to read a history of that sort. Having myself not read many of the books that Eliezer references as forerunners, that area of history is one that I at least would like to learn more about.
Open, Free, Safe: Choose Two

I agree with the reasoning in this essay.

Taken a bit further, however, it explains why valuing "safety" is extremely dangerous - so dangerous that, in fact, online communities should consciously reject it as a goal.

The problem is that when you make "safety" a goal, you run a very high risk of handing control of your community to the loudest and most insistent performers of offendedness and indignation. 

This failure mode might be manageable if the erosion of freedom by safetyism were still merely an accidental and universally regretted effect of trying... (read more)

3Archimedes1moI think there is an important distinction between being "safe" from ideas and "safe" from interpersonal attacks. In an online space, I expect moderation to control more for the latter than the former, protecting not against wrongspeech so much as various forms of harassment. Rational discourse is rarely possible in an environment that is not protected by moderation or at least shared norms of appropriate communication (this protection tends to be correlated with "openness"). Having free speech on a particular platform is rarely useful if it's drowned out by toxicity. I support strong freedom of ideas (within the bounds of topicality) but when the expression of those ideas is in bad faith, there is no great value in protecting that particular form of expression. There is a hypothesis that unconstrained speech and debate will lead to wrong concepts fading away and the less wrong concepts rising to more common acceptance, but internet history suggests that this ideal can't last for long in a truly free space unless that freedom is never actually tested. As soon as bad actors are involved, you either have to restrict freedom or else experience a degradation in discourse (or both). If safety is not considered, then a platform effectively operates at the level of its worst users.
8jaspax1moThis is kind of true, but taken seriously it only leaves "freedom" as an achievable goal, which I don't think is right. I didn't say much about it because it seems to me that this kind of weaponized safety is not a general feature of online communities, but rather a feature particular to the present moment, and the correct solution is on the openness axis: don't let safetyists into your community, and kick them out quickly once they show their colors. Also, the support for "safety" among these people is more on the level of slogan than actual practice. My experience is that groups which place a high priority on this version of "safety" are in fact riven with drama and strife. If you prioritise actual safety and not just the slogan, you'll find you still have to kick out the people who constantly hide behind their version of "safety".
In defence of epistemic modesty

I think this is utterly horrible advice.

I have blogged a detailed response at Against modesty, and for the Fischer set.

6Thrasymachus3ySorry you disliked the post so much. But you might have liked it more had you looked at the bit titled 'community benefits to immodesty', where I talk about the benefits of people striking out outside expert consensus (but even if they should act 'as if' their contra-expert take was correct, they should nonetheless defer to it for 'all things considered' views).