This is a cross-post from Putanumonit.com


It seems that most people haven’t had much trouble making up their minds about Jordan Peterson.

The psycho-philosophizing YouTube prophet rose to prominence for refusing to acquiesce to Bill C-16, a Canadian law mandating the use of preferred pronouns for transgender people. The sort of liberal who thinks that this law is a great idea slapped the alt-right transphobe label on Peterson and has been tweeting about how no one should listen to Peterson about anything. The sort of conservative who thinks that C-16 is the end of Western Civilization hailed Peterson as a heroic anti-PC crusader and has been breathlessly retweeting everything he says, with the implied #BooOutgroup.

As the sort of rationalist who googles laws before reacting to them, I assured myself that Peterson got the legal facts wrong: no one is actually getting dragged to jail for refusing to say zir. I’m going to use people’s preferred pronouns regardless, but I’m happy I get to keep doing it in the name of libertarianism and not-being-a-dick, rather than because of state coercion.

With that, I decided to ignore Peterson and go back to my media diet of rationalist blogs, Sam Harris, and EconTalk.

But Jordan Peterson turned out to be a very difficult man to ignore. He showed up on Sam Harris’ podcast, and on EconTalk, and on Joe Rogan and Art of Manliness and James Altucher. He wrote 12 Rules for Life: An Antidote to Chaos, a self-help listicle book inspired by Jesus, Nietzsche, Jung, and Dostoyevsky. [Let's see if I can tie all 12 rules to this essay] And he got rationalists talking about him, which I’ve done for several hours now. As a community, we haven’t quite figured out what to make of him.

Peterson is a social conservative, a Christian who reads truth in the Bible and claims that atheists don’t exist, and a man who sees existence at every level as a conflict between good and evil. The majority of the rationalist community (present company included) are socially liberal and trans-friendly, confident about our atheism, and mistake theorists who see bad equilibria more often than intentional malevolence.

But the most salient aspect of Peterson isn’t his conservatism, or his Christianity, or his Manichaeism. It’s his commitment, above all else, to seek the truth and to speak it. [Rule 8: Tell the truth – or, at least, don’t lie] Rationalists can forgive a lot of an honest man, and Peterson shoots straighter than a laser gun.

Peterson loves to talk about heroic narratives, and his own life in the past few months reads like a movie plot, albeit more Kung Fu Panda than Passion of the Christ. Peterson spent decades assembling a worldview that integrates everything from neurology to Deuteronomy, one that’s complex, self-consistent, and close enough to the truth to withstand collision with reality. It’s also light-years and meta-levels away from the sort of simplistic frameworks offered by the mass media on either the right or the left.

When the C-16 controversy broke, said media assumed that Peterson would meekly play out the role of outgroup strawman, and were utterly steamrolled. A lot of the discussion about the linked interview has to do with rhetoric and argument, but to me, it showcased something else. A coherent worldview like that is a powerful and beautiful weapon in the hands of the person who is committed to it.

But it wasn’t the charismatic performances that convinced me of Peterson’s honesty. It was clips like this one, where he was asked about gay marriage.

Most people are for or against gay marriage based on their object level feeling about gays, and their tribal affiliation. The blue tribe supports gay marriage, opposes first-cousin marriage, and thinks that the government should force a cake shop to bake a gay wedding cake because homophobia is bad. The red tribe merely flips the sign on all of those.

Some people go a meta-level up: I support gay marriage, support cousin marriage, and support bakers getting to decide themselves which cakes they bake for reasons of personal freedom [Rule 11: don’t bother children when they are skateboarding], and the ready availability of both genetic testing clinics and gay-friendly bakeries.

But to Peterson, everything is a super-meta-level tradeoff that has the power to send all of Western Civilization down the path to heaven or hell:

With regards to gay marriage specifically, that’s a really tough one for me. I can imagine… [long pause] I can’t do anything other than speak platitudes about it I suppose, unfortunately.
If the marital vows are taken seriously, then it seems to me it’s a means by which gay people can be integrated more thoroughly into standard society, and that’s probably a good thing. And maybe that would decrease promiscuity which is a public health problem, although obviously that’s not limited to gay people. Gay men tend to be more promiscuous than average, probably because there are no women to bind them with regards to their sexual activity. […]
I’m in favor of extending the bounds of traditional relationships to people who wouldn’t be involved in a traditional long-term relationship otherwise, but I’m concerned about the undermining of traditional modes of being including marriage [which has always been about] raising children in a stable and optimal environment.

Few people besides Peterson himself can even fully understand his argument, let alone endorse it. And yet he can’t stop himself from actually trying to figure out what his worldview says about gay marriage, and from saying it with no reservations.

I think that Peterson overgeneralizes about gay men (and what about lesbians?), and he’s wrong about the impact of gay marriage on society on the object level. I’m also quite a fan of promiscuity, and I think it’s stupid to oppose a policy just because “neo-Marxists” support it.

But I don’t doubt Peterson’s integrity, which means that I could learn something from him. [Rule 9: assume that the person you are listening to might know something you don’t]. 

So, what can Jordan Peterson teach rationalists?

In 12 Rules, Peterson claims that eating a large, low-carb breakfast helps overcome depression and anxiety. Is this claim true?

There’s a technical sort of truth (and here “technical” is itself a synonym for “true”) that’s discoverable using the following hierarchy of methods: opinion -> observation -> case report -> experiment -> RCT -> meta-analysis -> Scott Alexander “much more than you wanted to know” article. If you ask Scott whether a low-carb breakfast reduces anxiety he’ll probably say that there isn’t a significant effect, and that’s the technical truth of the matter.

So why does Peterson believe the opposite? He’s statistically literate… for a psychologist. He references a couple of studies about the connection between insulin and stress, although I’d wager he wouldn’t lose much sleep if one of them failed to replicate. It probably also helps that Gary Taubes is really playing the part of the anti-establishment truth-crusader. Ultimately, Peterson is answering a different question: if a patient comes to your psychiatry clinic complaining about mild anxiety, should you tell them to eat bacon and eggs for breakfast?

My rationalist steelman of Peterson would say something like this: maybe the patient has leaky gut syndrome that contributes to their anxiety, and reducing gluten intake would help. If not, maybe the link between insulin and cortisol will turn out to be real and meaningful. If not, maybe having a morning routine that requires a bit of effort (it’s harder to make eggs than eat a chocolate bar, but not too hard) will bring some needed structure to the patient’s life. If not, maybe getting any advice whatsoever from a serious looking psychologist would make the patient feel that they are being listened to, and that will placebo their anxiety by itself. And if not, then no harm was done and you can try something else.

But, Peterson would add, you can’t tell the patient all of that. You won’t help them by explaining leaky guts and p-values and placebo effects. They need to believe that their lives have fallen into chaos, and making breakfast is akin to slaying the dragon-goddess Tiamat and laying the foundation for stable order that creates heaven on Earth. This is metaphorical truth.

If you’re a rationalist, you probably prefer your truths not to be so… metaphorical. But it’s a silly sort of rationalist who gets sidetracked by arguments about definitions. If you don’t like using the same word to mean different things [Rule 10: be precise in your speech], you can say “useful” or “adaptive” or “meaningful” instead of “true”. It’s important to use words well, but it’s also important to eat a good breakfast. Probably. [Rule 2: treat yourself like you would someone you are responsible for helping]

One of the most underrated recent ideas in rationality is the idea of fake frameworks. I understand it thus: if you want to understand how lasers work, you should really use quantum physics as your framework. But if you want to understand how a cocktail party works, looking at quarks won’t get you far. You can use the Hansonian framework of signaling, or the sociological framework of class and status, or the psychometric framework of introverts and extroverts, etc.

All of those frameworks are fake in the sense that introvert isn’t a basic physical entity the same way an up quark is. Those frameworks are layers of interpretation that you impose on what you directly experience, which is human-shaped figures walking around, making noises with their mouths and sipping gin & tonics. You can’t avoid imposing interpretations, so you should gather a diverse toolbox of frameworks and use them consciously even when you know they’re not 100% true.

Here’s a visual example:

[Image: two maps of the same area, an aerial photo on the left and a street map on the right]

Q: Which map is more true to the territory?

A: Neither. But if your goal is to meet Einstein on his way to work you use the one on the right, and if your goal is to count the trees on the golf course you use the one on the left.

By the way, there’s a decent chance that “fake frameworks” is what the post-rationalists have been trying to explain to me all along, except they were kind of rude about it. If it’s true that they had the same message, it took Valentine to get it through my skull because he’s an excellent teacher, and also someone I personally like. Liking shouldn’t matter to rationalists, but somehow it always seems to matter to humans. [Rule 5: do not let your children do anything that makes you dislike them]

That’s what Jordan Peterson is: a fake framework. He’s a mask you can put on, a mask that changes how you see the world and how you see yourself in the mirror. Putting on the Jordan Peterson mask adds two crucial elements that rationalists often struggle with: motivation and meaning.

The Secular Solstice is a celebration designed by rationalists to sing songs together and talk about meaning. [Rule 3: make friends with people who want the best for you] The first time I attended, the core theme was the story of Stonehenge. Once upon a time, humans lived in terror of the shortening of the days each autumn. But we built Stonehenge to mark the winter solstice and predict when spring would come – a first step towards conquering the cold and dark.

But how did Stonehenge get built?

First, the tribe had a Scott Alexander. Neolithic Scott listened to the shamans speak of the Sun God, and demanded to see their p-values. He patiently counted the days between the solstices of each year and drew arrows pointing to the exact direction the sun rose each day.

Finally, Scott spoke up:

Hey guys, I don’t think that the sun is a god who cares about dancing and goat sacrifice. I think it just moves around in a 365-day period, and when it rises from the direction of that tree, that’s when the days start getting longer again.

And the tribe told him that it’s all much more than they wanted to know about the sun.

But Scott only gets us halfway to Stonehenge. The monument itself was built over several centuries, using 25-ton rocks that were brought to the site from 140 miles away. The people who hauled the first rock had to realize (unless subject to extreme planning fallacy) that not a single person they knew, nor their children or grandchildren, would see the monument completed. Yet these people hauled the rocks anyway, and that required a neolithic Peterson to inspire them.

Peterson is very popular with the sort of young people who have been told all their lives to be happy and proud of just who they are. But when you’re 19, short on money, shorter on status, and starting to realize you won’t be a billionaire rock star, you don’t see a lot to be satisfied with. Lacking anything to be proud of individually, these young people are tempted to substitute a broader group identity for their self. What the identity groups mostly do is complain that the world is unfair to them; this keeps the movement going but doesn’t do much to alleviate the frustration of its members.

And then Peterson tells them to lift the heaviest rock they can and carry it. Will it ease their suffering? No. Everyone is suffering, but at least they can carve meaning out of that. And if enough people listen to that message century after century, we get Stonehenge. [Rule 7: pursue what is meaningful, not what is expedient]

A new expansion just came out for the Civilization 6 video game, and instead of playing it I’m nine hours into writing this post and barely halfway done. I hope I’m not the only one getting some meaning out of this thing.

It’s not easy to tell a story that inspires a whole tribe to move 25-ton rocks. Peterson noticed that the Bible is one story that has been doing that for a good while. Eliezer noticed it too, and he was not happy about it, so he wrote his own tribe-inspiring work of fiction. I’ve read both, cover to cover. And although I found HPMoR more fun to read, I end up quoting from the Old Testament a lot more often when I have a point to make.

“Back in the old days, saying that the local religion is a work of fiction would have gotten you burned at the stake,” Eliezer replies. Well, today quoting research on psychology gets you fired from Google, and quoting research on climate change gets you fired from the EPA. Eppur si muove.

Jews wrote down commentaries suggesting that the story of Jonah is metaphorical a millennium before Galileo was born, and yet they considered themselves the People of the Book. The Peterson mask reminds us that people don’t have to take a story literally to take it seriously.

Peterson loves to tell the story of Cain and Abel. Humans discovered sacrifice: you can give away something today to get something better tomorrow. “Tomorrow” needs a face, so we call it “God” and act out a literal sacrifice to God to hammer the point home for the kids.

But sometimes, the sacrifice doesn’t work out. You give, but you don’t get, and you are driven to resentment and rage against the system. That’s what Cain does, and the story tells us that it’s the wrong move – you should ponder instead how to make a better sacrifice next time.

When I was younger, I went to the gym twice a week for a whole year. After a year I didn’t look any sexier, I didn’t get much stronger, and I was sore a lot. So I said fuck it and stopped. Now I started going to the gym twice a week again, but I also started reading about food and exercise to finally get my sacrifice to do something. I still don’t look like someone who goes to the gym twice a week, but I can bench 20 pounds more than I could last year and I rarely get sore or injured working out. [Rule 4: compare yourself with who you were yesterday, not with who someone else is today]

Knowing that the story of Cain and Abel is made up hasn’t prevented it from inspiring me to exercise smarter.

There’s a problem: many stories that sound inspirational are full of shit. After listening to a few hours of Peterson talking about archetypes and dragons and Jesus, I wasn’t convinced that he’s not full of it either. You should only wear a mask if it leaves you wiser when you take it off and go back to facing your mundane problems.

What convinced me about Peterson is this snippet from his conversation with James Altucher (24 minutes in):

If you’re trying to help someone who’s in a rough situation, let’s say with their relationship, you ask them to start watching themselves so that you can gather some information. Let’s take a look at your relationship for a week and all you have to do is figure out when it’s working and when it’s not working. Or, when it’s working horribly and when it’s working not too bad. Just keep track of that.
“Well, my wife ignores me at the dinner table,” or “My wife ignores me when I come home.” Then we start small. How would you like your wife to greet you when you come home?
“I’d like her to stop what she’s doing and come to the door.” Well, ask her under what conditions she would be willing to do that. And let her do it badly. Do it for a week, just agree that when either of you comes home you shut off the TV and ask “how was your day?” and listen for 10 seconds, and see how that goes.
Carl Jung said “modern people can’t see God because they won’t look low enough”. It means that people underestimate the importance of small things. They’re not small. How your wife says hi to you when you come home – that’s not small, because you come home all the time. You come home three times a day, so we can do the arithmetic.
Let’s say you spend 15 minutes a day coming home, something like that. And then it’s every day, so that’s 7 days a week, so that’s 105 minutes. Let’s call it 90 minutes a week. So that’s 6 hours a month, 72 hours a year. So you basically spend two full workweeks coming home, that’s about 3% of your life.
You spend about 3% of your life coming home. Fix it! Then, fix 30 more things.

Aside from the Jung quote, that’s the most Putanumonit piece of life advice I have ever heard on a podcast, complete with unnecessary arithmetic. If Peterson can put on a Putanumonit hat and come up with something that makes deep sense to me, perhaps I could do the same with a Peterson mask.
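Since no Putanumonit post is complete without actually putting a number on it, here’s a quick sketch of his arithmetic in code. The baselines for “your life” in the last two lines are my assumptions, not Peterson’s:

```python
# Re-running Peterson's "coming home" arithmetic. His numbers, except where noted.
minutes_per_week = 15 * 7                    # 15 min/day -> 105 min/week; he rounds to 90
hours_per_year = 90 / 60 * 52                # 78 h/yr (his "6 hours a month" gives 72)
workweeks = hours_per_year / 40              # ~2 forty-hour workweeks, as he says
share_of_work_year = hours_per_year / 2080   # assumed baseline: 2,080-hour work year -> ~4%
share_of_waking_life = hours_per_year / (16 * 365)  # assumed: 16 waking h/day -> ~1.3%
print(f"{hours_per_year:.0f} h/yr, {workweeks:.1f} workweeks, "
      f"{share_of_work_year:.1%} of a work year, {share_of_waking_life:.1%} of waking life")
```

His “3%” roughly holds if you benchmark against a work year; against waking life it’s closer to 1%. Either way, you come home a lot. Fix it.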

The rationalist project is about finding true answers to difficult questions. We have a formula that does that, and we’ve tracked many ways in which our minds can veer off the right path. But there are profound ways in which a person can be unready to seek the truth, ways that are hard to measure in a behavioral econ lab and assign a catchy moniker to.

I have written a lot about romance and dating in the last two years, including some mildly controversial takes. I could not have written any of them before I met my wife. Not because I didn’t know the facts or the game theory, but because I wasn’t emotionally ready. When I read the private drafts I wrote about women years ago, they are colored by frustration, guilt, exuberance or fear, all depending on the outcome of the last date I’d been on. Those emotions aren’t exactly conducive to clarity of thought.

I think this was also the reason why Scott Aaronson wrote The Comment that led to Untitled only when he was married and had a child. Then, he could weather the resulting storm without backing down from his truth. It is hard to see something true about relationships when your own aren’t in order, let alone write something true. [Rule 6: set your house in perfect order before you criticize the world]

The flipside is: when you wear the Peterson mask, you are compelled to spread the word when you’ve found a path that leads somewhere true. There is no higher calling in Peterson’s worldview. The Kolmogorov Option becomes the Kolmogorov Mandate (and the Scott Alexander mask mostly agrees).

Let’s go back to the beginning: Peterson made noise by refusing to comply with a law that doesn’t actually do what he claims. How is that contributing to the truth?

For starters, I would have bet that Peterson was going to lose his job when the letters calling for his dismissal started rolling in, letters signed by hundreds of Peterson’s own colleagues at the University of Toronto. I would have bet wrong: the only thing that happened is that Peterson now makes several times his academic salary from his Patreon account (if you want me to start saying crazy things you can try clicking here).

This is critical: it created common public knowledge that if free speech is ever actually threatened by the government to the extent that Peterson claims, the support for free speech will be overwhelming even at universities. Speaking an unpopular truth is a coordination problem: you have to know that others will stand with you if you stand up first. [Rule 1: stand up straight with your shoulders back]
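To make the coordination problem concrete, here’s a toy threshold model. The payoffs are invented for illustration, not measured:

```python
# Toy model of speaking an unpopular truth: staying silent always pays 0,
# while speaking up pays off only if enough others speak up too.
def payoff(n_speakers: int, threshold: int = 3,
           cost_alone: float = -10.0, gain_together: float = 5.0) -> float:
    """Payoff to one dissenter when n_speakers (including them) speak up."""
    return gain_together if n_speakers >= threshold else cost_alone

print(payoff(1))  # -10.0: the lone dissenter gets punished
print(payoff(5))  #   5.0: with enough company, speaking beats silence
```

Everyone prefers the equilibrium where people speak, but no one moves first without expecting company. Common knowledge that support exists is exactly what flips the equilibrium, and that’s what Peterson’s episode provided.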

Now, more people know that there’s appetite in the West for people who stand up for truth. This isn’t a partisan thing; I hope that Peterson inspires people with inconvenient leftist opinions to speak up in red tribe-dominated spaces (e.g. the NFL protests).

Peterson was technically wrong, as he is on many things. But he sees the pursuit of truth as a heroic quest and he’s willing to toss some rocks around, and I think this helps the cause of truth even if one gets some technical details wrong.

Being wrong about the details is not good, but I think that rationalists are pretty decent at getting technicalities right. By using the Peterson Mask judiciously, we can achieve even more than that.

[Rule 12: pet a cat when you encounter one on the street], but don’t touch the hedgehog, they don’t like it.

154 comments (some truncated below)

I have a generally positive opinion of Peterson, but I wouldn't be adding anything to the conversation by talking about why; you already covered his good points. Instead I'll talk about why I'm leery of dubbing him a Rationalist hero.

Peterson's entire first podcast with Sam Harris was an argument over Peterson's (ab)use of the word "truth". I'm not sure if anyone walked away from that experience entirely sure what Peterson means when he says "truth".

One can assume that he means something like "metaphorical truth", that some stories contain a kind of truth that is more like "usefulness as a map" than "accurate reflection of objective reality". Sam Harris' rejoinder was along the lines that using stories as ways of discovering meaning is all well and good, but believing those stories are true leads to horrifying failure modes.

For example, if you believe some particular bit of metaphysical narrative is true, you feel compelled to act on contingent details of the story that are unrelated to the intended moral. Insert your own favorite minor bit of religious dogma that led to hundreds of years of death and... (read more)

Evan_Gaensbauer (6 points, 6y):
Not to bias anyone, but as anecdata, a couple of my Christian friends have told me they find it difficult to understand Peterson's framing of his (relationship with) Christianity either. So the fact that people on the other end also sense that Peterson is attempting some sketchy epistemology could be telling. Maybe he's committing some kind of [golden mean fallacy](https://en.wikipedia.org/wiki/Argument_to_moderation) and advocating for widespread [cultural Christianity](https://en.wikipedia.org/wiki/Cultural_Christian).
alkjash (4 points, 6y):
His concept of truth seems to be the main beef rationalists have with Peterson, and something I've also struggled with for a while. I think this is partly solved with a healthy application of Rationalist Taboo - Peterson is a pragmatist, and AFAICT the word truth de-references as "that which it is useful to believe" for him. In practice, although he adds a bunch of "metaphorical truths" under this umbrella, I have not seen him espouse any literal falsehoods as "useful to believe," so his definition is just a strict generalization of our usual notion of truth. Of course I'm not entirely happy with his use of the word, but if you assume, as I do, that "it is useful to believe what is literally true" (i.e. usefulness = accuracy for maps), then his definition agrees on the literal level with your usual notion of truth.

The question then is what he means by metaphorical truth, and in what sense this is (as he claims) a higher truth than literal truth. The answer is something like "metaphorical truth is extracted meta-stories from real human behavior that are more essential to the human experience than any given actual story they are extracted from." E.g. the story of Cain and Abel is more true to the human experience than the story of how you woke up, brushed your teeth, and went to work this morning. This is where the taboo needs to come in: what he means by more true is "it is more useful to learn from Cain and Abel as a story about the human experience than it is to learn from your morning routine."

I claim that this is a useful way to think about truth. For any given mythological story, I know it didn't actually happen, but I also know that whether or not it actually happened is irrelevant to my life. So the regular definition of truth "did it actually happen" is not the right generalization to this setting. Why should I believe useless things? What I really want to know is (a) what does this story imply about reality and how I should act in the world and (b) if I do act tha

I think it's pretty risky to play Rationalist taboo with what other people are saying. It's supposed to be a technique for clarifying an argument by removing a word from the discussion, preventing it from being solely an argument about definitions. I would like it if Peterson would taboo the word "truth", yeah.

I also don't think that dereferencing the pointer actually helps. I object to how he uses "truth", and I also object to the idea that Harry Potter is (dereferenced pointer)->[more psychologically useful to believe and to use as a map than discoveries about reality arrived at via empiricism]. It's uh ... it's just not. Very much not. Dangerous to believe that it is, even. Equally if not more dangerous to believe that Christianity is [more psychologically useful to believe and to use as a map than discoveries about reality arrived at via empiricism]. I might sign on to something like, certain stories from Christianity are [a productive narrative lens to try on in an effort to understand general principles of psychology, maybe, sometimes].

The claim is that if believing a story predictably makes your life better, then you should ove
... (read more)
habryka (6 points, 6y):
I do wonder whether you would change your mind after checking the links by Gaius Leviathan IX in a comment below. A lot of those did strike me as “literal falsehoods”, and seem to go against the things you outlined here.
alkjash (6 points, 6y):
I have previously noticed (having watched a good hundred hours of Peterson's lectures) all of these things and these seem to me to be either straight-up misinterpretation on the part of the listener (taboo your words!) or the tiny number of inevitable false positives that comes out of Peterson operating his own nonstandard cognitive strategy, which is basically UNSONG Kabbalah. This overall argument reminds me of the kind of student who protests that "i isn't actually a number" or "a step function doesn't actually have a derivative."
cousin_it (4 points, 6y):
Yeah, the benefits of literal truth are more altruistic and long-term.
TurnTrout (3 points, 6y):
This is what worries me. I frankly haven't looked into Peterson too closely, but what I've heard hasn't impressed me. I found some of his quotes in OP's piece to be quite insightful, but I don't understand why he's spoken of so glowingly when he apparently espouses theist beliefs regularly. Warning signs of halo effect?
Viliam (6 points, 6y):
Speaking for myself, I do not agree with some of Peterson's opinions, but I like him as a person. Probably the best way to explain it is that he is the kind of "politically controversial" person whom I wouldn't be scared to find out is actually living next door to me.

I find the way he uses the word "truth" really annoying. Yet, if I told him, I don't expect him to... yell some abuse at me, hit me with a bike lock, or try to get me fired from my job... just to give a few examples of recent instruments of political discourse. He would probably smile in a good mood, and then we could change the topic.

Peterson is definitely not a rationalist, but there is something very... psychologically healthy... about him. It's like when you are in a room full of farts, already getting more or less used to it, and then suddenly someone opens the window and lets the fresh air in. He can have strong opinions without being nasty as a person. What a welcome change, just when it seemed to me that the political discourse online is dominated by, uhm, a combination of assholes and insane people (and I am not channeling Korzybski now; I am using the word in its old-fashioned sense).

I'd like to somehow combine the rationality of LessWrong with the personality of Peterson. To become a rational lobster, kind of. Smart, strong, and a nice neighbor.

EDIT: I guess I wanted to say that I am not concerned with Peterson's lack of x-rationality -- but neither do I deny it -- because I do not intend to use him as an example of x-rationality. In many aspects he talks nonsense (although probably not more than an average person). But he has other strengths I want to copy. I see Peterson as a valid and valuable member of the "niceness and civilization" tribe, if there is such a thing. As opposed to e.g. people who happen to share my disrespect of religion and mysticism, but personality-wise are just despicable little Nazis, and I definitely wouldn't want them as neighbors.
Evan_Gaensbauer (6 points, 6y):
I think:

1. Compartmentalization by theists makes it so they're apparently as rational or competent in thought on a lot of topics as anyone else, despite disagreements regarding religion and theism;
2. Bias in all its forms is so ubiquitous outside of a domain of beliefs related to skepticism or religion that non-theists often don't make for more rational conversation partners than theists;
3. (This might be more unique to me, but) theists often have a better map of abstract parts of the territory than non-theists.

An example of (3) is seeking conflict resolution through peaceful and truth-seeking deliberation rather than through tribalism and force. I've observed that the Christians I know are likelier to stay politically moderate as politics has become more polarized over the last couple of years. Something about loving your neighbour and the universality of human souls being redeemable or whatever results in Christians opting for mistake theory over conflict theory more than the non-religious folk I know. In a roundabout way, some theists have reached the same conclusions regarding how to have a rational dialogue as LessWrongers.

All this combined has made it so that I and a few friends in the rationality community have become less worried about theism among someone's beliefs than in the past. This is only true of a small number of religious people I tend to hang out with, which is a small sample, and my social exposure has pretty much always been set up to be to moderates, as opposed to a predominantly left-wing or right-wing crowd. If other rationalists share this attitude, this could be the reason for increased tolerance for prominent theism in rationalist discourse besides the halo effect.

Admittedly, even if being bullish about theists contributing to social epistemology isn't due to the halo effect, ultimately it's something that looks like a matter of social convenience, rather than a strategy optimized for truth-seeking. (Caveat: please nobody abruptly start optimizing
TurnTrout (5 points, 6y):
I’d like to make clear that the claim I am making is more with respect to the assertions that Peterson is someone who has exemplary rationality, when that is clearly not the case. Rejecting religion is a sign that one is able to pass other epistemic hurdles. I used to be religious; I seriously thought about it because of the Sequences, and then I deconverted - that was that. I looked at it as the preschool entrance exam for tougher problems, so I took it quite seriously. Also, I would never claim that theists are worse people in a moral sense. What is important to me, however, is that epistemic rigour in our community not be replaced by comforting rationalizations. I don’t know if that’s what’s happening here, but I have my suspicions.
Evan_Gaensbauer (3 points, 6y):
Upvoted. Thanks for the clarifications. It seems you're not talking about the mere presence of theists in the rationality community at all; rather, given his theism, which is at best poorly articulated, and everything else specious in his views, I agree it's indeed alarming that JBP might be seen as an exemplar of rationality. It's my impression that it is still a minority of community members who find JBP impressive. I've currently no more thoughts on how significant that minority is, or what its portents for the rest of rationality might be.
ChristianKl (0 points, 6y):
When it comes to epistemic rigour, you show in your post that you clearly have a strong personal motivation for believing that rejecting religion is a good sign of passing other epistemic hurdles, but at the same time you don't provide any good evidence for the claim. The priors for taking a single characteristic that's tribal in nature, like religious belief, as high information about whether or not a person is rational aren't good.
TurnTrout (1 point, 6y):
Crisis of Faith
vedrfolnir (-1 point, 6y):
I wouldn't use rejection of religion as a signal -- my guess is that most people who become atheists do so for social reasons. Church is boring, or upper-middle-class circles don't take too kindly to religiosity, or whatever. And is our community about epistemic rigor, or is it about instrumental rationality? If, as they say, rationality is about winning, the real test of rationality is whether you can, after rejecting Christianity, unreject it.
TurnTrout (4 points, 6y):
Have you read the sequences? I don’t mean this disrespectfully, but this issue is covered extremely thoroughly early on. If you want to win, your map has to be right. If you want to be able to make meaningful scientific discoveries, your map has to be right. If you hold on to beliefs that aren’t true, your map won’t be right in many areas.
vedrfolnir (5 points, 6y):
Have you been following the arguments about the Sequences? This issue has been covered fairly thoroughly over the last few years.

The problem, of course, is that the Sequences have been compiled in one place and heavily advertised as The Core of Rationality, whereas the arguments people have been having about the contents of the Sequences, the developments on top of their contents, the additions to the conceptual splinter canons that spun off of LW in the diaspora period, and so on aren't terribly legible. So the null hypothesis is the contents of the Sequences, and until the contents of the years of argumentation that have gone on since the Sequences were posted are written up into new sequences, it's necessary to continually try to come up with ad-hoc restatements of them -- which is not a terribly heartening prospect. Of course, the interpretations of the sacred texts will change over the years, even as the texts themselves remain the same.

So: why does it matter if the map isn't right in many areas? Is there a general factor of correctness, such that a map that's wrong in one area can't be trusted anywhere? Will benefits gained from errors in the map be more than balanced out by losses caused by the same errors? Or is it impossible to benefit from errors in the map at all?
TurnTrout (1 point, 6y):
No, I’m fairly new. Thanks for the background. What would the benefits be of "unrejecting" Christianity, and what would that entail? I’d like to understand your last point a little better.
vedrfolnir (6 points, 6y):
A correct epistemological process is likely to assign very low likelihood to the proposition of Christianity being true at some point. Even if Christianity is true, most Christians don't have good epistemics behind their Christianity; so if there exists an epistemically justifiable argument for 'being a Christian', our hypothetical cradle-Christian rationalist is likely to reach the necessary epistemic skill level to see through the Christian apologetics he's inherited before he discovers it. At which point he starts sleeping in on Sundays; loses the social capital he's accumulated through church; has a much harder time fitting in with Christian social groups; and cascades updates in ways that are, given the social realities of the United States and similar countries, likely to draw him toward other movements and behavior patterns, some of which are even more harmful than most denominations of Christianity, and away from the anthropological accumulations that correlate with Christianity, some of which may be harmful but some of which may be protecting against harms that aren't obvious even to those with good epistemics. Oops! Is our rationalist winning?

To illustrate the general class of problem, let's say you're a space businessman, and your company is making a hundred space shekels every metric tick, and spending eighty space shekels every metric tick. You decide you want to make your company more profitable, and figure out that a good lower-order goal would be to increase its cash incoming. You implement a new plan, and within a few megaticks, your company is making four hundred space shekels every metric tick, and spending a thousand. Oops! You've increased your business's cash incoming, but you've optimized for too low-order a goal, and now your business isn't profitable anymore.

Now, as you've correctly pointed out, epistemic rationality is important because it's important for instrumental rationality. But the thing we're interested in is instrumental ratio
But the thing we're interested in is instrumental rationality, not epistemic rationality.

Ironically, this sentence is epistemically true but instrumentally very dangerous.

See, to accurately assess which parts of epistemic rationality one should sacrifice for instrumental improvements requires a whole lot of epistemic rationality. And once you've made that sacrifice and lost some epistemic rationality, your capacity to make such trade-offs wisely in the future is severely impaired. But if you just focus on epistemic rationality, you can get quite a lot of winning as a side effect.

To bring it back to our example: it's very dangerous to convince yourself that Jesus died for your sins just because you notice Christians have more friends. To do so you need to understand why believing in Jesus correlates with having friends. If you have a strong enough understanding of friendship and social structures for that, you can easily make friends and build a community without Jesus.

But if you install Jesus on your system you're now left vulnerable to a lot of instrumentally bad things, with no guarantee that you'll actually get the friends and community you wanted.

vedrfolnir (4 points, 6y):
Assuming that the instrumental utility of religion can be separated from the religious parts is an old misconception. If all you need is a bit of sociological knowledge, shouldn't it be possible to just engineer a cult of reason? Well, as it turns out, people have been trying for centuries, and it's never really stuck.

For one thing, there are, in startup terms, network effects. I'm not saying you should think of St. Paul as the Zuckerberg of Rome, but I've been to one of those churches where they dropped all the wacky supernatural stuff and I'd rather go to a meetup for GNU Social power users.

For another thing, it's interesting that Eliezer Yudkowsky, who seems to be primarily interested in intellectual matters that relate to entities that are, while constrained by the rules of the universe, effectively all-knowing and all-powerful, and who cultivated interest in the mundane stuff out of the desire to get more people interested in said intellectual matters, seems to have gotten unusually far with the cult-of-reason project, at least so far.

Of course, if we think of LW as the seed of what could become a new religion (or at least a new philosophical scene, as world-spanning empires sometimes generate when they're coming off a golden age -- and didn't Socrates have a thing or two to say about raising the sanity waterline?), this discussion would have to look a lot different, and ideally would be carried out in a smoke-filled room somewhere. You don't want everyone in your society believing whatever nonsense will help them out with their social climbing, for reasons which I hope are obvious. (On the other hand, if we think of LW as the seed of what could become a new religion, its unusual antipathy to other religions -- I haven't seen anyone deploy the murder-Gandhi argument to explain why people shouldn't do drugs or make tulpas -- is an indisputable adaptive necessity. So there's that.)

If, on the other hand, we think of LW as some people who are interested in i
ozymandias (4 points, 6y):
The murder-Gandhi argument against drugs is so common it has a name, "addiction." Rationalists appear to me to have a perfectly rational level of concern about addiction (which means being less concerned about certain drugs, such as MDMA, and more concerned about other drugs, such as alcohol). I am puzzled about how making tulpas could interfere with one's ability to decide not to make any more tulpas.
vedrfolnir (0 points, 6y):
The only explanation I caught wind of for the parking lot incident was that it had something to do with tulpamancy gone wrong. And I recall SSC attributing irreversible mental effects to hallucinogens and noting that a lot of the early proponents of hallucinogens ended up somewhat wacky. But maybe it really does all work out such that the sorts of things that are popular in upper-middle-class urban twenty-something circles just aren't anything to worry about, and the sorts of things that are unpopular in them (or worse, popular elsewhere) just are. What a coincidence!
Jacob Falkovich (4 points, 6y):
Is your goal to have a small community of friends or to take over the world? The tightest-knit religions are the smaller and weirder ones, so if you want stronger social bonds you should join Scientology and not the Catholic church.

Or, you know, you can just go to a LessWrong meetup. I went to one yesterday: we had cake, and wine, and we did a double crux discussion about rationality and self-improvement. I dare say that we're getting at least half as much community benefit as the average church-goer, all for a modest investment of effort and without sacrificing our sanity.

If someone doesn't have a social life because they don't leave their house, they should leave their house. The religious shut-ins who read the Bible for fun aren't getting much social benefit either. Rationality is a bad religion, but if you understand religions well enough you probably don't need one.
Viliam (5 points, 6y):
One day I will have to write a longer text about this, but shortly: it is a false dilemma to see "small and tight-knit community" and "taking over the world" as mutually exclusive. The Catholic church is not a small community, but it contains many small communities. It is an "eukaryotic" community, containing both the tight-knit subgroups and the masses of lukewarm believers, which together contribute to its long-term survival.

I would like to see the rationalist community become "eukaryotic" in a similar way. In certain ways it already happens: we have people who work at MIRI and CFAR, we have people who participate at local meetups, we have people who debate online. This diversity is strength, not weakness: if you only have one mode of participation, then people who are unable to participate in that one specific way are lost to the community.

The tricky part is keeping it all together. Preventing the tight-knit groups from excommunicating everyone else as "not real members", but also preventing the lukewarm members from making it all about social interaction and abandoning the original purpose, because both of those are natural human tendencies.
vedrfolnir (1 point, 6y):
One thing I'd like to see is more research into the effects of... if not secret societies, then at least societies of some sort. For example, is it just a coincidence that Thiel and Musk, arguably the two most interesting public figures in the tech scene, are both Paypal Mafia? Another good example is the Junto.
Viliam (2 points, 6y):
I imagine this could be tricky to research even if people wouldn't try to obfuscate the reality (which they of course will). It would be difficult to distinguish "these two people conspired together" from "they are two extremely smart people, living in the same city, of course they are likely to have met each other". For example, in a small country with maybe five elite high schools, elite people of the same age have high probability to have been high-school classmates. If they later take over the world together, it would make a good story to claim that they already conspired to do that during the high school. Even if the real idea only came 20 years later, no one would believe it after some journalist finds out that actually they are former classmates. So the information is likely to be skewed in both ways: not seeing connections where they are, and seeing meaningful connections in mere coincidences.
vedrfolnir (1 point, 6y):
Small groups have a bigger problem: they won't be very well documented. As far as I know, the only major source on the Junto is Ben Franklin's autobiography, which I've already read. Large groups, of course, have an entirely different problem: if they get an appreciable amount of power, conspiracy theorists will probably find out, and put out reams of garbage on them. I haven't started trying to look into the history of the Freemasons yet because I'm not sure about the difficulty of telling garbage from useful history.
TurnTrout (5 points, 6y):
That makes more sense. Broadly, I agree with Jacobian here, but there are a few points I'd like to add.

First, it seems to me that there aren't many situations in which this is actually the case. If you treat people decently (regardless of their religion or lack thereof), you are unlikely to lose friends for being atheist (especially if you don't talk about it). Sure, don't be a jerk and inappropriately impose your views on others, and don't break it to your fundamentalist parents that you think religion is a sham. But situations where it would be instrumentally rational to believe falsely important things, the situations in which there really would be an expected net benefit even after factoring in the knock-on effects of making your epistemological slope just that bit more slippery, these situations seem constrained to "there's an ASI who will torture me forever if I don't consistently system-2 convince myself that god exists". At worst, if you really can't find other ways of socializing, keep going to church while internally keeping an accurate epistemology.

Second, I think you're underestimating how quickly beliefs can grow their roots. For example, after reading Nate's Dark Arts of Rationality, I made a carefully-weighed decision to adopt certain beliefs on a local level, even though I don't believe them globally: "I can understand literally anything if I put my mind to it for enough time", "I work twice as well while wearing shoes", "I work twice as well while not wearing shoes" (the internal dialogue for adopting this one was pretty amusing), etc. After creating the local "shoe" belief and intensely locally-believing it, I zoomed out and focused on labelling it as globally-false. I was met with harsh resistance from thoughts already springing up to rationalize why my shoes actually could make me work harder. I had only believed this ridiculous thing for a few seconds, and my subconscious was already rushing to its defense. For this reason, I decided against
habryka (3 points, 6y):
It should be noted that there are practically-secular Jewish communities that seem to get a lot of the benefit of religion, without actually believing in supernatural things. I haven't visited one of those myself, but friends who looked into it seemed to think they were doing pretty well on the epistemics front. So for people interested in religion, but not interested in the supernatural-believing stuff: maybe joining a secular Jewish community would be a good idea?
vedrfolnir (1 point, 6y):
That does seem to be a popular option for people around here who have the right matrilineage for it.
TAG (4 points, 6y):
It has to be correct and useful, and correctness only matters for winning inasmuch as it entails usefulness. Having a lot of correct information about golf is no good if you want to be a great chef.
TurnTrout (3 points, 6y):
Having correct object-level information and having a correct epistemological process and belief system are two different things. An incorrect epistemological process is likely to reject information it doesn’t like.
TAG (2 points, 6y):
And having correct and relevant object-level information is a third thing.
vedrfolnir (2 points, 6y):
Right, that's a possible response: the sacrifice of epistemic rationality for instrumental rationality can't be isolated. If your epistemic process leads to beneficial incorrect conclusions in one area, your epistemic process is broadly incorrect, and will necessarily lead to harmful incorrect conclusions elsewhere. But people seem to be pretty good at compartmentalizing. Robert Aumann is an Orthodox Jew. (Which is the shoal that some early statements of the general-factor-of-correctness position broke on, IIRC.) And there are plenty of very instrumentally rational Christians in the world. On the other hand, maybe people who've been exposed to all this epistemic talk won't be so willing to compartmentalize -- or at least to compartmentalize the sorts of things early LW used as examples of flaws in reasoning.
TAG (2 points, 6y):
Which is why you shouldn't have written "necessarily".
vedrfolnir (3 points, 6y):
I'm not sure how to square "rejecting religion is the preschool entrance exam of rationality" with "people are pretty good at compartmentalizing". Certainly there are parts of the Sequences that imply the insignificance of compartmentalization.

I personally recall, maybe seven years ago, having to break the news to someone that Aumann is an Orthodox Jew. This was a big deal at the time! We tend to forget how different the rationalist consensus is from the contents of the Sequences.

Every once in a while someone asks me or someone I know about what "postrationality" is, and they're never happy with the answer -- "isn't that just rationality?" Sure, to an extent; but to the extent that it is, it's because "postrationality" won. And to tie this into the discussion elsewhere in the thread: postrationality mostly came out of a now-defunct secret society.
TurnTrout (1 point, 6y):
Your line of reasoning re: Aumann feels akin to "X billionaire dropped out of high school / college, ergo you can drop out, too". Sure, perhaps some can get by with shoddy beliefs, but why disadvantage yourself? Point of clarification: are you claiming that rejecting religion provides no information about someone's rationality, or that it provides insignificant information? If postrationality really did win, I don't know that it should have. I haven't been convinced that knowingly holding false beliefs is instrumentally rational in any realistic world, as I outlined below.
vedrfolnir (1 point, 6y):
If people are pretty good at compartmentalization, it's at least not immediately clear that there's a disadvantage here. It's also not immediately clear that there's a general factor of correctness, or, if there is, what the correctness distribution looks like. It's at least a defensible position that there is a general factor of correctness, but that it isn't useful, because it's just an artifact of most people being pretty dumb, and there's no general factor within the set of people who aren't just pretty dumb. I do think there's a general factor of not being pretty dumb, but I'm not sure about a general factor of correctness beyond that.

It seems probable that "ignore the people who are obviously pretty dumb" is a novel and worthwhile message for some people, but not for others. I grew up in a bubble where everyone already knew to do that, so it's not for me, but maybe there are people who draw utility from being informed that they don't have to take seriously genuine believers in astrology or homeopathy or whatever.

In a purely statistical sense, rejecting religion almost certainly provides information about someone's rationality, because things tend to provide information about other things. Technically, demographics provide information about someone's rationality. But not information that's useful for updating about specific people.

Religious affiliation is a useful source of information about domain-specific rationality in areas that don't lend themselves well to compartmentalization. There was a time when it made sense to discount the opinions of Mormon archaeologists about the New World, although now that they've been through some time of their religiously-motivated claims totally failing to pan out it probably lends itself to compartmentalization alright. On the other hand, I wouldn't discount the opinions of Mormon historical linguists about Proto-Uralic. But I would discount the opinions of astrologers about Proto-Uralic, unless they have other histo
Jacob Falkovich (9 points, 6y):
I know that postrationality can't be distilled to a single sentence and I'm picking on it a bit unfairly, but "post"-rationality can't differentiate itself from rationality on that. Eliezer wrote about System 1 and System 2 back in 2006 [quote elided], and it's not like this was ever controversial on LW. You can't get any more "core LW rationality" than the fricking Sequences. If someone thinks that rationality is about forcing everything into System 2 then, well, they should reread the fricking Sequences.
Eli Tyre (2 points, 4y):
Minor: but I appreciate you using the word “fricking”, instead of the obvious alternative. For me, it feels like it gets the emphaticness across just as well, without the crudeness.
dsatan (1 point, 6y):
It's hard to get Peterson second-hand. I recommend actually watching some of his lectures.
dsatan (1 point, 6y):
While Peterson is a bit sloppy when he talks about truth, the notion of truth that he is working with is not simply his own construction to write some bottom line. There is a lot of literature on pragmatist analyses of truth and belief that roughly aligns with what he is saying, and that I would consider closer to the nature of truth (truer about truth) than the correspondence theory of truth presented in the sequences. I recommend Peirce's Making Our Ideas Clear, Putnam's Corresponding with Reality, and James's The Will to Believe. Peirce and James can easily be found free online by searching, and I can PM you Putnam if you want it.
Putting on the Jordan Peterson mask adds two crucial elements that rationalists often struggle with: motivation and meaning.

Holy shit, yes, thank you, this is exactly what has been motivating all of my contributions to LW 2.0. What is even the point of strengthening your epistemics if you aren't going to then use those strong epistemics to actually do something?

I first read the Sequences 6 years ago, and since then what little world-saving-relevant effort I've put in has been entirely other people asking me to join in on their projects. The time I spent doing that (at SPARC, at CFAR workshops, at MIRI workshops) was great; an embarrassing amount of the time I spent not doing that (really, truly embarrassing; amounts of time only a math grad student could afford to spend) was wasted on various forms of escapism (random internet browsing, TV, video games, anime, manga), because I was sad and lonely and put a lot of effort into avoiding having to deal with that (including avoiding debugging it at CFAR workshops because it was too painful to think about). At almost no point did I have the motivation to start a project on my own, and I didn't.

I've been working inte... (read more)

Rationalists who are epistemically strong are very lucky: you can use that strength in a place where it will actually help you, like investigating mysticism, by defending you from making the common epistemic mistakes there.

This is an interesting idea, but how does someone tell whether they're strong enough to avoid making the common epistemic mistakes when investigating mysticism? For example, if I practice meditation I might eventually start experiencing what Buddhists call vipassana ("insight into the true nature of reality"). I don't know if I'd be able to avoid treating those experiences as some sort of direct metaphysical knowledge as most people apparently do, as opposed to just qualia generated by my brain while it's operating differently from normal (e.g., while in a state of transient hypofrontality).

There are probably a number of distinct epistemic risks surrounding mysticism. Bad social dynamics in the face of asymmetric information might be another one. (Access to mystical experiences is hard to verify by third parties but tempting to claim as a marker of social status.) I don't know how someone or some community could be confident that they wouldn't fall prey to one of these risks.

Good question. You can test your ability to avoid mistaking strong emotions for strong beliefs in general. For example, when you get very angry at someone, do you reflexively believe that they're a terrible person? When you get very sad, do you reflexively believe that everything is terrible? When you fall in love with someone, do you reflexively believe that they have only good qualities and no bad qualities? Etc.

I know I keep saying this, but it keeps being true: for me a lot of my ability to do this, and/or my trust in my ability to do this, came from circling, and specifically repeatedly practicing the skill of distinguishing my strong emotional reactions to what was happening in a circle from my best hypotheses about what was happening.

I don't know how someone or some community could be confident that they wouldn't fall prey to one of these risks.

I can't tell people what level of confidence they should want before trying this sort of thing, but I decided based on my instincts that the risk for me personally was low enough relative to the possible benefits that I was going to go for it, and things have been fine as far as my outside view is concerned so fa...

You can test your ability to avoid mistaking strong emotions for strong beliefs in general.

How much of this ability is needed in order to avoid taking strong mystical experiences at face value?

I can’t tell people what level of confidence they should want before trying this sort of thing, but I decided based on my instincts that the risk for me personally was low enough relative to the possible benefits that I was going to go for it,

In the comment I was replying to, you were saying that some rationalists are being too risk-averse. It seems like you're now backing off a bit and just talking about yourself?

and things have been fine so far, e.g. my belief in physics as standardly understood has not decreased at all.

I'm worried that the epistemic risks get stronger the further you go down this path. Have you had any mystical experiences similar to vipassana yet? If not, your continuing belief in physics as standardly understood does not seem to address my worry.

Also, to some extent I feel like this argument proves too much. There are epistemic risks associated with e.g. watching well-made TV shows or movies, or reading persuasive writing, and rationalists take on these epistemic r...

I've only taken a few steps down the path that Qiaochu is following, but I have a few thoughts regarding epistemic risk-management:

  1. If you're ever going to investigate any altered-consciousness experiences at all, you're going to have to take a risk. You can never be 100% sure that something is "epistemically safe": certainty is impossible and time is limited.
  2. There is clearly an efficient frontier of risk/reward tradeoffs. I'm also a fan of circling, which doesn't ask you to accept any supernatural claims or dogmas and is incredibly useful for understanding the landscape of human minds. A few circling sessions with seriously strange people can do a lot to cure one of typical mind fallacy. On the other hand, joining Scientology the same week you start experimenting with ayahuasca is probably unwise.
  3. As a community, we can reduce risk by diversifying. Some of us will do LSD, some will do vipassana, some will circle, some will listen to 100 hours of Peterson... We should be able to notice if any particular subgroup is losing its mind. The real danger would occur if all of us suddenly started doing the same thing with no precautions.
gjm (6y):

What would it look like, if we noticed that a particular subgroup was beginning to lose its mind? I think it might look like a few unusually-rude people calling into question the alleged experiences of that particular subgroup and asking pointed questions about exactly what had happened to them and exactly why they thought themselves better off for it; and like the members of that particular subgroup responding with a combination of indignation and obfuscation: "we've definitely been changed for the better, but of course we can't expect you to understand what it's like if it hasn't happened to you, so why do you keep pushing us for details you know we won't be able to give?" / "I find it very discouraging to get this sort of response, and if it keeps happening I'm going to leave"; and like some of the more community-minded folks objecting to the rudeness of the questioners, observing acerbically that it always seems to be the same people asking those rude questions and wondering whether the emperor really has any clothes, and maybe even threatening to hand out bans.

All of which sounds kinda familiar.

I don't actually think that ...

gjm, point well taken. I wonder whether it would be easier for people inside or outside Berkeley to spot someone there going seriously off the rails and to say something about it.

Anyway, I do want to elaborate a little bit on my "Efficient Frontier" idea. If anyone can build a map of which "mystical experiences" are safe/dangerous/worthwhile/useless and for whom, it should be people like us. I think it's a worthwhile project and it has to be done communally, given how different each person's experience may be and how hard it is to generalize.

The main example here is Sam Harris, a hardcore epistemic rationalist who has also spent a lot of time exploring "altered states of consciousness". He wrote a book about meditation, endorses psychedelics with caveats, is extremely hostile to any and all religions, and probably thinks that Peterson is kinda crazy after arguing with him for four hours. Those are good data points, but we need 20 more Sam Harrises. I'm hoping that LW can be the platform for them.

Perhaps we need to establish some norms for talking about "mystical experiences", fake frameworks, altered consciousness etc. so that people feel safe both talking and listening.

TAG (6y):
There's Daniel Ingram, Vincent Horn, Kenneth Folk, and the other Buddhist Geeks.
Qiaochu_Yuan (6y):
I was triggered by this initially, but I reread it and you're making a completely reasonable point. I notice I'm still concerned about the possibility that your reasonable point / motte will be distorted into a less reasonable point / bailey.

That is not what I said. What I said is that if the pushback I've been getting becomes the default on LW 2.0, then I'm going to leave. This is a matter of people deciding what kind of place they want LW 2.0 to be. If they decide that LW 2.0 does not want to be the place for the things I want to talk about, then I'm going to respect that and talk about those things somewhere else. Staying would be unpleasant for everyone involved.

I concede the point. We can try asking what kinds of externally verifiable evidence would distinguish this world from a world in which people like Val and I have been talking about real things which we lack the skill to explain (in a way satisfying to skeptical rationalists) via text. One prediction I'm willing to make is that I'm now more capable of debugging a certain class of thorny emotional bugs, so e.g. I'm willing to predict that over the next few years I'll help people debug such bugs at CFAR workshops and wherever else, and that those people will at least in expectation be happier, more productive, more willing to work on x-risk or whatever they actually want to do instead, less likely to burn out, etc.

(But, in the interest of trying to be even-handed about possible hypotheses that explain the current state of public evidence, it's hard to distinguish the above world from a world in which people like Val and I are losing our minds and also becoming more charismatic / better at manipulation.)

I think that perhaps what bothers a lot of rationalists about your (or Valentine's) assertions is down to three factors:

  1. You don't tend to make specific claims or predictions. I think you would come off better - certainly to me and I suspect to others - if you were to preregister hypotheses more, like you did in the above comment. I believe that you could and should be more specific, perhaps stating that over a six month period you expect to work n more hours without burning out or that a consensus of reports from outsiders about your mental well-being will show a marked positive change during a particular time period that the evaluators did not know was special. While these would obviously not constitute strong evidence, a willingness to informally test your ideas would at least signal honest belief.
  2. You seem to make little to no attempt to actually communicate your ideas in words, or even define your concepts in words. Frankly, it continues to strike me as suspicious that you claim difficulty in even analogizing or approximating your ideas verbally. Even something as weak as the rubber-sheet analogy for General Relativity would - once again - signal an honest attempt.
  3. The ...
You don't tend to make specific claims or predictions. I think you would come off better - certainly to me and I suspect to others - if you were to preregister hypotheses more, like you did in the above comment. I believe that you could and should be more specific, perhaps stating that over a six month period you expect to work n more hours without burning out or that a consensus of reports from outsiders about your mental well-being will show a marked positive change during a particular time period that the evaluators did not know was special.

I have several different responses to this which I guess I'll also number.

  1. Sure, fine, I'm willing to claim this. Everyone who has interacted with me both in the last month and, say, a year ago will tell you that I am visibly happier and doing more of what I actually want ("productive" can be a loaded term). People can ask Anna, Duncan, Lauren, etc. if they really want. I can also self-report that I've engaged in much less escapism (TV, movies, video games, etc.) this month than in most months of the last 5 years, and what little I have engaged in was mostly social.
  2. I would love to be having this conversation; if ...
Evan Clark (6y):
(This is my second comment on this site, so it is probable that the formatting will come out gross. I am operating on the assumption that it is similar to Reddit, given Markdown.)

  1. To be as succinct as possible: fair enough.
  2. I want to have this conversation too! I was trying to express what I believe to be the origins of people's frustrations with you, not to discourage you, although I can understand how I failed to communicate that.
  3. I am going to wrap this up with the part of your reply that concerns experiential distance and respond to both. I suspect that a lot of fear of epistemic contamination comes from the emphasis on personal experience. Personal (meatspace) experiences, especially in groups, can trigger floods of emotions and feelings of insight without those first being fed through rational processing. Therefore it seems reasonable to be suspicious of anyone who claims to teach through personal experience. That being said, the experimental spirit suggests the following course of action: get a small group and try to close their experiential gap gradually, while having them extensively document anything they encounter on the way, then publish that for peer analysis and digestion. Of course that relies on more energy and time than you might have.

On a general level, I totally concede that I am operating from relatively weak ground. It has been a while - or at least felt like a while - since I read any of the posts I mentioned (tacitly or otherwise) with the exception of Kensho, so that is definitely coloring my vision. I acknowledge that many people are responding to your ideas with unwarranted hostility and forcing you onto the defensive in a way that I know must be draining. So I apologize for essentially doing that in my original reply to you. I think that I, personally, am unacceptably biased against a lot of ideas due to their "flavor", so to speak, rather than their actual strength.

As to consistency, I actually do want to hold yo...
I suspect that a lot of fear of epistemic contamination comes from the emphasis on personal experience. Personal (meatspace) experiences, especially in groups, can trigger floods of emotions and feelings of insights without those first being fed through rational processing.

I recognize the concern here, but you can just have the System 1 experience and then do the System 2 processing afterwards (which could be seconds afterwards). It's really not that hard. I believe that most rationalists can handle it, and I certainly believe that I can handle it. I'm also willing to respect the boundaries of people who don't think they can handle it. What I don't want is for those people to typical mind themselves into assuming that because they can't handle it, no one else can either, and so the only people willing to try must be being epistemically reckless.

Therefore it seems reasonable to be suspicious of anyone who claims to teach through personal experience.

There are plenty of completely mundane skills that can basically only be taught in this way. Imagine trying to teach someone how to play basketball using only text, etc. There's no substitute for personal ex...

Evan Clark (6y):
It is probably true that most rationalists could handle it. It is also probably true, however, that people who can't handle it could end up profoundly worse for the experience. I am not sure we should endorse potential epistemic hazards with so little certainty about both costs and benefits. I also grant that anything is a potential epistemic hazard, and that reasoning under uncertainty is kind of why we bother with this site in the first place. This is all to say that I would like to see more evidence of this calculation being done at all, and that if I were not so geographically separated from the LWsphere, I would like to try these experiences myself.

I am not sure that it should be the prior for mental skills, however. As you pointed out, scholastic skills are almost exclusively (and almost definitionally) attainable through text. I know that I can and have learned math, history, languages, etc., through reading, and it seems like that is the correct category for Looking, etc., as well (unless I am mistaken about the basic nature of Looking, which is certainly possible).

This is a sad circumstance, I wish it were otherwise, and I understand why you have made the choice you have, considering the (rather ironically) immediate and visceral response you are used to receiving.
Qiaochu_Yuan (6y):
I'm not sure what "endorse" means here. My position is certainly not "everyone should definitely do [circling, meditation, etc.]"; mostly what I have been arguing for is "we should not punish people who try or say good things about [circling, meditation, etc.] for being epistemically reckless, or allege that they're evil and manipulative solely on that basis, because I think there are important potential benefits worth the potential risks for some people."

I still think you're over-updating on school. For example, why do graduate students have advisors? At least in fields like pure mathematics that don't involve lab work, it's plausibly because being a researcher in these fields requires important mental skills that can't just be learned through reading, but need to be absorbed through periodic contact with the advisor. Great advisors often have great students; clearly something important is being transmitted, even if it's hard to write down what. My understanding of CFAR's position is also that whatever mental skills it tries to teach are much harder to teach via text or even video than via an in-person workshop, and that this is why we focus so heavily on workshops instead of methods of teaching that scale better.

I know, right? Also ironically, learning how to not be subject to my triggers (at least, not as much as I was before) is another skill I got from circling.
gjm (6y):

I'm glad you got over the initial triggered-ness. I did wonder about being even more explicit that I don't in fact think you guys are losing your minds, but worried about the "lady doth protest too much" effect.

I wasn't (in case it isn't obvious) by any means referring specifically to you, and in particular the "if it keeps happening I'm going to leave" wasn't intended to be anything like a quotation from you or any specific other person. It was intended to reflect the fact that a number of people (I think at least three) of what I called the Berkeley School have made comments along those general lines -- though I think all have taken the line you do here, that the problem is a norm of uncharitable pushback rather than being personally offended. I confess that the uncharitably-pushing-back part of my brain automatically translates that to "I am personally offended but don't want to admit it", in the same way as it's proverbially always correct to translate "it's not about the money" to "it's about the money" :-).

(For the avoidance of doubt, I don't in fact think that auto-transl...

Qiaochu_Yuan (6y):
Not sure how to quantify this. I also haven't had a mystical experience myself, although I have experienced mildly altered states of consciousness without the use of drugs. (Which is not at all unique to dabbling in mysticism; you can also get them from concerts, sporting events, etc.) I imagine it's comparable to the amount of ability needed to avoid taking a strong drug experience at face value while having it (especially since psychoactive drugs can induce mystical experiences).

I want to make a distinction between telling people what trade-offs I think they should be making (which I mostly can't do accurately, because they have way more information than I do about that) and telling people I think the trade-offs they're making are too extreme (based on my limited information about them plus priors). E.g. I can't tell you how much your time is worth in terms of money, but if I see you taking on jobs that pay a dollar an hour I do feel justified in claiming that probably you can get a better deal than that.

Yes, this is probably true. I don't think you need to go very far in the mystical direction per se to get the benefits I want rationalists to get. Again, it's more that I think there are some important skills that it's worth it for rationalists to learn, and as far as I can tell the current experts in those skills are people who sometimes use vaguely mystical language (as distinct from full-blown mystics; these people are e.g. life coaches or therapists, professionally). So I want there to not be a meme in the rationality community along the lines of "people who use mystical language are crazy and we have nothing to learn from them," because I think people would be seriously missing out if they thought that.

That's not clear to me, because of blindspots. Consider the Sequences, for example: I think we can agree that they're in some sense psychoactive, in that people really do change after reading them. What kind of epistemic risks did we take on by doing that? It's unc...
Said Achmiz (6y):
Why do you (or I, or anyone else) need mysticism (either of the sort you've talked about, or whatever Jordan Peterson talks about) in order to have motivation and meaning? In my experience, it is completely unnecessary to deviate even one micrometer from the path of epistemic rectitude in order to have meaning and motivation aplenty. (I, if anything, find myself with far too little time to engage in all the important, exciting projects that I've taken on—and there is a long queue of things I'd love to be doing, that I just can't spare time for.)

(Perhaps partly to blame here is the view—sadly all too common in rationalist circles—that nothing is meaningful or worth doing unless it somehow "saves the world". But that is its own problem, and said view quite deserves to be excised. We ought not compound that wrong by indulging in woo—two wrongs don't make a right.)

You do a disservice to that last point by treating it as a mere parenthetical; it is, in fact, crucial. If the tools in question are epistemically beneficial—if they are truth-tracking—then we ought to master them and use them. If they are not, then we shouldn't. Whether the tools in question can be used "safely" (that is, whether one can use them without worsening one's epistemics, i.e. without making one's worldview more crazy and less correct), and, conditional on that, whether said tools meaningfully improve our grasp on reality and our ability to discover truth, is, in fact, the whole question.

(To me, the answer very much seems to be a resounding "no". What's more, every time I see anyone—"rationalist" or otherwise—treat the question as somehow peripheral or unimportant, that "no" becomes ever more clear.)
Qiaochu_Yuan (6y):
I have said this to you twice now and I am going to keep saying it: are we talking about whether mysticism would be useful for Said, or useful for people in general? It seems to me that you keep making claims about what is useful for people in general, but your evidence continues to be about whether it would be useful for you.

I consider myself to be making a weak claim, not "X is great and everyone should do it" but "X is a possible tool and I want people to feel free to explore it if they want." I consider you to be making a strong claim, namely "X is bad for people in general," based on weak evidence that is mostly about your experiences, not the experiences of people other than you. In other words, from my perspective, you've consistently been typical-minding every time we talk about this sort of thing.

I'm glad that you've been able to find plenty of meaning and motivation in your life as it stands, but other people, like me, aren't so lucky, and I'm frustrated at you for refusing to acknowledge this. The parenthetical was not meant to imply that the point was unimportant, just that it wasn't the main thrust of what I was trying to say.
Said Achmiz (6y):
Why do you say it's luck? I didn't just happen to find these things. It took hard work and a good long time. (And how else could it be? Except by luck, of course.)

I'm not refusing to acknowledge anything. I do not for a moment deny that you're advocating a solution to a real problem. I am saying that your solution is a bad one, for most (or possibly even "all") people—especially "rationalist"-type folks like you and I are. And I am saying that your implication—that this is the best solution, or maybe even the only solution—is erroneous. (And how else to take the comment that I have been lucky not to have to resort to the sort of thing you advocate, and other comments in a similar vein?)

So, to answer your question: I, at least, am saying this: of course these things would not be useful for me; they would be detrimental to me, and to everyone, and especially to the sorts of people who post on, and read, Less Wrong.

Is this a strong claim? Am I very certain of it? It's not my most strongly held belief, that's for sure. I can imagine many things that could change my mind on this (indeed, given my background[1], I start from a place of being much more sympathetic to this sort of thing than many "skeptic" types). But what seems to me quite obvious is that in this case, firm skepticism makes a sensible, solid default. Starting from that default, I have seen a great deal of evidence in favor of sticking with it, and very little evidence (and that, of rather low quality) in favor of abandoning it and moving to something like your view.

So this is (among other reasons) why I push for specifics when people talk about these sorts of things, and why I don't simply dismiss it as woo and move on with my life (as I would if, say, someone from the Flat Earth Society were to post on Less Wrong about the elephants which support the world on their backs). It's an important thing to be right about. The wrong view seems plausible to many people. It's not so obviously wrong that w...

I am going to make one more response (namely this one) and then stop, because the experience of talking to you is painful and unpleasant and I'd rather do something else.

And I am saying that your implication—that this is the best solution, or maybe even the only solution—is erroneous.

I don't think I've said anything like that here. I've said something like that elsewhere, but I certainly don't mean anything like "mysticism is the only solution to the problem of feeling unmotivated" since that's easy to disprove with plenty of counterexamples. My position is more like:

"There's a cluster of things which look vaguely like mysticism which I think is important for getting in touch with large and neglected parts of human value, as well as for the epistemic problem of how to deal with metacognitive blind spots. People who say vaguely mystical things are currently the experts on doing this although this need not be the case in principle, and I suspect whatever's of value that the mystics know could in principle be separated from the mysticism and distilled out in a form most rationalists would be happy with, but as far as I know that work mostly hasn't been done yet. Feeling more motivated is a side effect of getting in touch with these large parts of human value, although that can be done in many other ways."

TAG (6y):
It seems tautologous to me that if thing A is objectively more important than thing B, then, all other things being equal, you should be doing thing A. Mysticism isn't a good fit for the standard rationalist framing of "everything is ultimately about efficiently achieving arbitrary goals", but a lot of other things aren't either, and the framing itself needs justification.
Said Achmiz (6y):
This certainly sounds true, except that a) there's no such thing as "objectively more important", and b) even if there were, who says that "saving the world" is "objectively more important" than everything else?

Well, I certainly agree with you there—I am not a big fan of that framing myself—but I don't really understand whether you mean to be disagreeing with me here, or what. Please clarify.
dxu (6y):
Saving the world certainly does seem to be an instrumentally convergent strategy for many human terminal values. Whatever you value, it's hard to get more of it if the world doesn't exist. This point should be fairly obvious, and I find myself puzzled as to why you seem to be ignoring it entirely.
Said Achmiz (6y):
Please note that you've removed the scare quotes from "saving the world", and thus changed the meaning. This suggests several possible responses to your comment, all of which I endorse:

  1. It seems likely, indeed, that saving the world would be the most important thing. What's not clear is whether '"saving the world"' (as it's used in these sorts of contexts) is the same thing as 'saving the world'. It seems to me that it's not.
  2. It's not clear to me that the framework of "the world faces concrete threats X, Y, and Z; if we don't 'save the world' from these threats, the world will be destroyed" is even sensible in every case where it's applied. It seems to me that it's often misapplied.
  3. If the world needs saving, is it necessary that all of everyone's activity boil down to saving it? Is that actually the best way to save the world? It seems to me that it is not.

If you really think Jordan Peterson is worth inducting into the rationalist hall of fame, you might as well give up the entire rationalist project altogether. The problem is not merely that Peterson is religious and a social conservative, but that he is a full-blown mystic and a crackpot, and his pursuit of "metaphorical truth" necessarily entails bad methodology and a lack of rigor that leads to outright falsehoods.

Take, for example, his stated belief that the ancient Egyptians and Chinese depicted the double helix structure of DNA in their art. (In another lecture he makes the same claim in the context of Hindu art.)

Or his statement suggesting that belief in God is necessary for mathematical proof.

Or his "consciousness creates reality" quantum mysticism.

Or his use of Jung, including Jung's crackpot paranormal concept of "synchronicity".

As a trained PhD psychologist, Peterson almost certainly knows he's teaching things that are unsupported, but keeps doing it anyway. Indeed, when someone confronted him about his DNA pseudo-archaeology, he started backpedaling about how strongly he believed it, though he also went on to speculate about wheth...

alkjash (6y):
To respond to this without diving into the culture-war demon thread:

(1) The DNA claim I agree is absurd, though not nearly as absurd as you make it out to be. Certainly Democritus proposed the existence of atoms long before we had anything like microscopes. It's not inconceivable that ancient people could have deduced mathematical efficiencies of the double helix structure empirically and woven them into mythological stories, and some of these mathematical efficiencies are relevant reasons for DNA being actually the way it is. I think the DNA claim is basically a rare false positive for an otherwise useful general cognitive strategy; see (4) below.

As for the backpedaling and the ESP: what you call backpedaling looks to me like "giving a more accurate statement of his credence on request," which is fine. The ESP thing is actually a statement about the brilliant and unexpected insights from psychedelics. I'm personally somewhat skeptical about this, but many, many rationalists have told me that LSD causes them to be life-changingly insightful and is exactly what I need in life.

(2) Belief in God is something that needs to be disentangled about Peterson; he always hesitates to state that he "believes in God" for exactly the reason of being misinterpreted this way. The closest thing to what he means by "faith in God" that I can express is "having a terminal value," and that statement translates to "human beings cannot be productive (including create mathematics) without a terminal value," i.e. you cannot derive Ought from Is.

(3) Peterson is not confusing the Copenhagen interpretation with Wheeler's interpretation, but saying he believes Wheeler's interpretation is the most metaphorically true one. Independently of the quantum mechanics, in which I don't think he has a strong side, he's saying something like "conscious attention is so powerful as a tool for thinking that it might as well literally transform reality." Then the quantum mechanics shenanigans are basically...

Death of the Author, but IIRC Scott mentioned that the point of the Kabbalah in Unsong is the exact opposite: you can connect anything to anything if you try hard enough, so the fact that you can is meaningless.

Of course, this shows the exact problem with using fiction as evidence.

alkjash (6y):
Sorry, I didn't mean to imply that Scott believed the thing. What I think is that he has particularly strong subtle-pattern-noticing ability, and this explains both the contents of Unsong and the fact that he's such a great and lucid writer.

This is a sort of Fallacy of Gray. Some connections are much stronger than others, and connections that jump out between core mythological structures that have lasted across thousands of years deserve attention.
habryka (6y):
Yes, but I think Scott is very wary of exactly this ability of his (and other people's) to draw connections between mostly unrelated things, and even if he thinks it's still an important part of rationality, my model of Scott thinks that skill should be used with utmost care, and that its misapplication is the reason for a large part of the weird false things people come to believe.
habryka (6y):
Yeah, that was also my interpretation.
vedrfolnir (6y):
Oh, crypto-Discordianism. I haven't read Unsong, but does the Law of Fives show up anywhere?
mako yass (6y):
1) What mathematics are you referring to? Does Peterson know it? I'd always just assumed that DNA is helical because... it is connected by two strands, and those strands happen to rotate a bit when they connect to each base pair, due to some quirk of chemistry that definitely isn't something you'd ever want to discuss in art unless you knew what DNA was. It's conceivable that some ancient somewhere did somehow anticipate that life would contain strands of codings, but why would they anticipate that every strand would be paired with a mirror?

2) But telos has nothing to do with deities, and the belief/intuition that it does is a really pernicious delusion. What is this supposed to explain or excuse? It's just another insane thing that a person would not think if they'd started from sound premises. I never really doubted that there would be some very understandable, human story behind how Jordan Peterson synthesised his delusions. I am not moved by hearing them.
TurnTrout (6y):
It's also not literally inconceivable that someone in Egypt formulated and technically solved the alignment problem, but I wouldn't put odds on that of more than $1 \times 10^{-7}$. Yes, I am prepared to make a million statements with that confidence and not expect to lose money to the gods of probability.

This seems motte-and-bailey. If that's what he means, shouldn't he just advance "terminal values are necessary to solve Moore's open question"? I feel like throughout the comments defending Peterson, the bottom line has been written first and everything else is being justified post facto.
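To spell out the arithmetic implicit in that calibration claim (the computation below is an illustration added here, not part of the original comment): a million statements, each wrong with probability $10^{-7}$, give

$$10^6 \times 10^{-7} = 0.1$$

expected errors, i.e. under independence roughly a $1 - e^{-0.1} \approx 9.5\%$ chance of even a single miss across the whole million.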
alkjash (6y):
When I say inconceivable I don't mean literally inconceivable. People have done some pretty absurd things in the past. What is your subjective probability that the most prolific mathematician of all time did half of his most productive work after going blind in both eyes?

I can't speak for others, but I have spent hundreds of hours thinking about Peterson's ideas and formulating which parts I agree with and why, including almost every argument that has been put forth so far. Me going back retrospectively and extracting the reasons I made each decision to believe what I believe will look from the outside just like "writing down the bottom line and justifying things post facto." In general, I don't think the bottom-line fallacy is one that you're supposed to use on other people's reasoning.
What is your subjective probability that the most prolific mathematician of all time did half of his most productive work after going blind in both eyes?

That's surprising but not that surprising: Milton wrote much of his best poetry while blind, and Beethoven was famously deaf. Conversely, I cannot think of a single unambiguous example of a mythological motif encoding a non-obvious scientific truth (such as that nothing can go faster than light, or that all species evolved from a single-celled organism, or that the stars are trillions and trillions of miles away), so I think this is very very unlikely.

TurnTrout (6y):
I wasn't saying that unusual things can't happen. I should have made myself clearer: what I was getting at was claims that ancient societies managed to spontaneously derive properties of things they were, in fact, literally incapable of observing. That smells like a second-law-of-thermodynamics-violating information gain to me.

The assertion I'm making is not that Peterson is bad, or that he never has amazing insights, etc. My point is purely with respect to putting him on Eliezer's level of truth-seeking and technical rationality. Having been wrong about things does not forever disbar you from being a beisutsukai master. However, if one is wrong about important things, becomes aware of the methods of rationality (as I imagine he has), thinks carefully, and still retains their implausible beliefs, that should be enough to indicate they aren't yet on the level required.

On the other hand, I notice I am confused and that I am disagreeing with people whom I respect very much. I'm happy to update on any new information, but I have a hard time seeing how I could update very far on this particular claim, given that he is indeed quite religious.
alkjash (6y):
Thank you for being charitable. =)

Regarding the DNA claim: I think what I'm saying is much weaker than what you think I'm saying. E.g. ancient people discovered how to store information sequentially in a book. DNA stores information sequentially. This is not surprising. Why would it be inconceivable that the double helix structure is not just a unique quirk of DNA? My steelman of Peterson's claim about DNA is not that ancient people knew what DNA was or were making any attempt to map it, but that there might be some underlying mathematical reason (such as high compressibility) that makes the double helix structure amenable to information storage and simultaneously makes it a good mythological motif. This seems only 1-in-100 or 1-in-1000 surprising to me.

Here's a bit of what it means to be "real" in Peterson's pragmatic sense, expanding on another comment: Atoms are real. Numbers are real. You might call numbers a "useful metaphor," but numbers are more real than atoms. Part of what I mean by this is: I would be more surprised if the universe didn't obey simple mathematical laws than if it were not made out of atoms. Another part of what I mean is: if I had to choose between knowing about numbers and knowing about atoms, knowing about numbers would be more powerful in guiding me through life. And this is the pragmatic definition of truth.

At some point in the distant past, people believed the imaginary unit $i$ was not a "real" number. At first, it was introduced as a "useful shorthand" for a calculation made purely in the reals. People noticed, for example, that the easiest way to solve a cubic equation like $x^3 = 15x + 4$ was to go through these imaginary numbers, even if the answer you end up with is real. Eventually, the concept of $i$ became so essential and simplified so many other things (e.g. every polynomial has a root) that its existence graduated from "useful metaphor" to "true." It led to ridiculous things like taking complex exponents, but somehow ph...
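For concreteness, here is the classic computation behind that historical note (Bombelli's example, worked out here as an illustration rather than quoted from the thread). Cardano's formula applied to $x^3 = 15x + 4$ gives

$$x = \sqrt[3]{2 + \sqrt{4 - 125}} + \sqrt[3]{2 - \sqrt{4 - 125}} = \sqrt[3]{2 + 11i} + \sqrt[3]{2 - 11i}.$$

Since $(2 + i)^3 = 2 + 11i$, the two cube roots are $2 + i$ and $2 - i$, and the imaginary parts cancel:

$$x = (2 + i) + (2 - i) = 4,$$

a perfectly real root that the radical formula can only reach by passing through imaginary intermediates.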
TurnTrout (6y):
I still don’t quite grasp the DNA point, even after multiple reads - how would compressibility make it show up in mythos? I can’t find any non-reddit / youtube source on his statements (Freedom is keeping a patient eye on my browsing habits, as always). I don’t disagree that mathematical truth is, in a certain sense, "higher" than other truths. I’d just like to point out that if I could consistently steelman Eliezer’s posts, I’d probably be smarter and more rational than he (and no, I cannot do this).
alkjash (6y):
For the DNA point, I'm drawing on some mathematical intuition. Here are two examples:

What if I told you that ancient Egyptian civilizations had depictions of the hyperbolic cosine $\frac{e^x + e^{-x}}{2}$, even though they never came close to discovering the constant $e$? Well, the hyperbolic cosine is also the catenary: the not-quite-parabola shape that any uniformly-weighted chain makes when held from its two ends. So of course this shape was everywhere!

What if I told you that a physicist who had never studied prime number theory discovered the distribution of the zeros of the Riemann zeta function, one that had escaped the attention of number theorists? That is basically how Dyson and Montgomery discovered the connection between random matrix theory and the zeta zeros.

The point is that mathematically interesting structures show up in not-obviously-connected ways. Now, if I could tell you exactly what the structural property of DNA was, then I would actually believe Peterson's claim about it, which I don't. But at least a start on this question is: suppose a thousand genetic life forms evolved independently on a thousand planets. How many mathematically different information storage structures like the double helix would appear? Probably not more than 10, right? Most likely there's something canonically robust and efficient about the way information is packed into DNA molecules.

Re: steelmanning. Really what I'm doing is translating Peterson into language more palatable to rationalists. Perhaps you could call this steelmanning.
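For readers who want the specific coincidence spelled out (the formula below is supplied here as an illustration; it does not appear in the comment): Montgomery conjectured that the pair correlation of the suitably normalized nontrivial zeros of $\zeta$ is

$$1 - \left( \frac{\sin \pi u}{\pi u} \right)^2,$$

which Dyson recognized as exactly the pair-correlation function of the eigenvalues of large random Hermitian matrices from the Gaussian Unitary Ensemble.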
TurnTrout (6y):
I'd agree with this claim, but it feels pretty anthropically true to me: if it weren't the case, we wouldn't be able to exist.

Once understood, chains of reasoning should (ideally) be accepted or rejected regardless of their window dressing. I may be turned off by what he says due to his mannerisms / vocabulary, but once I take the time to really understand what he's claiming... if I still find his argumentation lacking, then rephrasing it in an actually-more-defensible way is steelmanning. I haven't taken that time (and can't really, at the moment), but I suspect if I did, I'd still conclude that there is no steel strong enough to construct a beisutsukai out of someone who believes in God.
alkjash (6y):
The crux of the matter is that he believes in God, then? I'll let him speak for himself, but as far as I can tell he doesn't believe in God by your definition. Furthermore, I've always been an atheist and haven't changed any object-level beliefs on that front since I can remember, but I think that by Peterson's definitions I also believe in God.
dsatan (6y):
The parent commenter is doing some pretty serious cherry-picking. 2) and 3) can basically be ignored: 2) comes from a 2013 deleted tweet which the parent commenter has pulled off of an archive, and 3) is from a 2011 debate which is in any case misrepresented by the parent commenter. He never lays out something that can unambiguously be taken to be quantum mysticism, even though he starts out talking about Copenhagen; "consciousness creates reality" does actually correspond to a reasonable position, which can be found by being a little charitable and spending some time trying to interpret what he says. 1) and 4) depend on his rather complex epistemology: "I really do believe this though it is complicated to explain," he prefaces the DNA comment with. I would be much more concerned if something like 2) were something he repeated all the time rather than promptly deleted, and were central to some of his main theses.
Erfeyah (6y):
I would like to focus on a minor point in your comment. The structure of your sentence implies a worldview in which mystics and religious seekers are the same as pseudoscientists and are obviously 'misled'. Before that, you put the word 'mystic' next to 'crackpot' as if they were the same thing. This is particularly interesting to me because an in-depth rational examination of mystical material, in conjunction with some personal empirical evidence, indicates that mystical experiences exist and have a powerful transformative effect on the human psyche. So when I hear Peterson taking mysticism seriously, I know that I am dealing with a balanced thinker who hasn't rejected this area before taking the necessary time to understand it. There are scientists and pseudo-scientists, religious seekers and pseudo-religious seekers and, maybe, even mystics and pseudo-mystics. I know this is hard to even consider, but how can you rationally assess something without taking the hypothesis seriously?
TurnTrout (6y):
That's a pretty strong claim. Why would your priors favor "the laws of physics allow for mystical experiences" over "I misinterpreted sensory input / that's what my algorithm feels like from the inside, I guess"?
Vaniver (6y):
Why are you contrasting "mystical experiences" and "that's what my algorithm feels like from the inside"? It's like claiming consciousness has to be non-material.
TurnTrout (6y):
I don’t follow. Mystical experience implies ontologically basic elements outside the laws of physics as currently agreed upon. I’m asserting that mystical experiences are best explained as features of our algorithms.
Qiaochu_Yuan (6y):
Why? I don't need to have any particular interpretation of a mystical experience to have a mystical experience. Map-territory errors are common here but they certainly aren't inevitable.
TurnTrout (6y):
I suspect I have a different understanding of "mystical experience" than you do - how would you define it?
Qiaochu_Yuan (6y):
There's a cluster of experiences humans have had throughout history, which they've talked about using words like "seeing God" or "becoming one with the universe" (but again, let's carefully separate the words from a particular interpretation of the words), and that have been traditionally associated with religions, especially with people who start religions. They can be induced in many ways, including but not limited to meditation, drugs, and sex. Fuller description here.
TurnTrout (6y):
Sure, I’d agree that those sensations can be very real. Thanks for the explanation - I had read the term as "mystical experiences and their implied physical interpretations are real".
TAG (6y):
I don't see why. "Oneness with the universe" is a fact implied by physicalism: we are not outside observers. Conscious awareness of oneness with the universe is not implied by physicalism, but that's because nothing about consciousness is implied by physicalism.
dsatan (6y):
When is something a misinterpretation of sensory input? When the interpretation is not rendered in terms of the laws of physics, as your alternative implies, or...? A better hypothesis is: "in a metaphysics which takes Being as primary, which is not in any way contrary to science (since science does not imply a metaphysics like scientific realism or reductive and eliminative materialism), mystical experience is permissible and not contrary to anything we know".
TurnTrout (6y):
That's a long way of saying "a theory with strictly greater complexity and exponentially smaller prior probability than reductionism".
dsatan (6y):
Crushing what I say into some theory of Bayesian epistemology is a great way of destroying the meaning of what I say. But to try to fit it into your theory without losing as much information as your attempt did: humans, by the evolved structure of our brains, especially by the nature of human perception and decision making, have a built-in ontology: the way we cut out things in our perception as things, and the way we see them as being relevant to our involvements in the world. You can't get rid of it; you can only build on top of it. Mistakenly taking reductionistic materialism as your ontology (which is not an action you can take short of completely changing the fundamental structure of your brain) only adds its complexity on top of the ontology that is already there. It's like using a Windows emulator to do everything instead of using the OS the emulator is running on.

If you tried to turn your statement into an actual mathematical statement, and tried to prove it, you would see that there is a large gap between the mathematics and the actual psychology of humans, such as yourself.
TurnTrout (6y):
I wasn't trying to be rude, I just thought you were claiming something else entirely. My apologies. I still don't understand the point you're making with respect to mystical experiences, and I'd like to be sure I understand before giving a response.
Erfeyah (6y):
I can offer a couple of points on why I consider it a subject of great significance.

[1] On a personal level, which you are of course free to disregard as anecdotal, I had such an experience myself. Twice, to be precise. So I know that the source is indeed experiential ("mystical experiences exist"), though I would not yet claim that they necessarily point to an underlying reality. What I would claim is that they certainly need to be explored and not disregarded as a 'misinterpretation of sensory input'. My personal observation is that (when naturally occurring, not chemically induced!) they accompany a psychological breakthrough through an increase in experiential (in contrast to rational) knowledge.

[2] Ancient foundational texts of major civilizations have a mystical basis. Good examples are the Upanishads and the Tao Te Ching, but the same experiences can be found in Hebrew, Christian and Sufi mystics, the Buddha, etc. A look at the evidence will immediately reveal that the experience is common among all these traditions and also seems to have been reached independently. We can then observe that this experience is present in the most ancient layers of our mythological structures. The attempt to abstract the experience into an image can be seen, for example, in symbols such as the Uroboros, which point to the underlying archetype. The Uroboros, Brahman and the Tao are all different formulations of the same underlying concept. If we then take seriously Peterson's hypothesis about the basis of morality in stories, things get really interesting; but I am not going to expand on that point here.

These are by no means the only reasons. Indeed, the above points seem quite minor when viewed through a deeper familiarity with mystical traditions. But we have to start somewhere, I guess.

Jordan Peterson certainly has a strong and appealing idea about what went wrong. But I think Eric Hoffer (a similar character half a century ago) already answered that question pretty much. And when I try to find examples that put their views into contrast, Hoffer wins.

For example, in this video Peterson gives one of his most powerful phrases: "Don't use language instrumentally!" The idea is that, if you allow yourself to twist your words away from what's perfectly truthful, you gradually begin to think like that too, leading straight to the horrors of fascism and communism. It all sounds very convincing.

But then I remember every product manager I've had as a programmer. They were all happy, well-adjusted people who had this incredible skill at using language instrumentally - convincing various decision makers to go along, sometimes not in perfectly honest ways. They all seemed to have learned that skill with their mother's milk, and it hasn't hurt them at all!

Hoffer wouldn't be surprised by that. His only message is that you should have self-esteem, and not join any mass movements to compensate for lack of self-esteem. If you can manage that, it's okay to be a liar or cy...

Jacob Falkovich (6y):
I decided to squeeze my discussion of Hoffer's frustration into a single paragraph that includes a link to an essay about True Believer, so it wouldn't take over a post that was already getting long. If you've read Lou Keep's review on Samzdat, do you think it's worth spending the time to read True Believer itself?

As a big proponent of Horseshoe Theory, I actually find Peterson very disappointing in this regard. He treats the Red mass movement (the alt-right) as misguided souls who just need a little nudge and a discount to selfauthoring.com to become great citizens. But similar young people who joined the Blue mass movement because of a contingency like skin color are, to JBP, evil fanatics in the service of a murderous ideology. Of course, as Hoffer notes, creating a scary boogeyman is a great way to fuel the fire of the worst kinds of mass movements. I think Peterson is making the alt-right worse, not defusing them.

I find self-help Peterson to be useful, Bible-study Peterson to be interesting, but culture-war Peterson is net harmful.
cousin_it (6y):
Yes! Hoffer's book is as clear as humanly possible, while Lou Keep's review is more impressionistic. I think reading second-hand impressions of Hoffer is like reading second-hand impressions of Machiavelli. There's no way they can come close to the real thing.
Jacob Falkovich (6y):
Oh man, I guess this means I have to actually read The Prince now too. I should have known better than to ask!
Chris_Leong (6y):
Do you have any opinions on what comes closest if I just want a quick summary? Unfortunately, I found Samzdat's article hard to follow due to the style.
cousin_it (6y):
The summary on Wikipedia is good.
Chris_Leong (6y):
I suppose that many people are less worried about the alt-right because it is very much a fringe movement, even on the right and even with the Trump presidency. But beyond that, his opinion has probably been shaped by the fact that he has had more success turning people away from the alt-right than away from ideological forms of social justice (he has talked about receiving many letters from people who said that they were drawn towards the alt-right until they started reading his content).

I've had trouble making up my mind about Jordan Peterson, and this post was enormously helpful in clarifying my thinking about him. Also:

A new expansion just came out for the Civilization 6 video game, and instead of playing it I’m nine hours into writing this post and barely halfway done. I hope I’m not the only one getting some meaning out of this thing.

This resulted in me updating heavily on the amount of effort involved in writing great content.

This post took about 13 hours, and I didn't even edit the first draft much. Just imagine how long great content would take!

On the other hand, from a couple of conversations I've had with Scott he seems to write much faster and with almost no editing needed. Something like this might take him 3-4 hours in a single sitting. I've only been writing seriously for a couple of years - maybe writers get faster with time, and maybe Scott is just in a different class in terms of talent.

arundelo (6y):
George H. Smith said something once, maybe in an email discussion group or something. I can't find it now, but it was something along these lines: when he first started writing, he did the standard thing of writing a first draft and then rewriting it. But after spending years writing a large quantity of (short) complete pieces, many of them on a deadline, he got so he could usually just write it right the first time through—the second editing pass was only needed to fix typos.
arundelo (6y):
I think I found what I was thinking of! It wasn't George H. Smith, it was Jeff Riggenbach. Smith published it in a "short-lived online zine" of his and reposted it here. (It's a review of Ayn Rand's The Art of Nonfiction. Be warned that the formatting isn't quite right—block quotes from the book are not formatted differently from the text of the review.) A couple excerpts: [...]

Commenting note: this post is subject to LesserWrong frontpage moderation rules, but I want to offer my own guidelines, in line with Putanumonit policy.

I'm all up for Crocker's Rules - if you want to call me a moron please don't waste space sugarcoating it. Jordan Peterson is probably also beyond caring if you insult him. However, this doesn't extend to anyone else mentioned in the post (like Scott A and like Scott A), or to any other commenters.

With that said - don't be a fool. Make some effort not to confuse my own opinions with ...

Meta-comment for authors: take some time after each post to update on how actually contrarian your positions are. As far as I can tell the response to Jordan Peterson on LessWrong has been uniformly positive.

I sense that there are a lot of reasonable people with good ideas like yourself who feel reluctant to share "controversial" views (on e.g. fuzzy System 1 stuff) because they feel like embattled contrarians. Of course, this is probably correct in whatever other social sphere you get your training data from. However, the whole "please be r...

I've gotten a much more negative reception to fuzzy System 1 stuff at IRL LW meetups than online -- that could be what's going on there.

And it's possible for negative reception to be more psychologically impactful and less visible to outsiders than positive reception. This seems especially likely for culture war-adjacent topics like Jordan Peterson. Even if the reception is broadly positive, there might still be a few people who have very negative reactions.

(This is why I'm reluctant to participate in the public-facing community nowadays -- there were a few people in the rationalist community who had very negative reactions to things I said, and did things like track me down on Facebook and leave me profanity-laden messages, or try to hound me out of all the circles they had access to. With a year or two of hindsight, I can see that those people were a small minority and this wasn't a generally negative reaction. But it sure felt like one at the time.)

I just want to make it clear that sending other users on the page insulting or threatening messages is not cool, and that if anyone else ever experiences that, please reach out to me and I will be able to give you a bit of perspective and potentially take action against the person sending the messages (if they’ve done that repeatedly).

4Charlie Steiner6y
I don't have a positive reaction to Jordan Peterson - I wouldn't call liking him contrarian, but it's at least controversial. To me, he just seems like a self-help media personality shaped by slightly different selection pressure.
1vedrfolnir6y
Jordan Peterson is controversial, but "controversial" is an interesting word. Is Paul Krugman controversial?
"All of those frameworks are fake in the sense that introvert isn’t a basic physical entity the same way an up quark is."

The reductive materialism implicit in this is as fake as introverts - possibly even more fake because unless you have a particle accelerator on hand, "everything is made of quarks" translates 100% to hypotheticals rather than anything you can actually do or see in the world; and in the presence of a particle accelerator, that 100% is reduced by epsilon.

2dsatan6y
I think one of the biggest things Peterson has to offer is a way out of many of the fake frameworks that rationalists hold, by offering a fake framework that takes Being as primary. Actually being able to deal with Being directly (which becomes possible within a fake framework that permits the concept of Being) is a pathway to Looking.
5vedrfolnir6y
What does "deal with Being directly" mean?
I think that Peterson overgeneralizes about gay men (and what about lesbians?), and he’s wrong about the impact of gay marriage on society on the object level. I’m also quite a fan of promiscuity, and I think it’s stupid to oppose a policy just because “neo-Marxists” support it.

Any political issue can be analyzed on the policy level or on the coalition level. Gay marriage seems like an example of an issue that has less to do with policy and more to do with coalitions. If gay marriage were about policy, people would not draw a meaningful distinction betwee...

3gjm6y
In what sense was gay marriage "first championed by [...] Andrew Sullivan"?
5John_Maxwell6y
Wikipedia describes Sullivan's article as "the first major article in the United States advocating for gay people to be given the right to marry". See also Vox, Reason Magazine, and Sullivan himself.
5gjm6y
The things you quote don't claim he was first, they just say he was early (which, indeed, he was; I wasn't disputing that). It does indeed appear that Johann Hari says that in 1989 Sullivan wrote the "first major article" in the US arguing for same-sex marriage. But, for instance, the Wikipedia article about the history of same-sex marriage in the US describes earlier advocacy, and on Wikipedia's timeline of same-sex marriage (which incidentally doesn't mention Sullivan's article) we find that in 1975 some same-sex marriage licences were actually issued in Colorado! (But they got blocked.) Perhaps Sullivan was the first major conservative pundit to argue for same-sex marriage in the US, or something like that. Good for him! But he wasn't the first person to champion it, not by a long way.
6John_Maxwell6y
I acknowledge others were talking about it earlier, but I think "first major conservative pundit" is an understatement. Tyler Cowen called Sullivan the most influential public intellectual of the past 20 years, largely due to the influence he had on gay marriage.
3Vaniver6y
Here's Freedom to Marry's history of the subject. Wolfson in 1983 is definitely earlier.
20gjm6y

Something's gone screwy with the formatting somewhere around the "Untitled" link, as a result of which the entire end of the post and all the comments are in italics. Jacob, perhaps you can fix whatever's broken in the post? LW2 mods, perhaps you can fix whatever's wrong in the rendering code that enables formatting screwups' effects to persist like that?

2gjm6y
Huh, weird. My comment isn't all italicized, even though others before it and after it are. Perhaps because there are actual italics in it? Or maybe it all looks this way only to me and no one else is seeing the spurious italics? [EDITED to add:] I reloaded the page and then my comments were italicized like everyone else's. I seem to recall that there's some quirk of the way formatting is done on LW2 that means that just-posted comments are processed slightly differently from older ones, or something; perhaps that's responsible. [EDITED again to add:] All seems to be fixed now. Thanks to Jacob and/or the site admins.
4habryka6y
Sorry for that! I fixed it for Jacob. One of the draft-js plugins we use appears to have some bugs related to links that are formatted, and sometimes doesn't properly close the HTML styling tag for stuff like italics in the middle of a link. For other authors: if you see some weird formatting in your post, it's probably because you applied some styling in the middle of a link. Removing it for now will fix it; I should really get around to submitting a PR that fixes this systematically.
2gjm6y
... Those two comments of mine have both been downvoted at least once. If there's someone reading this who genuinely thinks that pointing out formatting screwage makes the world worse, I would (absolutely sincerely) love to understand why.
2philh6y
(It's happening again, starting in your "The things you quote" comment.)

Metaphysical truth here describes self-fulfilling truths as described by Abram Demski, whose existence is guaranteed by e.g. Löb's theorem. In other words, metaphysical truths are truths, and rationalists should be aware of them.
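
(A quick unpacking, since the connection may not be obvious; the provability-logic gloss below is my own sketch, not Demski's wording. Löb's theorem says that if a system like Peano arithmetic proves "if P is provable, then P", then it proves P outright. Writing □P for "P is provable":

$$\vdash \Box(\Box P \to P) \to \Box P$$

A statement whose truth follows from its own provability ends up actually provable: the formal skeleton of a self-fulfilling truth.)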

[Note: somewhat taking you up on the Crocker's rules]

Peterson's truth-seeking and data-processing juice is in the super-heavy weight class, comparable to Eliezer etc. Please don't make the mistake of lightly saying he's "wrong on many things".

At the level of analysis in your post and the linked Medium article, I don't think you can safely say Peterson is "technically wrong" about anything; it's overwhelmingly more likely that you just didn't understand what he means. [It's possible to make more case-specific arguments here, but I think the outside-view meta-rationality argument should be enough...]

If you want me to accept JBP as an authority on technical truth (like Eliezer or Scott are), then I would like to actually see some case-specific arguments. Since I found the case-specific arguments to go against Peterson on the issues where I disagree, I'm not really going to change my mind on the basis of just your own authority backing Peterson's authority.

For example: the main proof Peterson cites to show he was right about C-16 being the end of free speech is the Lindsay Shepherd fiasco. Except her case wasn't even in the relevant jurisdiction, which the university itself admitted! The Shepherd case invoked C-16, but no one thinks (anymore) that she was in any way in violation of C-16 or could be punished under it. I'll admit JBP was right if and when Shepherd is dragged to jail by the secret police.

Where I think Peterson goes wrong most often is when he overgeneralizes from the small and biased sample of his own experience. Eating nothing but chicken and greens helped cure Peterson's own rare autoimmune disease, so now everyone should stop eating carbs forever. He almost never qualifies his opinions or the advice he gives, or specifies that it only applie...

6SquirrelInHell6y
[Please delete this thread if you think this is getting out of hand. Because it might :)]

See, right here you haven't listened. What I'm saying is that there is some fairly objective quality, which I called "truth-seeking juice", about people like Peterson, Eliezer and Scott, and which you can evaluate by yourself. But you have just dug yourself into the same trap a little bit more. From what you write, your heuristics for evaluating sources seem to be a combination of authority and fact-checking isolated pieces (regardless of how much you understand the whole picture). Those are really bad heuristics! The only reason why Eliezer and Scott seem trustworthy to you is that their big picture is similar to your default, so what they say is automatically parsed as true/sensible. They make tons of mistakes and might fairly be called "technically wrong on many things". And yet you don't care, because when you feel their big picture is right, those mistakes feel to you like not-really-mistakes.

Here's an example of someone who doesn't automatically get Eliezer's big picture, and thinks very sensibly from their own perspective: David Chapman. What if you go read that, and try to mentally swap places? The degree to which Chapman doesn't get Eliezer's big picture is probably similar to the degree to which you don't get Peterson's big picture, with similar results.

I'm worried we may be falling into an argument about definitions, which seems to happen a lot around JBP. Let me try to sharpen some distinctions.

In your quote, Chapman disagrees with Eliezer about his general approach, or perhaps about what Eliezer finds meaningful, but not about matters of fact. I disagree with JBP about matters of fact.

My best guess at what "truth-seeking juice" means comes in two parts: a desire to find the truth, and a methodology for doing so. All three of Eliezer/Scott/JBP have the first part down, but their methodologies are very different. Eliezer's strength is overcoming bias, Scott's is integrating scientific evidence, and I believe they're both very good at this because I've seen them do it a lot and be wrong about facts very, very rarely. In this post I actually disagree with Eliezer about a matter of fact (how many people before modernity were Biblical literalists), and I do so with some trepidation.

JBP's methodology is optimized for finding his own truth, the metaphorical kind. Just as Scott has a track record of being right in science debates, JBP has a track record of all his ideas fitting into a coherent and inspirational worldview: his big picture. When I say he's wrong, I don't mean his big picture is bad. I mean he's wrong about facts, and that the Peterson mask is dangerous when one needs to get the facts right.

I notice that my default sense is that Jacob is making a reasonable ask here, but also that Squirrel seems to be trying to do something similar to what I just felt compelled to do on a different thread, so I feel obliged to lean into it a bit.

I'm not sure...

a) how to handle this sort of disagreeing on vantage points, where it's hard to disentangle 'person has an important frame that you're not seeing that is worth at least having the ability to step inside' vs 'person is just wrong' and 'person is trying to help you step inside a frame' vs 'person is making an opaque-and-wrong appeal to authority' (or various shades of similar issues).

or, on the meta level:

b) what reasonable norms/expectations on LessWrong for handling that sort of thing are. Err on one side and a lot of people miss important things, err on another side and people waste a lot of time on views that maybe have interesting frames but... just aren't very good. (I like that Jacob set pretty good discussion norms on this thread but this is a thing I'm thinking a lot about right now in the general case).

As of now I have not read anything about Peterson besid...

6Jacob Falkovich6y
Ray, are you 100% sure that's what is actually going on? Let's introduce some notation, following the OP: there are (at least) two relevant frameworks of truth, the technical, which we'll denote T, and the metaphorical, M. In this community we should be able to agree on what T is, and I may or may not be confused about what M is and how it relates to T. I wrote this post specifically to talk about M, but I don't think that's where Squirrel and I are in disagreement. My post explicitly said that I think Peterson is M.right even though he's T.wrong-on-many-things. Squirrel didn't say they (he? she? ze?) "got some value" out of Peterson in the M-framework. They explicitly said that he's not wrong-on-many-things in the T framework, the same way Eliezer is T.correct.

Well, Eliezer told me how to assess whether someone is T.correct: I look at the evidence in the object-level claims. If someone thinks I'm doing T wrong and misapplying rationality, I'm going to need specifics. Ditto if someone thinks that Eliezer is also T.wrong-on-many-things and I don't notice that because I'm deluding myself. So far, I'm the only one who has come up with an example of where I think that Eliezer is T.wrong.

My point when talking about Squirrel's authority isn't to belittle them, but to say that changing my mind would require a bit more effort, if anyone feels up to it. It should be obvious that my own framework is such that saying "truth juice" is unlikely to move me. I want to be moved! I've been spelling out the details not because I want to fight over C-16 or low-carb breakfasts, but to make it easier for people who want to convince me or change my framework to see where the handles are. And I've tried to introduce specific language so we don't talk past each other (Rule 10: be precise in your speech). Of course, that doesn't make me entitled to people's efforts. If you have something more fun to do on a Sunday, no hard feelings :)
9Raemon6y
Nope! (It was my best guess, which is why I used some words like "seems" and "I think that Squirrel is saying".) But it sounds from the other comment like I got it about right. I agree that persuading someone to step harder into a frame requires a fair bit more effort than what Squirrel has done so far. (So far I've never seen anyone convince someone of this sort of thing in one sitting; it always seems to require direct chains of trust, often over years. But I think the art of talking about this usefully has a lot of room for progress.)
6SquirrelInHell6y
Frustrating, that's not what I said! Rule 10: be precise in your speech. Rule 10b: be precise in your reading and listening :P My wording was quite purposeful: I think Raemon read my comments the way I intended them. I hoped to push on a frame that people seem to be (according to my private, unjustified, wanton opinion) obviously too stuck in. See also my reply below. I'm sorry if my phrasing seemed conflict-y to you. I think the fact that Eliezer has high status in the community and Peterson has low status is making people stupid about this issue, and this makes me write in a certain style, in which I sort of intend to push on status, because that's what I think is actually stopping people from thinking here.
6Jacob Falkovich6y
Your reply below says that LW-style fact-checking mostly catches Peterson on things he never cared about in the first place. What exactly did you think I meant when I said he's "technically wrong about many things" and you told me to be careful? I meant something very close to what your quote says, I don't even know if we're disagreeing about anything. And by the way, there is plenty of room for disagreement. alkjash just wrote what I thought you were going to: a detailed point-by-point argument for why Peterson isn't, in fact, wrong. There's a big difference between alkjash's "Peterson doesn't say what you think he says" and "Peterson says what you think and he's wrong, but it's not important to the big picture". If Peterson really says "humans can't do math without terminal values", that's a very interesting statement, certainly not one that I can judge as obviously wrong.
2SquirrelInHell6y
I did in fact have something between those two in mind, and was even ready to defend it, but then I basically remembered that LW is status-crazy and gave up on fighting that uphill battle. Kudos to alkjash for the fighting spirit.
6gjm6y
I think you should consider the possibility that the not-very-positive reaction your comments about Peterson here have received may have a cause other than status-fighting. (LW is one of the less status-crazy places I'm familiar with. The complaints about Peterson in this discussion do not look to me as if they are primarily motivated by status concerns. Some of your comments about him seem needlessly status-defensive, though.)
3Jacob Falkovich6y
Not to sound glib, but what good is LW status if you don't use it to freely express your opinions and engage in discussion on LW? The same is true of other things: blog/Twitter followers, Facebook likes etc. are important inasmuch as they give me the ability to spread my message to more people. If I never said anything controversial for fear of losing measurable status, I would be forgoing all the benefits of acquiring it in the first place.
1vedrfolnir6y
Getting laid, for one thing. And, you know, LW is a social group. Status is its own reward. High-status people probably feel better about themselves than low-status people do, and an increase in status will probably make people feel better about themselves than they used to. Eric Hoffer was a longshoreman who just happened to write wildly popular philosophy books, but I think he'd agree that that's not terribly usual.
1Jacob Falkovich6y
Yeah, I thought it could be something like that. I don't live in Berkeley, and no woman who has ever slept with me cared one jot about my LW karma. With that said, the kind of status that can be gained or lost by debating the technical correctness of claims JBP makes with someone you don't know personally seems too far removed from anyone's actual social life to have an impact on getting laid one way or another.

Perhaps you can explain what Peterson really means when he says that he really believes that the double helix structure of DNA is depicted in ancient Egyptian and Chinese art.

What does he really mean when he says, "Proof itself, of any sort, is impossible, without an axiom (as Godel proved). Thus faith in God is a prerequisite for all proof."?

Why does he seem to believe in Jung's paranormal concept of "synchronicity"?

Why does he think quantum mechanics means consciousness creates reality, and confuse the Copenhagen interpretation with Wheeler's participatory anthropic principle?

Peterson gets many things wrong - not just technically wrong, but deeply wrong, wrong on the level of "ancient aliens built the pyramids". He's far too willing to indulge in mysticism, and has a fundamental lack of skepticism or anything approaching appropriate rigor when it comes to certain pet ideas.

He isn't an intellectual super-heavyweight; he's Deepak Chopra for people who know how to code. We can do better.

5John_Maxwell6y
Rationalists have also been known to talk about some kooky-sounding stuff. Here's Val from CFAR describing something that sounds a lot like Peterson's "synchronicity":
6habryka6y
I would guess that the same people who objected to those paragraphs also object to similar paragraphs by Peterson (at least I object to both on similar grounds).
2SquirrelInHell6y
Cool examples, thanks! Yeah, these are issues outside of his cognitive expertise, and it's quite clear that he's getting them wrong. Note that I never said that Peterson isn't making mistakes (I'm quite careful with my wording!). I said that his truth-seeking power is in the same weight class, but obviously he has a different kind of power than the LW style. E.g. he's less able to deal with cognitive bias. But if you are doing "fact-checking" in LW style, you are mostly accusing him of getting things wrong that he never cared about in the first place. Like when Eliezer uses phlogiston as an example in the Sequences and gets the historical facts wrong. But that doesn't make Eliezer wrong in any meaningful sense, because that's not what he was talking about. There's some basic courtesy in listening to someone's message, not their words.
8Gaius Leviathan XV6y
Sorry, but I think that is a lame response. It really, really isn't just lack of expertise; it's a matter of Peterson's abandonment of skepticism and scholarly integrity. You don't need to be a historian to tell that the ancient Egyptians didn't know about the structure of DNA. You don't need to be a statistician to know that coincidences don't disprove scientific materialism. Peterson is a PhD who knows from experience the level of due diligence needed to publish in peer-reviewed journals. He knows better but did it anyway. He cares enough to tell his students, explicitly, that he "really does believe" that ancient art depicts DNA - repeatedly! - and put it in public YouTube videos with his real name and face. It's more like if Eliezer used the "ancient aliens built the pyramids" theory as an example in one of the sequences in a way that made it clear that he really does believe aliens built the pyramids. It's stupid to believe it in the first place, and it's stupid to use it as an example.

Then what makes Peterson so special? Why should I pay more attention to him than, say, Deepak Chopra? Or an Islamist cleric? Or a postmodernist gender studies professor who thinks Western science is just a tool of patriarchal oppression? Might they also have messages that are "metaphorically true" even though their words are actually bunk? If Peterson gets the benefit of the doubt when he says stupid things, why shouldn't everybody else? If one uses enough mental gymnastics, almost anything can be made "metaphorically true".

Peterson's fans are too emotionally invested in him to really consider what he's saying rationally, akin to religious believers. Yes, he gives his audience motivation and meaning, much in the same way religion does for other demographics, but that can be a very powerful emotional blinder. If you really think that something gives your life meaning and motivation, you'll overlook its flaws, even when it means weakening your epistemology.
4SquirrelInHell6y
This is what the whole discussion is about. You are setting boundaries that are convenient for you, and refusing to think further. But some people in the reference class you are now denigrating as a whole are different from others. Some actually know their stuff and are not charlatans. Throwing a tantrum about it doesn't change that.
5mako yass6y
Then what the heck do you mean by "equal in truth-seeking ability"?
9gjm6y
(I upvoted that comment, but:) Truth-seeking is more than avoiding bias, just as typing is more than not hitting the wrong keys and drawing is more than not making your lines crooked when you want them straight. Someone might have deep insight into human nature; or outstanding skill in finding mathematical proofs; or a mind exceptionally fertile in generating new ideas, some of which turn out to be right; or an encyclopaedic knowledge of certain fields. Any of those would enhance their truth-seeking ability considerably. If they happen not to be particularly good at avoiding bias, that will worsen their truth-seeking ability. But they might still be better overall than someone with exceptional ability to avoid bias but without their particular skills.