Hello! I am totally new here, so please bear that in mind in the event I commit a faux pas! I'm a writer who has written a LOT about rationality and when rationality eludes us. These are books like So You've Been Publicly Shamed, Them, The Psychopath Test and The Men Who Stare At Goats, among others.

I don't know if this question has been asked elsewhere, but I'd love to know: has learning to be more rational impacted your everyday lives, in small or (perhaps more interestingly) BIG ways? Have there been occasions when you've put these skills (perhaps learned at a workshop) into practice in your domestic or work lives and the ripples have been surprising/consequential in positive or negative ways?

I'm thinking of writing about what happens when a person learns to be more rational, hence my question.

Hope this is the start of a wonderful thread!

Jon Ronson

7 Answers

Zack_M_Davis

430

I do not recommend paying attention to the forum or "the community" as it exists today.

Instead, read the Sequences! (That is, the two-plus years of almost-daily blogging by Eliezer Yudkowsky, around which this forum and "the community" coalesced back in 'aught-seven to 'aught-nine.) Reading and understanding the core Sequences is genuinely life-changing on account of teaching you, not just to aspire to be "reasonable" as your culture teaches it, but how intelligence works on a conceptual level: how well-designed agents can use cause-and-effect entanglements to correlate their internal state with the outside world to build "maps that reflect the territory"—and then use those maps to compute plans that achieve their goals.

Again, read the Sequences! You won't regret it!

Ben Pace

400

I have some difficulty distinguishing personal growth I've experienced due to the culture on LessWrong from growth due to other parts of society, culture, and myself. But here are some things that feel substantially downstream of interacting with the ideas and culture in this small intellectual community.

(I imagine others will give very different answers.)

Help me focus more on what I care about, and less on what people and society expect of me.

  • I'm a better classical musician. These days I'm better able to do deliberate practice on the parts of the music I need to improve. To give a dumb/oversimplified quantitative measure, I'm able to learn and memorise pieces of music maybe 5-10x more efficiently. When I was at music school as a teenager, there were pieces of music I liked that I didn't finish memorising for years, because when I was in the practice room I was 'going through the motions' of practising far more than 'actually trying to get better according to my own taste'. In the past weeks and months I've picked up a dozen or so pieces by Bach and others in maybe 5-10 hours of playing each, have memorised them all, and can play them more musically and emotionally than before.
  • I did weird things during my undergrad at Oxford that were better for my career than being 'a good student'. The university wanted me to care about things like academic prestige and grades in all of my classes, but I realised that I wasn't very interested in the goals it had for me. The academic setting rarely encouraged genuine curiosity about math and science, and felt fairly suffocating. I focused instead on finding interesting people and working on side-projects I was excited about, and ended up doing things that were, in retrospect, far more valuable for my career than getting good grades.

Help me think about modern technology clearly and practically.

  • Quit social media. Writings and actions by people in the LessWrong intellectual community have helped me think about how to interact with social media, more than any other public dialogue on the subject. Zvi (author of many LessWrong sequences) ran some very simple experiments on the Facebook newsfeed and wrote about his experiences with it, in a way that helped me think of Facebook as an actively adversarial force, optimised to get me hooked, and fundamentally not something we can build a healthy community on. I found his two simple experiments more informative than anything I've seen come out of academia on the subject. The fact that he quit Facebook cold turkey, with no exceptions, and that a few more friends did too, has caused me to move off it as well. I now look at social media only in a two-hour period on Saturdays, don't write or react on any of it, and think this has been very healthy.
  • Using Google Docs for meetings. Generally this community has helped me use modern technology to think better. One user wrote this post about social modelling, which advised using Google Docs to have meetings. At work I now regularly have the meeting conversation primarily inside a Google Doc, where the 3-5 people in the meeting can have many parallel conversations at once. I've personally found this really valuable, both in allowing us to use the time more effectively (rather than one person talking at a time, five of us can be writing in different parts of the document simultaneously) and in producing a record of our thought processes, reasoning and decisions, for us to share with others and reflect on months and years down the line.

Help me figure out who I am and build my life.

  • Take bets. I take bets on things regularly. That's a virtue and something respected in this intellectual community. I get to find out I'm wrong / prove other people wrong, and generally move conversations forward.
  • Avoid politics. Overall I think that I've successfully avoided getting involved in politics or building a political identity throughout my teenage and university years, and instead focused on learning what real knowledge is in areas of science and other practical matters, which I think has been healthy. I have a sense that this means that when I eventually have to build up my understanding of more political domains, I'll be able to keep a better sense of what is true and what is convenient narrative. This is partly due to other factors (e.g. personal taste), but has been aided by the LessWrongian dislike of politics.
  • Learn to trust better. Something about the intellectual honesty and rigour of the LessWrong intellectual community has helped me learn to 'kill your darlings': just because I respect someone doesn't mean they're infallible. The correct answer isn't to always trust someone, or to never trust them, but to build up an understanding of when they are trustworthy and when they aren't. (This post states that idea fairly clearly, in a way I found helpful.)
  • Lots of other practices. I learned to think carefully from reading some of the fiction and stories written by LessWrongers. A common example is the Harry Potter fanfiction "Harry Potter and the Methods of Rationality", which conveys a lot of the experience of someone who lives up to the many virtues we care about on LessWrong. There is a lot I could write about: empirically testing your beliefs (including your political beliefs), being curious about how the world works, taking responsibility, and thinking for yourself. I have more I could say here, but it would take me a while to say it while avoiding spoilers. Nonetheless, it's had a substantial effect on how I live and work and collaborate with other people.
    • Other people write great things too; I won't try to find all of them. This recent one by Scott Alexander is something I think about a fair amount.

I guess there are a ton of things; the above are just a few examples that occurred to me in the ~30 minutes I spent writing this answer.

By the way, while we care about understanding the human mind in a very concrete way on LessWrong, we are more often focused on an academic pursuit of knowledge. We recently did a community vote on the best posts from 2018. If you look at the top 10-20 or so posts, as well as a bunch of niche posts about machine learning and AI, you'll see the sort of discussion we tend to have best on LessWrong. I don't come here to get 'life-improvements' or 'self-help'; I come here much more to be part of a small intellectual community that's very curious about human rationality.

I wanted to follow up on this a bit.

TLDR: While LessWrong readers do care a lot about self-improvement, if somewhat tangentially, reading forums alone likely won't have a big effect on life success. But that's not really that relevant; the most relevant thing to look at is how much progress the community has made on the technical mathematical and philosophical questions it has focused on most. Unfortunately, that discussion is very hard to have without spending a lot of time doing actual maths and philosophy (though if you wanted to do that, I'm sure there are people who would be really happy to discuss those things).

___

If what you wanted to achieve was life-improvements, reading a forum seems like a confusing approach.

Things that I expect to work better are:

  • personally tailored 1-on-1 advice (e.g. seein
...

Thanks so much for this - reading now...

gjm

280

[Edit: Turned into an answer after the OP author's reply.]

This isn't exactly an answer to your question, but here's a post from Scott Alexander in 2013 about progress LW had made in the last five years. So it doesn't have the element of personal application that you're after, but it does offer an answer of sorts to the related question "what has LW produced that is of any value?". I have a feeling there's at least one other thing on Scott's blog with that sort of flavour.

Also from Scott (from 2007) and pointing rather in the opposite direction: "Extreme Rationality: it's not that great", whose thesis is that LW-style rationality doesn't bring huge increases in personal effectiveness beyond being kinda-sorta-rational. (But the term "LW-style rationality" there is anachronistic; that post was written before Less Wrong as such was a thing.)

A counterpoint from many years later: "Is rationalist self-improvement real?", suggesting that at least for some people LW-style rationality does bring huge personal benefits, but only after you work at it for a while. I think Scott would actually agree with this.

(None of these things is exactly an answer to your question, which is why this is a comment rather than an answer, but I think all of them might be relevant.)

Those Scott Alexander links are fascinating and just what I was hoping for. Thank you for posting them...

gjm

In case it isn't clear: the first two are both Scott; the third is a chap called Jacob Falkovich. The thing I linked to is a crosspost here of a post from his own blog. I think Jacob also has at least one other post on the theme of "what has rationality ever done for us?" Maybe I'm thinking of this one.

Also possibly worth a look, if at some point you're in critical mood: Yes, we have noticed the skulls. That one's Scott again, as so many of the best things are :-).

130

Somewhat prosaically: thinking carefully about the implications of questions about the history of life and astrobiology that come up in the comments section here has led directly, through long and winding paths, to two publications I am currently in the process of writing up in my scientific career.

Additionally, thinking carefully about such questions here reinforced my impression that such problems are important and was one of several things that kicked me into direct contact with and work within actual astrobiology academia.

DanielFilan

80

I'm not quite sure I'd attribute the whole effect to 'being more rational', but I think exposure to LessWrong explains my choice to study artificial intelligence, and specifically how to ensure that future AI systems are highly reliable, to live in a group house with other LessWrong diaspora members, to sign up for cryonics, to be vegan, to donate 10% of my income to weird charities, and more mundanely to happen to be wearing a LessWrong shirt at the moment.

Seconding career choices, cryonics, and donating money. I became vegan after my exposure to LW, but I'm not sure if the effect was strong. Exposure to LessWrong has also given me a working model of how to do the thinking thing better. In particular, I am now much, much better at noticing confusion.

Raghu Veer S

30

It has definitely taught me some epistemic humility. Especially after reading content by people like Eliezer, Gwern, and Scott, I realized how much introspection I had to do to come to terms with the knowledge deficit I had. I always had an emotional alignment with their content, but the fact that these people can think the way they do, using the same set of tools that I have, has made me less envious and more curious in general.

7 comments
gjm

Welcome! I've greatly enjoyed some of your books. (I don't mean that the others were bad, I mean I haven't read them.)

A repeated pattern in your books is this: you identify a group of interestingly strange people, spend some time among them, and then write up your experiences in a way that invites your readers to laugh (gently and with a little bit of sympathy) at them. Is it at all possible that part of your purpose in coming here is to collect material that will help internet-rationalists join the club whose existing members include conspiracy theorists, goat-starers, and psychopaths?

Ha! Ok, there are two things I'd like to say to this appropriately wary comment! First, a lot of my work ISN'T about gently laughing at people - most notably and recently So You've Been Publicly Shamed and my two recent podcasts - The Last Days of August and The Butterfly Effect. They're all very much about empathy. If you want me to provide links please say so. Second, perhaps where this idea differs from some of my earlier stories (like The Men Who Stare At Goats) is that I've spent my whole working life as a rationalist. Like the people on this forum, I've been guided by it for decades... So it feels very personal...

gjm

Noted! Also noted, at the risk of passing from "appropriately wary" to "inappropriately wary": you didn't actually say that you're not planning to write a book that presents lesswrongers as weirdos to point and smile at. E.g., what you say is entirely compatible with something that begins "I've thought of myself as a rationalist all my life. Recently I discovered an interesting group of people on the internet who also call themselves rationalists. Join me as we take a journey down the rabbit-hole of how 'rationality' can lead to freezing your head, reading Harry Potter fanfiction, and running away from imaginary future basilisks."

Again, maybe I've now passed from "appropriately wary" to "inappropriately wary". But journalistic interest in the LW community in the past has usually consisted of finding some things that can be presented in a way that sounds weird and then presenting them in a way that sounds weird, and the Richelieu principle[1] means that this is pretty easy to do. I'd love to believe that This Time Is Different; maybe it is. But it doesn't feel like a safe bet.

(I should maybe add that I expect a Jon Ronson book on Those Weird Internet Rationalists would be a lot of fun to read. But of course that's the problem!)

[1] "Give me six lines written by the most honest of men, and I will find something in them with which to hang him." Probably not actually said by Richlieu. More generally: if you take a person or, still more, a whole community, and look for any particular thing -- weirdness, generosity, dishonesty, creepiness, brilliance, stupidity -- in what they've said or written, it will probably not be difficult to find it, regardless of the actual nature of the person or community.

But journalistic interest in the LW community in the past has usually consisted of finding some things that can be presented in a way that sounds weird and then presenting them in a way that sounds weird

Tho there are exceptions worth applauding.

I read the NYT piece about the workshop yesterday, so I understand what you're saying. But I should add that I'm less interested in community dynamics than I am in what happens when a person actively attempts to be more rational. So it's the implementing of the rules that interests me the most... And the ripples that may ensue....

Related: Brienne wrote a really interesting comment about this broader dynamic in journalism and popular writing, about what stories are available for a writer to tell.

Maybe this is just being cute, but I often think of it the other way: if I hadn't been so in need of Less Wrong, Less Wrong wouldn't exist! Any effect it has back on me is just cake.

(This is literally true to the extent that I was among the early existential risk community members who were so confused it drove Eliezer to create what would become LW.)