I recently had the privilege of being a CFAR alumna volunteering at a later workshop, which is a fascinating thing to do, and one that put me in a position both to evaluate how much of a difference the first workshop actually made in my life, and to see how the workshops themselves have evolved.

Exactly a year ago, I attended one of the first workshops, back when they were still inexplicably called “minicamps”. I wasn't sure what to expect, and I especially wasn't sure why I had been accepted. But I bravely bullied the nursing faculty staff until they reluctantly let me switch a day of clinical around, and later stumbled off my plane into the San Francisco airport in a haze of exhaustion. The workshop spat me out three days later, twice as exhausted, with teetering piles of ideas and very little time or energy to apply them. I left with a list of annual goals, which I had never bothered to have before, and a feeling that more was possible–this included the feeling that more would have been possible if the workshop had been longer and less chaotic, if I had slept more the week before, if I hadn't had to rush out on Sunday evening to catch a plane and miss the social. 

Like I frequently do on Less Wrong the website, I left the minicamp feeling a bit like an outsider, but also a bit like I had come home. As well as my written goals, I made an unwritten pre-commitment to come back to San Francisco later, for longer, and see whether I could make the "more is possible" in my head more specific. Of the thirteen goals on my written list, I fully accomplished only four and partially accomplished five, but I did make it back to San Francisco, at the opportunity cost of four weeks of sacrificed hospital shifts.

A week or so into my stay, while I shifted around between different rationalist shared houses and attempted to max out interesting-conversations-per-day, I found out that CFAR was holding another May workshop. I offered to volunteer, proved my sincerity by spending 6 hours printing and sticking nametags, and lived on site for another 4-day weekend of delightful information overload and limited sleep.

Before the May 2012 workshop, I had a low prior that any four-day workshop could be life-changing in a major way. A four-year nursing degree, okay–I've successfully retrained my social skills and my ability to react under pressure by putting myself in particular situations over and over and over and over again. Four days? Nah. Brains don't work that way. 

In my experience, it's exceedingly hard for the human brain to do anything deliberately. In Kahneman-speak, habits are System 1, effortless and automatic. Doing things on purpose involves System 2, effortful and a bit aversive. I could have had a much better experience in my final intensive care clinical if I'd thought to open up my workshop notes and try to address the causes of aversions, or use offline time to train habits, or, y'know, do anything on purpose instead of floundering around trying things at random until they worked.

(Then again, I didn't apply concepts like System 1 and System 2 to myself a year ago. I read 'Thinking, Fast and Slow' by Kahneman and 'Rationality and the Reflective Mind' by Stanovich as part of my minicamp goal 'read 12 hard nonfiction books this year', most of which came from the CFAR recommended reading list. If my preceptor had had any idea what I was saying when I explained to her that she was running particular nursing skills on System 1, because they were ingrained on the level of habit, while I was running the same tasks on System 2 in working memory because they were new and confusing to me, and that this was why I appeared to have poor time management (System 2 takes forever to do anything), the terminology might have helped. Oh, for the world where everyone knows all jargon!)

...And here I am, setting aside a month of my life to think only about rationality. I can't imagine that my counterfactual self-who-didn't-attend-in-May-2012 would be here. I can't imagine that being here now will have zero effect on what I'm doing in a year, or ten years. Bingo. I did one thing deliberately!

So what was the May 2013 workshop actually like?

The curriculum has shifted around a lot in the past year, and I think with 95% probability that it's now more concretely useful. (Speaking of probabilities, the prediction markets during the workshop seemed to flow better and be more fun and interesting this time, although this may just show that I used to be more averse to games in general and betting in particular. In that case, yay for partly-cured aversions!)

The classes are grouped in an order that allows them to build on each other usefully, and they've been honed by practice into forms that successfully teach skills, instead of just putting words in the air and on flipcharts. For example, having a personal productivity system like GTD came across as a culturally prestigious thing at the last workshop, but there wasn't a lot of useful curriculum on it. Of course, I left on this trip wanting to spend my offline month creating a GTD system better than paper to-do lists taped to walls, so I have both motivation and a low threshold for improvement.

There are also some completely new classes, including "Againstness training" by Valentine, which seems to relate to some of the 'reacting under pressure' stuff in interesting ways, and gave me vocabulary and techniques for something I've been doing inefficiently by trial and error for a good part of my life.

In general, there are more classes about emotions, both how to deal with them when they're in the way and how to use them when they're the best tool available. Given that none of us are Spock, I think this is useful. 

Rejection therapy has morphed into a less terrifying and more helpful form with the awesome name of CoZE (Comfort Zone Expansion). I didn't personally find the original rejection therapy all that awful, but some people did, and that problem is largely solved. 

The workshops are vastly more orderly and organized. (I like to think I contributed to this slightly with my volunteer skills of keeping the fridge stocked with water bottles and calling restaurants to confirm orders and make sure food arrived on time.) Classes began and ended on time. The venue stayed tidy. The food was excellent. It was easier to get enough sleep. Etc. The May 2012 venue had a pool, and this one didn't, which made exercise harder for addicts like me. CFAR staff are talking about solving this. 

The workshops still aren't an easy environment for introverts. The negative parts of my experience in May 2012 were mostly because of this. It was easier this time, because as a volunteer I could skip classes if I started to feel socially overloaded, but periods of quiet alone time had to be effortfully carved out of the day, at the opportunity cost of missing interesting conversations. I'm not sure this problem is solvable without either making the workshops longer, to space the material out (and thus less accessible for people with jobs), or cutting out curriculum. Either would impose a cost on the extroverts who don't want an hour at lunch to meditate or go running alone or read a sci-fi book, etc.

In general, I found the May 2012 workshop too short and intense–we had material thrown at us at a rate far exceeding the usual human idea-digestion rate. Keeping in touch via Skype chats with other participants helped. CFAR now does official followups with participants for six weeks following the workshop. 

Meeting the other participants was, as usual, the best part of the weekend. The group was quite diverse, although I was still the only health care professional there. (Whyyy???? The health care system needs more rationality so badly!) The conversations were engaging. Many of the participants seem eager to stay in touch. The May 2012 workshop has a total of six people still on the Skype chats list, which is a 75% attrition rate. CFAR is now working on strategies to help people who want to stay in touch do it successfully. 


I thought the May 2012 workshop was awesome. I thought the May 2013 workshop was about an order of magnitude more awesome. I would say that now is a great time to attend a CFAR workshop...except that the organization is financially stable and likely to still be around in a year and producing even better workshops. So I'm not sure. Then again, rationality skills have compound interest–the value of learning some new skills now, even if they amount more to vocab words and mental labels than superpowers, compounds over the year that you spend seeing all the books you read and all the opportunities you have in that framework. I'm glad I went a year ago instead of this May. I'm even more glad I had the opportunity to see the new classes and meet the new participants a year later. 

102 comments

Classes began and ended on time.

+1 that this made a big difference as a participant. (I was at May 2012 with you, and then March 2013, when they had figured out the timing.)

CFAR is now working on strategies to help people who want to stay in touch do it successfully.

Has there been any talk yet about trying to do Skype chats between graduates of different workshops? It seems like a decent way to magnify the networking effects, as well as making the lists seem healthier (even if you have only 25% from each workshop going on the long-term Skype chat list, that'll still look like a pool of 30 people after 5 workshops instead of 6 people after 1 workshop).
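The arithmetic behind that pooling suggestion can be sketched quickly. The cohort size of 24 below is my own assumption, chosen to match the figures quoted in the comment:

```python
# Back-of-the-envelope version of the pooled-chat argument.
# The per-workshop cohort size is a hypothetical figure, not CFAR data.
participants_per_workshop = 24
long_term_retention = 0.25     # fraction still on the chat list long-term

def pooled_chat_size(workshops):
    """Expected active members if graduates of all workshops share one chat."""
    return round(workshops * participants_per_workshop * long_term_retention)

print(pooled_chat_size(1))   # 6  -- the "6 people after 1 workshop" figure
print(pooled_chat_size(5))   # 30 -- the "30 people after 5 workshops" figure
```

The point is that the same retention rate looks much healthier when the denominator pools across cohorts.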


I no longer trust, use, or recommend Skype. There's too much evidence that Skype's new owner, Microsoft, is monitoring conversations. If they aren't handing them over to governments today, it seems like only a matter of time before they do. My security-conscious Fortune 500 employer long ago banned Skype for reasons of security.

I would welcome suggestions of more secure alternatives, particularly any that are equally easy to use across platforms and implement good end-to-end client-side encryption, so that snooping on message contents is mathematically infeasible without compromising one end of the communication. That is, no one in the middle should have the keys. If this alternative system also protects end users' locations from snoopers, so much the better.
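The "no one in the middle should have the keys" property is roughly what a Diffie-Hellman key agreement provides: each side publishes only a blinded value, and the shared key is computed locally at each end. A toy sketch, with deliberately undersized parameters (real protocols use standardized large groups and must also authenticate the exchange to stop active tampering):

```python
import secrets

# Toy Diffie-Hellman key agreement. A relay in the middle only ever sees
# the public values A and B, so it cannot derive the shared key.
# Illustrative parameters only -- far too small for real security.
p = 2**127 - 1   # a Mersenne prime, used here purely for demonstration
g = 3

a = secrets.randbelow(p - 2) + 2   # Alice's secret exponent, never transmitted
b = secrets.randbelow(p - 2) + 2   # Bob's secret exponent, never transmitted
A = pow(g, a, p)                   # public value Alice sends over the wire
B = pow(g, b, p)                   # public value Bob sends over the wire

key_alice = pow(B, a, p)           # computed locally by Alice
key_bob = pow(A, b, p)             # computed locally by Bob
assert key_alice == key_bob        # both ends derive the same key
```

Systems like OTR build on exactly this kind of exchange; the design choice that matters for the commenter's threat model is that the server relays only public values.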


There's too much evidence that Skype's new owner, Microsoft, is monitoring conversations.

And doing what exactly? Many different forms of communications can be monitored, tapped, or recorded. And sometimes you might have to worry about this... but first, you should ask who is doing it, and what they would care about.

If it's Microsoft and/or governments, to be honest I feel relatively safe in most of my conversations, and I don't see why CFAR graduates wouldn't feel the same. People planning terrorist attacks would probably have a different opinion.

There are two kinds of snooping that I would worry about, and I don't think they're likely to happen through Skype:

  • Collecting personal data for various mass attacks (notably, spam). But just in case, don't send your credit card information over Skype; that's generally a good habit to have.

  • Information about my personal life being exposed to people I know. I'm having a hard time imagining the mechanism by which even my acquaintances working at Microsoft would end up having access to my Skype conversations.

I also feel somewhat safer having voice conversations than text-based ones, because these are harder to store, and harder to search t...

If it's Microsoft and/or governments, to be honest I feel relatively safe in most of my conversations, and I don't see why CFAR graduates wouldn't feel the same. People planning terrorist attacks would probably have a different opinion.

WikiLeaks came out of an environment of smart geeks who wanted to hack the political system.

It's possible that you'll have smart CFAR graduates thinking: "Our politicians are really irrational. I could do clever hack XY and change politics for the better."

But it's not only about protecting yourself; it's also about protecting other people. A bunch of people who use LessWrong do so under nicknames. Some might be interested in doing something that their government doesn't like. They might want to expose political corruption and have a need for their anonymity.

If you tell another person over Skype whose real-life persona connects to which LessWrong nickname, and that Skype conversation gets monitored by the government, you've just given the government information that might help it track down the LessWrong user who is exposing political corruption.

You have to assume that every word that you communicate witho...

If your jurisdiction's law enforcement has access to your conversations, there's this issue.
Visiting any https links that you might include in your messages, for a start. I guess, if you don't particularly care whether or not your messages are being read, then it won't matter too much to you. But if you do care - and some people care a lot - then Skype is very much the wrong platform to use. It might be the right platform to use if you particularly want Microsoft to read your message, for some reason.
I'd be fine with switching, but don't have the free energy to find a good replacement.
Goodness knows Microsoft could do with some more rationality, even if they have to come by it illicitly ;) Seriously though: no, don't trust Skype (or Dropbox, or gmail for that matter) to keep your secrets. However, most communications aren't secret, and discussions about rationality per se probably shouldn't be. I can only imagine that someone spying on rationality discussions with sinister intent is doing it for really irrational reasons, so the more they hear and understand, the more the problem solves itself.
I thought Microsoft had plenty of rationality. Their legendary market value reflects their ability to win; which is made more impressive by the fact that they have to overcome some rather severe technical flaws in their products. Admittedly, they seem rather enamored of some forms of Dark Arts - which is one of the reasons why I won't support them - but I can't deny that they are astonishingly successful at what they do.
Microsoft used to win. Their stock has been roughly flat for about the last ten years while the rest of the market has gained tremendously. They are no longer the industry-dominating money machine they used to be in the pre-Ballmer days.
You have a good point. So the winningness would all have been Gates, then.
Or Gates simply chose a good time to leave, or he left because MS was only going to go downhill.
The winningest time to leave.
A quick google (i.e. I have used none of these myself) gives a few suggestions. Off-the-record messaging looks good at first glance, and comes with a plug-in for Pidgin, making it easy to use on any reasonably common operating system and probably possible to use on anything that can run a C compiler. I also ran across chatcrypt and cryptocat, which might be worth investigation.
Cryptocat is an OTR implementation that happens to run as a browser plugin and has developers trying to work out how to have cryptographically secure group conversations. The cross-compatibility should be high.
+1 for OTR. Also, it works with XMPP, which has an A/V extension called Jingle for video calls. I don't know if the two do the Right Thing when put together, but it's worth looking into. I'm not a huge fan of Skype either, but it's extensively used. The requirement of most IM and call networks that the other party be using the same service is incredibly aggravating. Didn't we learn anything from the email walled gardens of the 80s? (or early 90s, I forget)
Jitsi is also relevant to this question, and I will concur that network effects are very frustrating.
I was unaware of Jitsi. At first glance it looks like it does basically what Skype does, but over XMPP and using an open source product. I assume since you brought it up, that you use it. Any impressions you'd care to share, especially regarding multi-user video chats? (this may be relevant to the Less Wrong Study Hall project; one of the options kicked around was an XMPP based system. At the moment we're not pursuing that route, but it's early days yet.)
Video over Tor is equally hard to use across all platforms, and is the only way I know of to protect users' locations without spreading the message as widely as possible. Phone connections are equally insecure and harder to use than Skype. Tinfoil-lined dead drops encrypted with one-time pads seem to be your best bet.

Very good question about health care. I agree completely that we need more rationality in health care. I am very disturbed at the number of physicians who treat medicine as a job and a profession with rules to be followed rather than as a way of thinking and understanding. I really, really would like to find a scientifically minded, rational PCP. (It occurs to me that I do know a bunch of folks at Metamed. I should probably ask them.)

My meta-question for CFAR is: what are they doing or planning in order to bring heavy-duty rationality skills into the fields that need them: medicine, education, government, jurisprudence, charity, software development, etc.? Teaching workshops, no matter how life-changing, to 20 people a month doesn't scale.

Second meta-question for CFAR: does it make sense to focus on younger folks at the start of their careers, or even earlier (as SPARC does), so there's a longer compounding payoff over a lifetime, or should there be more focus on established professionals, so there's more payoff sooner? Or both? If both, do the same workshops, venues, and curriculum make sense for early-, mid-, and late-career people? E.g. Anna Salamon mentioned that "One person left early fro...

Written communication has many advantages, but it typically does not make you actually do the exercises. Typically, one just looks briefly at the exercise, thinks "yeah, I see what they are trying to do" and then clicks another hyperlink or switches to another browser tab.

Having five minutes without internet access and with social pressure to actually do the exercise can make people actually do the exercises they found on the internet a decade ago but never tried.

Sure, everyone is different, but I would expect most people who spend a lot of time on the internet to be like this. (And the people who don't spend a lot of time on the internet won't see Lifehacker or LessWrong, unless it's published in book form.)

I see MOOCs as a big educational improvement because of this - sure, I could get the same educational info without the MOOC structure, just by reading the field's best textbooks and academic papers; but having a specific "course" with quizzes/homework makes me actually do the exercises, which I wouldn't have done otherwise; and the course schedule forces me to do them now, instead of postponing them for weeks/months/forever.

Absolutely true. Some people, perhaps most, don't do the exercises. Also true that some people (myself included) do. Even if only a small percentage of people do the exercises they read about, and only some of the time, that still scales better than in-person classes. On continuing reflection it occurs to me that there's a third scalable technique for increasing rationality, at least in theory: software. I've seen a few attempts to set up tools like the calibration game in software. So far they haven't caught on, but it might be worth exploring further, especially if this can be worked into games the way HPMoR works rationality into really gripping fiction. Thinking back on my life, board games, card games, and D&D-type RPGs helped me learn basic probability and game theory without explicitly attempting to do so. I'm not sure today's videogames, fun as they are, have the same virtuous and useful side effects. I wonder if it would be worthwhile to develop a really gripping game that rewarded rational play and probability-based reasoning and implicitly taught it.
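The core of such a calibration tool is simple enough to sketch. The scoring rule below is a standard Brier score; the data and names are my own illustration, not any existing tool's API:

```python
# Minimal calibration scorer: compare stated confidence with actual outcomes.
# Each entry is (stated probability that a claim is true, whether it was true).
predictions = [(0.9, True), (0.9, True), (0.9, False), (0.6, True), (0.6, False)]

def brier_score(preds):
    """Mean squared gap between confidence and outcome; 0 is perfect, lower is better."""
    return sum((p - (1.0 if outcome else 0.0)) ** 2 for p, outcome in preds) / len(preds)

print(round(brier_score(predictions), 3))   # roughly 0.27 for the sample above
```

A game built on this would reward players for being neither overconfident nor underconfident, which is the skill the calibration exercises aim at.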
That's a good point. I am not sure about the numbers today: how many people read LW, what percentage of them would do the exercises, and is that more than the number of minicamp participants? But these numbers could be improved by e.g. converting the minicamps into a series of online lessons. I guess this is a great opportunity for a CFAR volunteer with video-editing skills.
The recent XCOM game, to some extent, meets your criteria (a few bugs aside). Every move matters and must be carefully planned; there are very few actions you can carry out that don't carry a chance of failure. You quickly learn when you can afford to be ambitious, and when you need to have a backup plan if things go wrong. Even better, in Ironman mode your ~30-hour save can easily be more or less ruined in a single turn if you make particularly poor choices (or get very, very unlucky) and you have no save game to resume -- you have to start over from the beginning. My experience talking to other people playing it isn't, however, particularly promising when it comes to implicit teaching. One friend has complained every single time he's missed a 98% chance ("it's bullshit"), even when I pointed out that you make thousands of attacks over the course of a game and should expect to see multiple misses at those odds. If you haven't seen it before, this is an entertaining video series that demonstrates the salient points quickly.
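The arithmetic behind that "expect multiple misses" point is worth making explicit. Using 1,000 shots as a rough stand-in for a full campaign:

```python
# A 98%-to-hit shot misses rarely, but over a long campaign
# at least one miss is statistically near-certain.
p_hit = 0.98
shots = 1000   # rough stand-in for "thousands of attacks" across a game

expected_misses = shots * (1 - p_hit)      # about 20 misses on average
p_no_miss = p_hit ** shots                 # chance of a flawless campaign
p_at_least_one_miss = 1 - p_no_miss        # effectively 1

print(expected_misses, p_at_least_one_miss)
```

So the friend complaining about a missed 98% shot is complaining about an event that should happen around twenty times per game.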
"Doing the exercise" is not something that the student does alone, with the result compared to the answer key in the teacher's edition textbook. To perform the exercises, the student needs other people with enough understanding of the subject to provide short-cycle feedback, and I don't know anyone who can do that for themselves.
This is basically our problem. We could definitely teach the theory of, say, our Installing Habits class remotely, but we spend a lot of it troubleshooting people's actual plans for setting up new habits and giving rapid, personalized feedback, and it'd be quite hard to build that into automated exercises.
On further reflection it occurs to me that research done over the last 50 or so years has also been extremely valuable at increasing rationality. Thanks to the work of Kahneman, Tversky, Ariely, Pearl, Zadeh, and others there are things we know today about cognitive biases and rationality that no one knew in 1963. So original research in certain fields also has the potential to be both scalable and useful.
Workshops scale quite well if you have a specific curriculum and don't rely on a specific instructor being an awesome person who can't be replaced. You hire more instructors and you can teach more people. CFAR basically has to make business decisions about providing rationality workshops at a price that enough people can afford, so that CFAR can grow the number of participants. There's no reason why CFAR shouldn't be able to grow the number of participants exponentially. How many people follow paradigms such as yoga, the Landmark Forum, or NLP that get taught primarily through in-person training, and how many people follow proper epistemic rationality? People who promote epistemic rationality have been quite bad at spreading it in recent decades compared to how paradigms that get taught in person spread. $3,900 for a three-day seminar is probably not a price point at which you can teach the workshop to tens of thousands of people. Over time CFAR is probably well advised to streamline their process in a way that makes the workshop cheaper.

There's no reason why CFAR shouldn't be able to grow the number of participants exponentially.

I do not concur. CFAR is currently a small organization using small-organization logistics. Expanding to many more instructors would require a management layer different from the implementation layer, and selecting the best implementers to become management has a long history of failure.

One possible solution would be to spin off groups roughly the current size, preferably geographically diverse. That adds more dimensions of complexity but still allows for virtually everybody to be directly involved with the immediate returns of teaching and curriculum development.

At this moment there are already regular LW meetups in different cities around the world. We could find willing instructors in many of them, send them educational materials (one PDF they give to each student, one PDF with the instructions for the teacher), let them teach the lessons and send back the feedback.

The remote teachers and students are already there, and they wouldn't cost CFAR anything. The costs for CFAR at this moment would be: creating the PDF materials from the lessons, and evaluating the feedback.

(I need to think about it some more, and perhaps I will volunteer to make one such example lesson. And publish it on LW, and process the feedback.)

That was my first plan back when things were getting started, but it turned out to be hard to develop instructional materials that worked without a developed professional instructor.

Moving the weight from instructor to material is always a lot of work. A lot of tacit knowledge needs to be made explicit.

These days I am taking (as a student) an online course about some Java technology. It's 3 days, 8 hours each, and we received in total 600 pages of PDF. That is 12 pages per 30 minutes; minus covers and TOC, it's 9 pages of useful text.

Years ago I tried to make a non-interactive lesson for high-school students where I just gave them a PDF file with explanations and exercises, and then everyone worked at their own speed. I needed 8-10 pages for a lesson, and I spent the whole evening just writing down what I already knew perfectly. Students liked it, but I gave up doing this because it was too much work for one-time use. However, if I had to teach the same thing to many classes (or the same thing for many years), then it would be less work doing it this way. And the materials can be updated when necessary.

With the rationality exercises it will be even more complicated because we are not even 100% sure about the topic, and there can be more unexpected questions and reactions during the lesson. But I still think it is possible, and that given enough students it may...

Yes, this is what we first tried before finding out that it was way below the level of working with late-2011-level knowledge and ability to produce lessons. Might be worth retrying once the lessons have been highly polished at the CFAR level.

I wonder if a Kumon-style approach, with lots and lots of small steps and exercises done at one's own pace would be resistant to redesign.
If so, that would be evidence that it is not the best way to implement. The ability to improve a class by redesigning it is a feature of the organization.
How hard is it to create a developed professional instructor? I was under the impression that less than all of CFAR's instructors were primarily educators...
I think quite a few meetups have at least one person who has gone to a workshop. There could be some teaching-how-to-teach at the workshop, so that when they go back to their meetups, they can teach there.
As one of the CFAR initiates and a meetup person, the minimum CFAR could do for us is:

  • A single well-contained booklet of everything, with a soft copy (they've done this since I went)

  • The point-form script and notes used by the CFAR instructors when running the classes (AFAIK, this is still outstanding)

Mostly just to prompt memory. Once you've been through it, it's not hard to remember how it was taught and duplicate it, so long as the right prompts are there.
If you do this, I'll run the lesson with the Boston group and give feedback.
If you can do it in Boston, I'd be willing to attend and provide feedback, schedule concerns permitting.
join usssss
Schedule concerns are as follows: I reside on Nantucket and my work schedule for the foreseeable future is Thursday-Monday, with little chance to get a day off before Memorial Day. Monthly recurring is not an option. If you can do a meetup on a Monday night, Tuesday, or Wednesday morning (and crash space is available), I can catch a boat and bus and probably the subway.
Crash space is certainly available for traveling rationalists, but non-weekend meetups are very unlikely.
Growing an organisation such as CFAR isn't easy. It takes some skill. I don't know the CFAR people personally but I have no reason to assume that they are up for that task.
I have met them, although in a limited context, and using that information I have estimated that, if they were to try to expand, they would be somewhat more likely to do so successfully than the typical group that attempts to expand. I think that my prior is poorly grounded, because I have experience with only a few small organizations that tried to expand and failed, and a larger number that tried successfully. However, I didn't know of any of the successful groups before they grew.
As usual it depends on the exponent.
Small-organization (everybody knows everybody) scales to a finite size. Other networking patterns tend to scale better, and I think that cellular organization might work better than hierarchical organization.
True, and I can't see any benefit from hierarchical organisation. There isn't a central authority of rationality any more than there is one for chemistry or calculus. But CFAR maybe hasn't scaled to its maximum size yet, and as it approaches it, it will probably become clearer what the ideal size is, and there will be more people with experience in training who can split off another group.
Unified PR, distribution of some costs (e.g. advertising, website administration), and dispute resolution (e.g. trademark issues) come to mind.
The world is almost entirely controlled by hierarchical organizations (corporations and governments). Hierarchical organizations have "won" to a greater extent than pretty much anything else on earth. It's a model with flaws, but it clearly works. A person would need a whole lot of willful blindness to argue with those results. Now as to whether those organizations would be good at teaching rationality, that's another question...
Yeah, I can see how hierarchical organisations benefit certain goals and activities. I was speaking specifically about the goal of teaching rationality, in case that wasn't clear from context. You don't need a central authority to control what is being taught unless you are teaching irrationality (cf. Scientology, Roman Catholicism, or any political organisation). You could probably run a million rationality courses a year using just a wiki and a smartphone app. (Left as an exercise for the reader)

The workshop spat me out three days later, twice as exhausted, with teetering piles of ideas and very little time or energy to apply them. I left with a list of annual goals, which I had never bothered to have before, and a feeling that more was possible–this included the feeling that more would have been possible if the workshop had been longer and less chaotic, if I had slept more the week before, if I hadn't had to rush out on Sunday evening to catch a plane and miss the social.

How much of the goals have you accomplished and how much of a difference have they actually made?

That answers your first question.
Oh man how did I miss that? Well, that still leaves how important the 9 are.

Oh man how did I miss that?

Could be that age self-experiment you're doing ...

The workshops currently cost $3,900 + travel; I don't think they were much lower a year ago. Have your improvements recouped that cost? Has the workshop increased your income?

I paid about $1000 total for workshop plus travel. The social confidence and "try new things" aspects led me to obtain a scary part-time job at the hospital that brought well over $1000 in income, plus networking and comfort zone expansion. I also started thinking about job options in terms of different salaries and world-changing leverage, which my brain had previously tagged as somehow immoral. This hasn't yet led to me, for example, moving to the USA where nursing salaries are higher or looking for startup opportunities, but it's explicitly on my mind and I've done a few rough value calculations. I expect the idea of "you don't need to do the same thing for 30 years" will lead to quite divergent events in the next 5 years of my life.

(They were in fact $600 + travel).

Hi there! Awesome post! Especially the agonizing tradeoff between going now and enjoying the compounding benefits earlier, versus going later and getting better material. Obviously, I came down on the side of the first option, but this may not be optimal for everyone. Minor point: 'Rationality and the Reflective Mind' is by Stanovich, not by Kahneman.

(BTW, this is Tarn from the workshop)

Swimmer963 (Miranda Dixon-Luinenburg):
Fixed the part about Stanovich/Kahneman, thanks!

I laughed at "back when they were inexplicably called 'minicamps'." As a member of the first minicamp, which was to be a truncated version of the first Rationality Boot Camp, I find it amusing to watch the memetic evolution into a workshop. Not that "workshop" is, really, any less arbitrary, just more commonly used for CFAR's sort of thing.

MetaMed is hopefully moving us towards a world with more rationality in the healthcare professions.

The workshops still aren't an easy environment for introverts. The negative parts of my experience in May 2012 were mostly because of this.

In what sense is a personal development seminar supposed to feel easy? If someone is really overloaded, they can excuse themselves and pause for some time.

It's a question of what you're developing. The four days where you're learning how to use Bayes in everyday life and install new habits and do goal factoring may not be the time when you want to also train not minding social overload. You can do that any time.

I didn't take the workshop, so I don't know the exact curriculum. As far as I understand, besides teaching people to use Bayes, the workshop also includes exercises to expand people's comfort zones.
Swimmer963 (Miranda Dixon-Luinenburg):
The idea is to train these skills separately. CoZE training will be hard for introverts, but this doesn't mean they need to be constantly out of their comfort zone during all of the other classes.
If it does not feel easy, you are probably doing it wrong. Often the hard way is the only way we know, and it is better than nothing. But I believe there is nothing intrinsically hard about personal development or anything. It's just that the easy ways are a very small subset of all ways to do something.
Could you expand on what you've found out about making what is usually considered hard to be not hard?

Example of "hard": Not eating chocolate, when you have chocolate at home.

Example of "easy": Not eating chocolate, when you have no chocolate at home.

Switching from "hard" to "easy" is much easier than using your willpower to win at the "hard" mode. For some reason many people don't realize that, and instead spend a lot of time talking about it, motivating themselves, inventing various punishment schemes, attending motivation seminars, etc.

I suspect that something similar can be used in many situations. The first aspect is: don't work harder, work smarter. The second aspect is: if it involves some kind of brain power (willpower, memory, creativity), feeling stressed (because you really try to do it the hard way) only makes it more difficult... but for some reason a lot of popular advice recommends increasing the stress (by using rewards and punishments of many kinds). -- I suspect this is the corrupted hardware in action (rewarding and punishing people brings higher status to one who does it).

Some people are afraid that doing things the "easy" way is somehow inferior, probably because it is not mysterious enough.

Excellent advice. Cheating is just another way of winning.
Not cheating is often a lost purpose.
If I were reading that somewhere else, I would be posting it in the latest Rationality Quotes thread.
Thanks. I think the problem is at least as much bad software as bad hardware, though. I believe the reason people don't try to figure out easy ways to do things is that they've absorbed an idea that it's more important to prove their virtue by doing hard things than to succeed, and I suspect that idea gets taught by people in authority who'd rather that subordinates not have initiative.
Do you have an easy way to ensure housework is maintained to an acceptable level?
One possible approach is: If you don't need something, throw it away; then you have an "easy" solution for a given thing not cluttering your home anymore. A similar but less extreme approach could be to buy a lot of stackable boxes and put everything there, and only take out things that you really need at some moment. After some time the things you didn't need would naturally stay in the boxes. This could solve some of my problems; I don't know if your problems are of this type. Now that I think about it, I would have to do some research about a good system of boxes (big boxes for large items, small boxes for smaller items, and a system to put them all in one place), but the impact on my home could be great. I have suspected for a long time that the storage system has a strong impact on how the rest of the house looks, but I never spent time researching a good one.
If you look at the discourse about deliberate practice, it is nearly always described as hard and challenging. On an emotional level, people have ugh fields to protect them from dealing with hard issues in their lives. If you lead people past their ugh fields, they have to deal with the hard emotional issues the ugh fields protected them from. If you break through enough ugh fields, the emotional processing takes energy. If you try to do too much in a short time frame, people will feel overloaded.
There are different kinds of "hard" -- for example hard and fun (beating the final Super Mario level), or hard and painful (finishing the last meters of a marathon run with a broken leg). The former kind of "hard" is great for deliberate practice, but I suppose the introvert socially overloaded during rationality lessons feels like the latter kind of "hard". If the goal of the lesson is dealing with the ugh fields, bringing people to their ugh fields may be useful. (But seems to me that CBT shows that this is better done slowly.) If the goal of the lesson is learning bayesian statistics or similar stuff, bringing people to their unrelated ugh fields is harmful. Challenging the introverted behavior has its place during the "comfort zone expansion" exercises, but is not essential for the remaining lessons.
That depends on what you mean by "better". People who practice CBT are generally paid by the hour and have no real problem with spending more time on an issue. That's different from someone who wants to produce as much personal change as possible in 3 days. To me that seems like a strange way of doing things. Exercises should reinforce each other instead of being completely distinct.
Sure. But the "we must make big changes in three days" model is itself a choice, which may turn out to be suboptimal for making long-term life changes. As I understand it, it's generally true of skills training that if there are multiple independent aspects to a skill (say, precision and power in a golf swing), the skill improves faster if I train those aspects separately.
Knowing how to excuse oneself and take a pause is a nontrivial skill.
For a seminar like that it's good to have an instructor tell you: "You look overloaded; if it's too much, just take a pause." Of course that requires instructors who are perceptive enough.
It's tricky to navigate this in the context of a class (which the person won't be able to re-take later, if they do step out). Outside of classes the opportunity cost is less stark, but it's not always easily afforded by the workshop environment / spatial layout. One possible way to address this that's under discussion is adding (for example) an hour right after lunch which is "quiet time" where folks are encouraged to nap, journal, go over notes, exercise, meditate etc.
There's nothing stopping the person from taking the workshop again in the future and then not dropping out. If price is the concern, maybe CFAR can give rebates to people who want to take the workshop a second time. It's also possible that a lower teacher-student ratio could allow CFAR to price the workshop in a way that would let more people retake it. I don't think that's the case. At a workshop like this, chatting with someone between classes can often be more valuable than the specific content covered in a class. The opportunity cost of not engaging in a chat with a fellow workshop participant is harder to estimate beforehand. I don't see a reason why you have to mark that hour as quiet time. Some people might prefer to chat; others might prefer to be quiet. There's no reason everyone has to do the same thing.
Mmm. This seems like an okay plan, but it doesn't hit the root of the problem, which is that the marginal unit of social interaction at the workshop is high value. Someone who did take the hour to journal instead of interact with other participants would probably be making a mistake, even if they're starting to get agitated from too much social interaction. The only ways I can think of to make workshops more introvert-friendly in that sense are to make them shorter or longer, neither of which seems like a good idea for economic reasons. Short workshops that occur regularly in one location, targeted at locals (basically, the old idea of a rationality dojo), seem worth considering again, but I don't see a way to extend that beyond SF and NY very easily.
I'm rather skeptical of the "rationality dojo" concept because regular dojos are far from a reliable training method. In my experience in the martial arts, I've been taught things that are critically unsafe, things that would be illegal to use in almost any real-world setting, and things that just plain don't work. Finding good dojos is actually a fairly difficult problem. Patterning a new training paradigm after one that fails in the majority of cases seems somewhat dubious to me. Also, the conventional dojo model is, uh, not exactly optimal for introverts.
Are you comparing it to some other training paradigm that succeeds in the majority of cases? If so, do you consider workshops to be such a paradigm, or do you have some other paradigm in mind?
The parts of the dojo model that I'm thinking of importing are:

1. Regular periodic meetings of a few hours, probably weekly or monthly.
2. A geographically local userbase.
3. Clear skill gradations and demarcations, and test-based advancement.
4. Regular open training periods.

Basically, this seems to manifest as a skill-focused meetup with a bit more structure than normal, and possibly more cash transfers / dues than normal. Do you have warnings about those features, suggestions of other features I should think about importing, or other comments?
(Given that you are at a level where you will cause more good than harm) Decide that you will be a teacher instead of a student and start your own.
So, a vaguely similar plan is in the works here in Austin, but unless it goes spectacularly I don't expect that to happen unless I move to SF or NY.
This has been my current endeavor here in Columbus. It is going better than expected, and has been easier than expected. What worked for us, but is possibly generalizing from one example: First, develop a close-knit group of equals (good for discussion and socialization). THEN (my personal aha! moment here)... recruit a whole bunch of newbies all at once (good for having organized workshops and classes). You can ask people from the first group to lead various classes for the people in the second group, so that you don't have to do it all yourself. Note: Don't ask people as a group. Ask specific individuals for specific workshops. It is significantly harder to organize workshops and classes among people you think of as your approximate equals in the skill in question, even though every single individual there may prefer it. (This had been our failure mode for a while.) (@Vaniver: I know you, so I know that you personally probably already know all this. This comment is more for LWers in general who are thinking about going the "organizer" route.)
Assessing your level is extremely hard in this case (it includes instrumental rationality, epistemic rationality, teaching ability, marketing ability, etc. etc.) and I really suggest that nobody do this without thinking about it very seriously beforehand.


Oh please no.

Overestimating the value of information, and allowing the perfect to be the enemy of the good are both common failure modes among Less Wrongers. You do not need to "assess your level" down to 16 sig figs (erm, pretend there is a unit of measurement here) along 7 different axes to put yourself on one or the other side of a binary measurement. You just need to ask: "Will listening to me talk about rationality be more likely to help someone, or hurt them?"

And as much as you (generic you, not you specifically) like to believe you are playing around with edgy, dangerous ideas, you are unlikely to cause serious harm to people by teaching a self-help workshop badly. (The people who WOULD be harmed by a badly taught self-help workshop have much worse things to worry about.) The cost of failure is not that high. You do not have to have an extremely high level of confidence in your success for an attempt to be worth making.

Swimmer963 (Miranda Dixon-Luinenburg):
Agreed with this x10!
In general this is good advice. However, I disrecommend it in this specific case. I'm not (that) worried about untrained but enthusiastic amateurs causing harm to other people, though I think this is more of a risk than you imply. I'm worried about untrained but enthusiastic amateurs causing harm to the public image of rationality, to potential future efforts along these lines, etc.

There are two failure modes here. There's failure mode #1, where enthusiastic amateurs teach awful classes and cause some people to think less of 'rationality', and there's failure mode #2 where CFAR graduates want to do cool things and don't do them because they're scared of failure, and a community never materializes. I think #2 is the default, and more likely, and thus worth taking more effort to avoid.

Those seem like very generalizable rationalizations for never actually doing anything.

On rationality amateurs causing harm to the public image of rationality:

* They can (and DO) do this anyways: on LW, reddit, facebook, blogs, vlogs, etc. In fact, I would guess that an enthusiastic amateur could cause more overall harm to the movement on the internet than by running a class in real life.
* The people who are likely to say EXTREMELY harmful things are extremely unlikely to be the types to decide to lead an organization (that requires the related social skills).
* What do you consider to be your worst case scenario? The worst I can come up with is: "I taught a terrible workshop! Who would have thought I shouldn't have talked about infanticide to a room full of new mothers? And one of them made a viral video about it! I won't be able to teach another class until everyone has forgotten it in about two years!" (It is unlikely to have a significant effect on CFAR or MIRI.) More realistic: "Wow, that was a terrible and boring class! I bet NONE of the twenty people in the room will come back next week. I will have to find new people now." Neither of these seems worth the level of risk aversion you are recommending here. We are not building an FAI.
* I DO recommend placing yourself on the helpful/harmful binary. Obviously, a person so new and lacking in the relevant skills that they would cause massive harm would be in the "harmful" category. Unfortunate faux pas made by the type of people in the helpful category are unlikely to be at a large scale.

Regarding harming "potential future efforts along these lines":

* Efforts made by whom? By myself? Because I suspect future-me will be significantly more skilled at running classes than current-me despite lack of practice? By CFAR? I have never seen anything from them suggesting that other people (even amateurs) should hold off on creating communities. Quite the contrary. They have invested significant effort and resources in
Yes, and this is a very serious problem that really shouldn't be exacerbated any further at all.

I don't agree. Leaders of organizations say outrageous or harmful things all the time, social skills or no social skills.

Worst case likely scenario? Rationality becomes karate. There are dozens or hundreds of different people claiming to teach rationality. What they actually teach varies wildly from instructor to instructor. Some groups teach effective skills; some groups teach useless skills; some groups teach actively hazardous skills. In the eyes of the general public, these groups are not distinguishable from one another: they all provide "rationality training." A newcomer to the field has no idea which groups are good and is not likely to find a good one. Worse, they may not even know that good and bad groups exist, and ultimately gain a degree of confidence unsuited to their skill level. It is dangerous to be half a rationalist, which many learn the hard way. Ultimately, rationality training becomes diluted or confused enough that it more or less possesses no value for the average person.
I think that at least some instructors were selected for things other than ability to observe human behavior, intuit the cause of said behavior, and handle it appropriately.
Swimmer963 (Miranda Dixon-Luinenburg):
There is a part of the workshop, CoZE training, that is meant to build social skills. If it feels hard, it's working. But if the Bayes or Value of Information classes feel hard because participants are exhausted and want to lock themselves in the bathroom alone, that doesn't help with learning the specific skills. Exercise, for example, works well with intense training and rest periods, not 4 days of constant slow jogging.
Understanding Bayes in the abstract is quite easy. The thing that's hard is using Bayes in your life when you are under emotional pressure. The way it gets taught in the average statistics class doesn't give the participants the ability to use it in their daily lives. Teaching it at a moment when people are under stress could work well to train them to use it later in daily-life situations where it matters.
There are still different levels of skills here, e.g. "learn to recognize when Bayes is relevant to a real-life situation" and then "learn to recognize when Bayes is relevant to a real-life situation even when you're under emotional stress." It's probably easier to learn the first rather than skipping directly to the second. (And for what it's worth, the Bayes unit has gone through a large number of iterations and, I have been told, has finally started working. It does indeed emphasize building habits rather than abstract understanding.)
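To make the first of those skill levels concrete, here is a minimal sketch of the kind of everyday Bayes calculation being discussed. The scenario and all the numbers are invented for illustration; this is not CFAR's actual curriculum material.

```python
# Made-up everyday example: a friend is unusually late (evidence E).
# How much should that shift my belief that their car broke down (H)?
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Assumed base rate of a breakdown on any given day: 1%.
# A breakdown almost always causes lateness (90%),
# but lateness happens anyway about 10% of the time.
p = posterior(prior=0.01, p_e_given_h=0.9, p_e_given_not_h=0.1)
print(round(p, 3))  # prints 0.083 -- the evidence raises 1% to about 8%
```

The point of the habit-building framing is that the arithmetic is the easy part; the trained skill is noticing, while mildly annoyed in a parking lot, that "late" is weak evidence for "breakdown" because lateness is common for other reasons.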