Minicamps on Rationality and Awesomeness: May 11-13, June 22-24, and July 21-28

“I do not say this lightly... but if you're looking for superpowers, this is the place to start.”

--Michael Curzi, summer 2011 minicamp participant

Who: You and a class full of other aspiring rationalists and world-optimizers, from around the world.

What: Two 3-day weekend minicamps and one 8-day minicamp, filled with hands-on activities for applying rationality to your life, your goals, and the making of a better world.  (See details in the FAQ.)

When and where: We're running three camps, so that we can do this for three sets of participants: May 11-13 and June 22-24 for the 3-day camps, and July 21-28 for the eight-day camp, all in the San Francisco Bay Area.

Why: Because you’re a social primate, and the best way to jump into a new way of thinking, make friends, and accomplish your goals is often to spend time with other primates who are doing just that. 

Other reasons:

  • Hang out and explore the Bay Area with two dozen other people like you who are smart, interesting, and passionate about rationality.
  • Attend bonus sessions about style, body language, and confidence-building.
  • Get help charting out career paths; and, entirely optionally for those interested, connect with folks at the Singularity Institute about optimal philanthropy.

Instructors:

Eliezer Yudkowsky, Anna Salamon, Julia Galef, Andrew Critch, Luke Muehlhauser, and Michael Smith

Cost:  $650 for the three-day programs; $1500 for the week-long program.  This includes lodging[1], meals, and tuition.  

(Note that this *still* isn't quite enough to make running minicamps sustainable in the long run: lodging + meals at retreat centers start at around $90 per person per night, the "three-day camps" actually include four nights, and each workshop takes a staff of about 5 full-time people for over a month beforehand, most of us at $3k/month, counting curriculum development time (plus miscellaneous expenses).  We are trying to strike a compromise between "charge enough that we can run more camps" and staying affordable, especially for our start-up phase; costs will probably go up in following years.)

Three days (or a week) isn’t long enough to learn rationality, but it's long enough to learn how to learn rationality, and to get some momentum toward doing so.

Come meet us, and see what you can do.

Apply now.

Frequently Asked Questions:

1.  I’m older.  Should I still apply?

Yes!  We're aiming for a more diverse crowd and would love to add your wider set of experiences and skills.

2.  I’d like to come, but I’m not sure you’ll accept me.  Should I still apply?

Absolutely!   You can fill out our form in as little as 10 minutes.  What’s the harm?[2]

3.  I’d like to come, but I can’t afford it.  Should I still apply?

Yes, you should definitely apply.  A limited number of scholarships will probably be available this time, and more may be available later.

(There's also an option on the application form if you want to apply but can't make any of the times - this just says that you want to be part of future minicamps and makes sure we have your application details.)

4.  What will we do, exactly?

We're still working out the details.  In our current model:

  • Daily schedule: Every day, you'll have five hours of core workshop sessions (mostly exercises, divided into morning and evening sessions), meals shared with other participants, and shared activities such as soccer, poker, karaoke, and trips to Bay Area sites.
  • Rationality: You'll practice many specific techniques (e.g. Fermi calculations, applying Bayes' theorem and your knowledge of cognitive biases to daily life, seeing how exploiting fungibility can boost your goal achievement); develop a map of your rationality strengths and gaps; and learn how to continue learning rationality after the program.
  • Social effectiveness:  Reading and using body language; developing a fashion sense; improving social courage; and understanding why social reality is important.
  • Individual meetings:  You'll be able to schedule one-on-one appointments to discuss career paths you may want to take (we can help with statistics on earnings in different professions, and strategy for getting in); how to start a LW meet-up or similar community; and, optionally for those interested, how to get involved in existential risks-reducing research and action.
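To give a concrete flavor of the "Fermi calculations" mentioned above, here's a minimal sketch of the classic piano-tuners estimate in Python. Every number below is an assumed round figure chosen for illustration -- this is not part of the actual minicamp curriculum:

```python
# Hypothetical Fermi estimate: how many piano tuners work in Chicago?
# All inputs are rough, assumed figures -- the point is the method,
# not the numbers.
population = 3_000_000          # people in Chicago (rough)
people_per_household = 2        # rough average household size
piano_ownership_rate = 1 / 20   # assume one household in twenty owns a piano
tunings_per_piano_per_year = 1
tunings_per_tuner_per_day = 4
working_days_per_year = 250

households = population / people_per_household
pianos = households * piano_ownership_rate
tunings_needed = pianos * tunings_per_piano_per_year
tunings_per_tuner = tunings_per_tuner_per_day * working_days_per_year

tuners = tunings_needed / tunings_per_tuner
print(round(tuners))  # on these assumptions, roughly 75
```

The value of the exercise isn't the final number; it's noticing that a seemingly unanswerable question decomposes into quantities you can estimate within a factor of a few.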

5.  I’m new to all this.  Will it make sense?

If you’ve read at least fifteen posts from the core sequences, yes it will.  If you haven’t: why not read them now?

We’ll also aim for an atmosphere in which everyone is free to make mistakes and to try things, and in which people are receptive to a wide range of skill levels.

6.  I’ve already read the Sequences seventeen times, and also I’m a self-made billionaire with three PhDs.  Will I learn anything new?[3]

We hope so.  We’re covering a good range of material, with much more of a focus on practice and exercise than in the Sequences, incorporating new lessons learned since the LW material was written, and with some instructors who've developed their own takes on rationality.

7.  What evidence is there that I'll be glad I went?

After last year's minicamp, participants completed an anonymous exit survey.  (With the instructions: "We're asking you these questions to learn how to run camps; please be honest; it'll help us more if you're accurate than if you're positive.")  Here are their answers to the most relevant questions:

  • In answer to “Zero to ten, are you glad you came?”, the median answer was 10 (mean was 9.3).
  • In answer to “Zero to ten, will your life go significantly differently because you came to mini-camp?” the median answer was 7.5 (the mean was 6.9) [This was the response that was most positively surprising to me.].
  • In answer to “Zero to ten, has your epistemic rationality improved?”, the median answer was 7 (mean 6.9).
  • In answer to “Zero to ten, are you more motivated to learn epistemic rationality, than you were when you came?”, the median answer was 8.5 (mean 8.1).
  • In answer to “Zero to ten, have you become more skilled at modifying your emotions and dispositions?”, the median answer was 7 (mean 6.3).
  • In answer to “Zero to ten, are you more motivated to modify your emotions and dispositions, than you were when you came?”, the median answer was 9 (mean 8.3).
  • In answer to “Zero to ten, have you gained social skills since coming?”, the median answer was 7.5 (mean 7.2).
  • In answer to "Zero to ten, did you like spending time with the other participants?", the median answer was 9 (mean 8.8).

We also asked participants for testimonials -- statements designed to be shown to others, in case they wanted to recommend such camps.  They wrote:

“This was an intensely positive experience. This was easily the most powerful self-modification I've ever made, in all of the social, intellectual, and emotional spheres. I'm now a more powerful person than I was a week ago -- and I can explain exactly how and why this is true.

At mini-camp, I've learned techniques for effective self-modification -- that is, I have a much deeper understanding of how to change my desires, gather my willpower, channel my time and cognitive resources, and model and handle previously confusing situations. What's more, I have a fairly clear map of how to build these skills henceforth, and how to inculcate them in others. And all this was presented in such a way that any sufficiently analytical folk -- anyone who has understood a few of the LW sequences, say -- can gain in extreme measures.”

--Matt Elder / Fiddlemath

“I expected a week of interesting things and some useful tools to take away. What I got was 8 days of constant, deep learning, challenges to my limits that helped me grow. I finally grokked that I can and should optimize myself on every dimension I care about, that practice and reinforcement can make me a better thinker, and that I can change very quickly when I'm not constrained by artificial barriers or stress.

I would not recommend doing something like this right before another super-busy week, because I was learning at 100% of capacity and will need a lot of time to unpack all the things I learned and apply them to my life, but I came away with a clear plan for becoming better. It is now a normal and easy thing for me to try things out, test my beliefs, and self-improve. And I'm likely to be much more effective at making the world a better place as well, by prioritizing without fear.

The material was all soundly-researched and effectively taught, with extremely helpful supplemental exercises and activities. The instructors were very helpful in and out of session. The other participants were excited, engaged, challenging, and supportive.

I look forward to sharing what I've learned with my local Lesswrong meetup and others in the area. If that's even 1/4 as awesome as my time at the Mini-Camp, it will make our lives much better.”

--Ben Hoffman / Benquo

“I really can't recommend this camp enough! This workshop broke down a complex and intertwined set of skills labelled in my brain as "common sense" and distinguished each part so that I could work on them separately. Sessions on motivation, cognition, and what habits to build to not fool yourself were particularly helpful. This camp was also the first example that I've seen of people taking current cognitive science and other research, decoding it, and showing people what's been documented to work so that they can use it too. It feels to me now as though the coolest parts of the sequences have been given specific exercises and habits to build off of. This camp, and the people in it, have changed my path for the better.”

--David Jones / TheDave

You can also read the full testimonials from everyone who chose to give one.

Apply now

(You can fill out the application in just 10 minutes, so you might want to fill in the blanks right now -- we'd like to announce the first acceptances (for May) within the next week.)


[1] More exactly, we provide a bed in a shared room at a house or retreat center rented by SIAI.

[2] Sometimes people say they’re “afraid of wasting our time” by sending in an application.  In a word, no.  If you’re interested in us, we’re interested in you.  It takes just seconds to read someone’s form, and our experience shows that many of our highest-value people have been the ones who hesitated to apply.

[3] Okay, fine, this isn’t really a frequently asked question.  But seriously, we’ll be covering a lot that isn’t in the sequences -- and the flesh-and-blood experience of meeting other aspiring rationalists is hard to duplicate.

ETA:  CMR is still looking for good teachers and curriculum designers.  If you're interested, please especially consider coming to a minicamp; we're hoping to find some good hires there.

ETA2:  We will probably have answers to all applicants within about two weeks (i.e., by April 16 or so), with answers to the May folks probably earlier than the others.  If for some reason you need your application processed *faster* than this, please shoot me an email: annasalamon at gmail.

239 comments

I have a high opinion of the minicamp (after observing fiddlemath before and after the camp, anecdotally I'd say he "leveled up" in notable ways that would be worthwhile for me), and I'll probably apply. That being said:

This post gives off bad vibes to (my mental model of) outsiders -- I wouldn't be comfortable showing it to a non-LessWrong person and saying "This is what I'll be doing". I'm holding the post to a pretty high standard, because signaling matters a lot for an event where you're asking money from people and putting them through an intensive program (it pattern-matches things that people are wary of, from "multilevel marketing seminar" to "Christian retreat").

Some suggestions:

  • Providing an estimated cost breakdown (from last year) rather than a vague statement ("most of it is meals and lodging") would go a long way toward showing that whatever this is, it's not an SIAI fundraiser.
  • A specific example of an exercise from last summer's minicamps would be much better than a description of how awesome the exercises are in general, both for reassuring people that there's content to it and making people excited (as I was when I heard some of the things you did).
  • A (partial) "program of proposed topics" would make it look substantially more serious: which instructors have which particular focuses?

Like I said, I'm already interested, and I already know that this info exists; but explicitly showing it will vastly improve the signaling value, and remove the inconvenience of having to convince one's friends and family that this isn't an obviously cultish thing.

Here's another random idea:

When I read product or movie reviews, I tend to look for the negatives as much as (if not more than) the positives; I also pay attention to the rating distribution (especially if it's bimodal). If I can't find any negatives, I tend to assume that the product has been astroturfed, and move on.

So, did the SIAI ever receive any negative comments about the rationality minicamp ? If so, where can I read them ?

I posted earlier that the surveys were confidential, but actually, I just reread them, and there doesn't seem to be anything personal in the "Which parts of the camp didn't work particularly well for you, and what do you think we could do to improve?" column, which was basically the "what negative comments do you have?" column. So I pasted those answers into a new spreadsheet and posted them to the web; you can read participants' collected complaints here.

This is very useful, thanks! These don't look too challenging to address - would be good to know more about what you've changed in response to some of the common themes there.

If you or anyone else wants to do a survey, I can give you the email addresses of the minicampers, and you can email them and see what you get and post it online (assuming you write in your email that you are looking for publishable comments, etc.). Let me know if you or anyone else seriously wishes to do this.

Many of the minicampers are on LW, also; the folks with testimonials above have linked LW accounts; but there is of course selection bias there.

Someone else above asked for the negatives as well. Didn't we all submit suggestions for improvement and criticisms last year? Are those publishable? If you don't have permission, you could just email people for permission to publish their criticisms. You can definitely publish any of my comments.

The survey was anonymous, so this is hard to ask permission to share individual comments, since I don't know who wrote them (and folks were assured that their submissions were confidential). You (since you're on the minicamps google group) could email that google group and collect criticisms, and publish them.

Someone else could ask me for folks' email addresses and then do the same.

Anna says we're still looking at locations, but it looks like around $115/person/night just for lodging + meals, and the 3-day camps actually include 4 nights the way everyone counts things, which we have to purchase. Anna also notes that she and Julia and Michael get $3k/month and this takes way more of their time than just the actual days. So definitely not a Singinst fundraiser. That data is available very easily so I'm posting it right now.

A specific example of an exercise from last year's minicamp that a lot of people liked was "Value of Information" which included the technical details of how to calculate VoI and exercises in being sensitive to particular forms of scope (how much does it cost, how long does it last, how often does it happen).
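For readers unfamiliar with the technique, the core of a Value of Information calculation can be sketched in a few lines. The launch scenario and all the numbers below are invented for illustration -- they are not from the actual minicamp exercise:

```python
# Hypothetical VoI calculation: should we pay for information before deciding?
# Decision: launch a product or not. Assume it succeeds with probability 0.4,
# paying $100k; failure loses $50k. Not launching pays $0.
p_success = 0.4
payoff_success = 100_000
payoff_failure = -50_000

# Expected value of the best action without further information:
ev_launch = p_success * payoff_success + (1 - p_success) * payoff_failure
ev_without_info = max(ev_launch, 0)

# With perfect information, we launch only in the worlds where it succeeds:
ev_with_info = p_success * payoff_success + (1 - p_success) * 0

# The difference is the most a perfect forecast could be worth to us.
value_of_information = ev_with_info - ev_without_info
print(value_of_information)  # 30000.0
```

On these made-up numbers, a perfect forecast is worth up to $30k, so paying, say, $5k for even a noisy market test could easily be justified -- the "sensitivity to scope" exercises mentioned above are about noticing when numbers like these are large or small.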

We're still working out the program which is why it's not posted even tentatively (we were just in the middle of some agonizing about priorities).

$115/person/night

Wait, what? Can I just stay in a hostel and eat from the grocery store?

Can I just stay in a hostel and eat from the grocery store?

To make a rational decision, more information is necessary, such as: How much does the hostel cost? Does it have decent beds? Distance from hostel to the place of workshop. Local food costs. Etc. (Don't forget to include facts like if you sleep in a different place than majority, you deprive yourself of opportunity of some morning/evening informal chat.)

Generally: how much would I really save by "hostel & grocery store" and how much would it reduce my gains from the workshop?

Speaking for myself, I would like to cut some costs (together with the cost of flying it makes my salary for 2 months), but there is always a risk. Once I slept in a hostel with beds so bad that I really did not sleep much that night. Now if I imagine 9 such nights plus jet lag, and the resulting effect on my concentration and memory, I would get much less benefit per dollar spent.

Couldn't the SIAI at least provide this option ? Then, people could decide for themselves whether they want cold cereal or gourmet meals.

I went to an Esperanto thing that was set up like this once.

I have friends and relatives who live in the area. How central to the camp is the communal living aspect? What would you charge to commute to it, if that is possible?

I guess we'd charge about 1/2 of the total (noting that you'd still be having meals with the rest of us)... but I suspect commuting is harder than you think, given how intensively scheduled it is. Err on the side of applying, and we can discuss.

Also, if anyone's unable to afford camp for whatever reason, apply anyhow and check the "needs scholarship" box and we can see what can be worked out.

The Bay Area is rather sprawling. It can take 1.5 hours to go from Berkeley to San Jose during rush hour. If they don't live near where the camp is held, I expect you would regret the commute and find the experience more taxing and less relaxing than the participants staying on site.

Agree but... if I knew where in the bay area it's being held I could tell whether it's just around the corner, or a 1.5 hour commute.

It's still being nailed down and it looks like it will be different locations for different camps, but for now it looks like the one week long one is at a retreat center in the deep East Bay between Walnut Creek and Danville, with one weekend camp in Berkeley and one in the South Bay. Still subject to change until we officially announce.

This post gives off bad vibes to (my mental model of) outsiders.

I had the same impression; the post makes the minicamp sound like your average, run-of-the-mill, self-help seminar scam -- complete with testimonials and everything.

That's not necessarily a bad thing. Lots of people pay for those. And such people are in need of rationality help!

This plan is so crazy, it just might work ! :-)

Good to know.

I mean, it kind of is a standard workshop (like ones on public speaking, or Italian cooking, or, yes, self-help)... except that the content is about Bayesian math, and microeconomics, and the cognitive science of standard human error patterns and so on. And the people you get to network with are other LW-ers who are interested in actually applying this content to practical problems, and coming to embed Bayesian patterns into the details of one's day-to-day thoughts instead of just into the way you answer pen-and-paper questions.

But, yes, similar workshop format, different content. Maybe we should make the ad different too in some way. I wonder if my inclusion of the testimonials and survey data, in particular, may have been misleading -- I was trying to say "look, past participants (who were smart LW-ers a lot like you) liked it, so maybe you will too", but it may have come across as a stronger claim. I'd say come check it out if you're interested, or else wait for a subsequent year if you want to have seen "proof that this will definitively change your life" first or something (which we may or may not ever manage, though we're certainly working on it), and, meanwhile, whether you come or not, do keep contributing on LW, trying exercises yourself in your own life, and generally helping to figure out what rationality can be.

The difficulty stems from how much good stuff is mixed in with all the scams, making outside evaluation much harder. Most self help programs include a lot of the same basic items by necessity (fixing the big problems first). We also seem to understand implicitly that people's self judgement of the effects of these types of things is terrible, especially when those judgements are very close time-wise to the event itself (rationalization of expense and effort, unwillingness to signal disloyalty to new ingroup, even internally).

I attended the 2011 minicamp.

It's been almost a year since I attended. The minicamp has greatly improved me along several dimensions.

  1. I now dress better and have used techniques provided at minicamp to become more relaxed in social situations. I'm more aware of how I'm expressing my body language. It's not perfect control and I've not magically become an extrovert, but I'm better able to interact in random social situations successfully. Concretely: I'm able to sit and stand around people I don't know and feel and present myself as relaxed. I dress better and people have noticed and I've received multiple comments to that effect. I've chosen particular ways to present myself and now I get comments like 'you must play the guitar' (this has happened five times since minicamp haha). This is good since it loads the initial assumptions I want the person to load.

  2. I've intentionally hacked my affectation towards various things to better reach my goals. For years I never wanted to have children. My wife said (earlier this year, after minicamp) that she wanted to have kids. I was surprised and realized that given various beliefs (love for wife, more kids good for society, etc) I needed to bring my emotions and affectations in line with those goals. I did this by maximizing positive exposure to kids and focusing on the good experiences...and it worked. I'm sure nature helped, but I came to a change of emotional reaction that feels very stable. TMI: I had my vasectomy reversed and am actively working on building kid version 1.0

  3. Minicamp helped me develop a better mental language for reasoning around rationalist principles. I've got tools for establishing mental breakpoints (recognizing states of surprise, rationalization, etc) and a sense for how to improve on weak areas in my reasoning. I have a LOT of things I still need to improve. Many of my actions still don't match my beliefs. The up side is that I'm aware of many of the gaps and can make progress toward solving them. There seems to be only so much I can change at once, so I've been prioritizing everything out.

  4.  I've used the more concise, direct reasoning around rationality at my job at Valve Software. I use it to help make better decisions, concretely: when making decisions around features to add to DOTA 2 I've worked particularly hard at quickly relinquishing failed ideas that I generated. I have developed litanies like 'my ideas are a product, not a component of my identity.' Before I enter into interactions I pause and think 'what is my goal for this interaction?' The reasoning tools from minicamp have helped me better teach and interpret the values of my company (which are very similar). I helped write a new employee guide that captures Valve values, but uses tools such as Anna Salamon's "Litany for Simplified Bayes" to cut straight to the core concepts. "If X is true, what would the world look like?" "If X is not true, what would the world look like?" "What does the world look like?" I've been influential in instituting predictions meetings before we launch new features.

  5. I've been better able to manage my time, because I'm more aware of the biases and pitfalls that lie before me. I think more about what 'BrandonReinhart2020' wants than what the current me wants. (Or at least, my best guess as to what I think he would want...like not being dead, and being a bad ass guitar shredder, etc). This has manifested itself concretely in my self-education around the guitar. When I went to minicamp I had only just started learning guitar. Since then I've practiced 415 hours (I work full time, so this is all in my spare time) and have developed entirely new skills. I can improv, write songs, etc. Minicamp provided some inspiration, yes, but there were also real tools that I've employed. A big one was coming home and doing research on human learning and practice. This helped me realize that my goals were achievable. Luke gave sessions on how to do efficient research. Critch gave a session on hacking your affectations. I used this to make practice something I really, really like doing (I listened to music I liked before practicing, I would put objects like role-playing books or miniatures that I liked around my practice area -- nerdy yes, but it worked for me -- and I would drink a frosty beer after practicing three hours in a row. Okay so that last one shows that my health beliefs and goals may not be entirely in line, but it served an objective here). Now I can easily practice for 3 hours and enjoy every moment of it. (This is important, before I would use that time for World of Warcraft and other pursuits that just wasted time and didn't improve me.)
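The "Litany for Simplified Bayes" quoted in point 4 is just an ordinary Bayesian update phrased as three questions. A minimal worked sketch -- the feature scenario and all probabilities below are made-up numbers for illustration:

```python
# Hypothetical update behind the litany.
# Hypothesis X: "players will like the new feature."
prior = 0.5                # P(X) before looking at the world
p_evidence_if_x = 0.8      # "If X is true, what would the world look like?"
p_evidence_if_not_x = 0.2  # "If X is not true, what would the world look like?"

# "What does the world look like?" -- suppose we observe the evidence
# (say, positive playtest feedback). Bayes' theorem then gives:
p_evidence = prior * p_evidence_if_x + (1 - prior) * p_evidence_if_not_x
posterior = prior * p_evidence_if_x / p_evidence
print(posterior)  # 0.8
```

The litany works because the three questions force you to state the likelihoods before you see the answer, which is exactly what the update needs.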

I've been in the Less Wrong orbit for a long time and have had the goal of improving my rationality for a long time. I've read Yudkowsky's writing since the old SL4 days. I followed Overcoming Bias from the beginning. I can't say that I had a really good grasp on which concepts were the most important until after minicamp. There's huge value in being able to ask questions, debate a point, and just clarify your confusion quickly.

I have also been an SIAI skeptic. Both myself and John Salvatier thought that SIAI might be a little religion-like. Our mistake. The minicamp was a meeting of really smart people who wanted to help each other win more. The minicamp was genuinely about mental and social development and the mastery of concepts that seem to lead to a better ability to navigate complex decision trees toward desired outcomes.

While we did talk about existential risk, the SIAI never went deep into high shock level concepts that might alienate attendees. It wasn't an SIAI funding press. It wasn't an AGI press. In fact, I thought they almost went too light on this subject (but I came to modern rationality from trans/posthumanism and most people in the future will probably get to trans/posthumanism from modern rationality, so discussions about AGI and such feel normal to me). Point being, if you have concerns about this you'll feel a lot better as you attend.

I would say the thing that most discomforted me during the event was the attitude toward meditation. I realized, though, that this was an indicator of my preconceptions about meditation and not necessarily due to facts about meditation. After talking to several people about meditation, I learned that there wasn't any funky mysticism inherent to meditation, just closely associated with it. Some people are trying to figure out if it can be used as a tool and are trying to figure out ways to experiment around it, etc. I updated away from 'meditation is a scary religious thing' toward 'meditation might be another trick in the bag.' I decided to let other people bear the burden/risk of doing the research there, though. :)

Some other belief shifts related to minicamp: I have greatly updated toward the Less Wrong style rationality process as being legitimate tools for making better decisions. I have updated a great deal toward the SIAI being a net good for humanity. I have updated a great deal toward the SIAI being led by the right group of people (after personal interactions with Luke, Anna, and Eliezer).

Comparing minicamp to a religious retreat seems odd to me. There is something exciting about spending time with a bunch of very smart people, but it's more like the kind of experience you'd have at a domain-specific research summit. The experience doesn't manipulate through repeated and intense appeals to emotion, guilt, etc. (I was a Wesleyan Christian when I was younger and went to retreats like Emmaus, and I still remember them pressing a nail sharply into my palm as I went to the altar to pray for forgiveness). It's more accurate to think of minicamp as a rationality summit, with the instructors presenting findings, sharing techniques for the replication of those findings, and an ongoing open discussion of the findings and the process used to generate them. And like any good summit, there are parties.

If you're still in doubt, go anyway. I put the probability of self-damage due to attending minicamp at extremely low, compared to self-damage from attending your standard college level economics lecture or a managerial business skills improvement workshop. It doesn't even blip on a radar calibrated to the kind of self-damage you could do speculatively attending religious retreats.

If you're a game developer, you would probably improve your ability to make good decisions around products more by attending SIAI Minicamp than you would by attending GDC (of course, GDC is still valuable for building a social network within the industry).

If you're still in doubt, go anyway. I put the probability of self-damage due to attending minicamp at extremely low, compared to self-damage from attending your standard college level economics lecture or a managerial business skills improvement workshop. It doesn't even blip on a radar calibrated to the kind of self-damage you could do speculatively attending religious retreats.

What about the cost? I would not call spending $1500 in a week insignificant. And as a baseline, I believe that being surrounded for a week by a group of people who believe strongly in some collection of ideas is a risk at least an order of magnitude higher than an economics lecture. I certainly expect that it would have a much stronger effect on me (as it seems it has had on you) than the lecture would, and I would most certainly not take a risk of this magnitude if I have any non-negligible doubts.

To address your second point first, the -attendees- were not a group who strongly shared common beliefs. Some attended due to lots of prior exposure to LW, a very small number were strong x-risk types, several were there only because of recent exposure to things like Harry Potter and were curious, many were strongly skeptical of x-risks. There were no discussions that struck me as cheering for the team -- and I was actively looking for them!

Some counter evidence, though: there was definitely a higher occurrence of cryonicists and people interested in cryonics than you'd find in any random sample of 30 people. I.e.: some amount >2 vs some amount close to 0. So we weren't a wildly heterogeneous group.

As for the instructors - Anna and Luke were both very open about the fact that the rationality-education process is in its infancy and among the various SIAI members there is discussion about how to proceed. I could be wrong, but I interpreted Eliezer as being somewhat skeptical of the minicamp process. When he visited, he said he had almost no involvement related to the minicamp. I believe he said he was mainly a sounding board for some of the ideas. I'm interpreting his involvement in this thread and related threads/topics as a belief shift on his part toward the minicamp being valuable.

I think your order-of-magnitude estimate describes a bad conceivable scenario well, but poorly describes the scenario I actually witnessed.

Now, for cost, I don't know. I'm attending a guitar camp in August that will be 7 days and cost me $2000. I would put the value of minicamp a fair amount above the value of the guitar camp, but I wouldn't necessarily pay $3000 to attend minicamp. To answer the price question I would ask:

1) What else do I plan to spend the $1500 on? What plans or goals suffer setbacks? What would I otherwise buy?

2) At what do I value the information gained from attending? I can see how it would be easier to measure the value of information from a guitar camp than from one about something that feels more abstract. So maybe the first step is to find the concrete value you've already gotten out of LW. If you've read the Sequences and you think there are useful tools there, you might start with "what would be the estimated value of being able to clarify the things I'm unsure about?" Then take some measurement of the value you've already gotten from LW and do some back-of-the-napkin math with that.

3) Consider your level of risk aversion versus the value of minicamp now vs later. If these new minicamps are successful, more people will post about them. Attendees will validate or negate past attendee experiences. It may be that if $1500 is too much for you when measured against your estimation of the pay-off discounted by risks, that you simply wait. Either the camps will be shown to be valuable or they will be shown to be low value.

4) Consider some of the broad possible future worlds that follow from attending minicamp. In A you attend and things go great: you come out with new rationality tools. In B you attend, your reaction is neutral, and you don't gain anything useful. In C you attend and have a poor experience or, worse, suffer some kind of self-damage (e.g., your beliefs shift in measurably harmful ways that your prior self would not have agreed to submit to ahead of time). Most attendees are suggesting you'll find yourself in worlds like A. We could be lying because we all exist in worlds like C, or we're in B but feel an obligation to justify attending the camp, or whatever. Weigh your estimate of our veracity against your risk aversion, and update the connected values.

I would suggest it is unlikely that SIAI is so skilled at manipulation that it has succeeded in subverting an entire group of people from diverse backgrounds, with some predisposition to be skeptical. Look for evidence that some people exist in B or C (probably from direct posts saying as much -- people would probably want to prevent others from being harmed).

There are other things to put into a set of considerations around whether to spend the money, but these are some.
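The scenario weighing in point 4 can be sketched as a toy expected-value calculation. Every probability and dollar figure below is a made-up placeholder for illustration, not an estimate anyone in this thread endorsed; the point is only the shape of the arithmetic:

```python
# Toy expected-value sketch for the attend/don't-attend decision.
# All numbers are hypothetical placeholders.
cost = 1500  # dollars, the week-long camp fee

# Scenarios A/B/C from point 4: (subjective probability, subjective dollar value)
scenarios = {
    "A: gain useful rationality tools": (0.50, 5000),
    "B: neutral, gain nothing":         (0.25, 0),
    "C: negative experience":           (0.25, -2000),
}

# Expected value of attending = sum of p * v over scenarios, minus the fee.
expected_value = sum(p * v for p, v in scenarios.values()) - cost
print(expected_value)  # 500.0
```

Plugging in your own probabilities (and your answer to question 1 about what else the $1500 could buy) is where the real work is; the code itself is trivial.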

I just wanted to say this (esp. the second part) is actually one of the most cogent posts about anything that I've read in quite some time, and as such, a self-referential example of the value of the camp. It should probably be more visible, and I recommend making it a discussion post about deciding whether/when to attend.

After talking to several people about meditation, I learned that there isn't any funky mysticism inherent to meditation, just closely associated with it. Some people are trying to figure out whether it can be used as a tool and are trying to design experiments around it, etc.

Rather off-topic, but I'm very interested in rational meditation-advice: Did they suggest specific techniques of meditation like e.g. vipassana or did they recommend some particular books on meditation?

Jasen Murray suggested specific techniques and specific resources, which I unfortunately cannot remember (I was not that interested in that part of RBC).

Thanks for that. It's fascinating to get a glimpse of what rationality looks like in the real world rather than just online interchanges.

As an aside, I'm a big fan of your work. It reassures me to know rationalists are on the Dota 2 team.

Applied. Looks good. Might decide it's not worth it, but you make a good case.

One thing. 0 to 10 ratings are utterly useless. The median is almost always around 7, for almost anything. Please give us calibrated statistics, not subjective pseudo-quantities where most of the contribution is from noise and offset.

Reminds me of business planning types ranking alternatives 1..n and then treating the indexes as utilities. ick. TYPE ERROR.

We've actually noticed in our weekly sessions that our nice official-looking yes-we're-gathering-data rate-from-1-to-5 feedback forms don't seem to correlate with how much people seem to visibly enjoy the session - mostly the ratings seem pretty constant. (We're still collecting useful data off the verbal comments.) If anyone knows a standard fix for this then PLEASE LET US KNOW.

I'd suggest measuring the Net Promoter Score (NPS) (link). It's used in business as a better measure of customer satisfaction than more traditional measures. See here for evidence, sorry for the not-free link.

  1. "On a scale of 0-10, how likely would you be to recommend the minicamp to a friend or colleague?"
  2. "What is the most important reason for your recommendation?"

To interpret, split the responses into 3 groups:

  • 9-10: Promoter - people who will be active advocates.
  • 7-8: Passive - people who are generally positive, but aren't going to do anything about it.
  • 0-6: Detractor - people who are lukewarm (which will turn others off) or will actively advocate against you.

NPS = [% who are Promoters] - [% who are Detractors]. Good vs. bad NPS varies by context, but +20-30% is generally very good. The followup question is a good way to identify key strengths and high priority areas to improve.
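To make the arithmetic concrete, here is a minimal Python sketch of the NPS calculation described above (the sample ratings are invented for illustration):

```python
def nps(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Respondents scoring 7-8 are passives and don't enter the formula.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical sample of answers to "how likely are you to recommend?"
sample = [10, 9, 8, 7, 6, 9, 10, 3, 8, 9]
print(nps(sample))  # 5 promoters, 2 detractors out of 10 -> 30.0
```

Note that a uniform pile of 7s and 8s, which would look fine as a mean, scores an NPS of exactly zero, which is the point of the metric.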

NPS is a really valuable concept. Means and medians are pretty worthless compared to identifying the percentage in each class, and it's sobering to realize that a 6 is a detractor score.

(Personal anecdote: I went to a movie theater, watched a movie, and near the end, during an intense confrontation between the hero and villain, the film broke. I was patient, but when they sent me an email later asking me the NPS question, I gave it a 6. I mean, it wasn't that bad. Then two free movie tickets came in the mail, with a plea to try them out again.

I hadn't realized it, but I had already put that theater in my "never go again" file, since why give them another chance? I then read The Ultimate Question for unrelated reasons, and had that experience in my mind the whole time.)

Good anecdote. It made me realize that I had just 20 minutes ago made a damning non-recommendation to a friend based off of a single bad experience after a handful of good ones.

Right, I'd forgotten about that. I concur that it is used, and I work in market research, sort of.

One idea (which you might be doing already) is making the people collecting the data DIFFERENT from the people organizing/running the sessions.

For example, if Bob organizes and runs a session, and everyone likes Bob, but thinks that the session was so-so, they may be less willing to write negative things down if they know Bob is the one collecting and analyzing data.

If Bob runs the sessions, then SALLY should come in at the end and say something like "Well we want to make these better, so I'M gathering information of ways to improve, etc"

Even if Bob eventually gets the negative information, I think people might be more likely to provide it to Sally (one step removed) than to Bob directly.

(Even better: Nameless Guy organizes a session. Bob teaches the session, making sure everyone knows this is NAMELESS' session and Bob is just the mouthpiece.)

Also, I would say that verbal comments are generally MUCH more useful than Likert-scale information anyway. It's better to be getting good comments and bad Likert scores than vice versa.

Back when I did training for a living, my experience was that those forms were primarily useful for keeping my boss happy. The one question that was sometimes useful was asking people what they enjoyed most and least about the class, and what they would change about it. Even more useful was asking that question of people to their faces. Most useful was testing to determine what they had actually learned, if anything.

I've seen "rate from 1 to 5, with 3 excluded", which should be equivalent to "rate from 1 to 4" but feels substantially different. But there are probably better ones.

In this category of tricks, somebody (I forget who) used a rating scale where you assigned a score of 1, 3, or 9. Which should be equivalent to "rate from 1 to 3", but...