Rationality Boot Camp
It’s been over a year since the Singularity Institute launched our ongoing Visiting Fellows Program, and we’ve learned a lot in the process of running it. This summer we’re going to try something different. We’re going to run Rationality Boot Camp.

We are going to take ten weeks and fill them with activities meant to teach mental skills - if there's reading to be done, we'll tell you to get it done in advance. We aren't just aiming to teach skills like betting at the right odds or learning how to take into account others' information. We're also going to practice techniques like mindfulness meditation and Rejection Therapy (making requests that you know will be rejected), in order to teach focus, non-attachment, social courage, and all the other things that are needed to produce formidable rationalists. Participants will learn how to draw (so that they can learn how to pay attention to previously unnoticed details, and see that they can do things that previously seemed like mysterious superpowers). We will play games, and switch games every few days, to get used to novelty and practice learning.

We're going to run A/B tests on you, and track the results to find out which training activities work best, and begin the tradition of evidence-based rationality training. In short, we're going to start constructing the kind of program that universities would run if they actually wanted to teach you how to think. And then at the end, some of us are going to go to Burning Man for training in desert survival and living in an emotionally positive community.

When I call the program Rationality Boot Camp, I mean this quite literally. Six days per week, participants will rise, meditate, prepare and eat food, attend lectures, participate in group and individual activities, and exercise together. Everyone who applies needs to have read at least some of the Sequences, and may be assigned particular posts as makeup material - in which case you will need to read them.
Thanks for the info, PJ!
PCT looks very interesting, and your EPIC goal framework strikes me as intuitively plausible. The current list of IGs that we reference is not so much part of CT as an empirical finding from our limited experience building CT charts. Neither Geoff nor I believe that all of them are actually intrinsic. It is entirely possible that we and our subjects are simply insufficiently experienced to penetrate below them. It looks like I've got a lot of reading to do :-)