This was pleasant to read! You seem to be shifting toward some conservative vibes (in the sense of appreciating the nice things about the past, not in the sense of the Republican party).
One note: to me, it feels like there's a bit of tension between doing lots of purely mental exercises, like Hamming questions, and trying to be more "whole". One idea I have is that you become more "whole" by physically doing stuff while having the right kind of focus. But it's a bit tricky to explain what it feels like. I'll try...
For example, when drawing I can easily get into overthinking; but if I draw a quick sketch with my eyes closed, just from visual imagination, it frees me up. Or when playing an instrument, I can easily get into overthinking; but when playing with a metronome, or matching tones with a recording, I get into flow and it feels like improving and relaxing at the same time. Or to take a silly example, I've found that running makes me tense, but skipping (not with a rope, just skipping along the street for a bit) is a happy thing and I feel good afterward. So maybe this feeling that you're looking for isn't a mind thing, but a mind-body connection thing.
(CFAR's website is several years out of date, so please ignore it for now; I'll have it up-to-date-ish in a day or two.)
Hi all! After about five years of hibernation and quietly getting our bearings,[1] CFAR will soon be running two pilot mainline workshops, and may run many more, depending how these go.
We would now like to be called “A Center for Applied Rationality,” not “the Center for Applied Rationality,” because we’d like to be visibly not trying to be the one canonical locus.
We have two, and are currently accepting applications / sign-ups:
I like this form factor, because:
Like CFAR’s previous workshops, the new workshops are jam-packed with considerably more content than most people expect from 4.5 days.
This includes:
1) Many “CFAR classics,” probably including: Inner Simulator, TAPs, Goal-Factoring, Focusing, Resolve Cycles, CoZE lab, and Hamming Questions. (There's no need to look at this stuff before coming; I’m only linking in case you want to get an idea.)
2) A tone shift (vs the classic workshops) to more of a “rationality hobbyist convention, with visitors from many philosophical schools.” In both our newer (less polished) classes and our remakes of some classics, we’re emphasizing aspects of the human condition that some of us felt were underexplored in the Sequences and in previous CFAR workshops. Notably:
If you want, you’ll get assistance locating the most fundamental moves in your own patterns of thinking, and distilling these patterns into a thing you and others can practice consciously (even where they don’t match ours).
(Someone might ask: if there are varied schools of thought present, not all based in the Sequences, what makes it a “rationality” convention? My answer is that it’s a “rationality” convention because we care a lot about forming true beliefs, and about building large-scale models that make coherent, accurate predictions even when taken literally. Some people do talk about “auras” or “reincarnation” in ways that help them describe or fit some local pattern, but at the end of the day these things are not physically literal, you get bad predictions if you think they are, and we want to keep our eye on that ball while geeking out about the full range of the human condition.)
3) A first two days packed with "content" (mostly classic material, with some new), followed by a (pilot, not yet honed) second half aimed at helping you integrate the skills with one another, with your prior skills, and with your everyday life. Our goal here is to get your CFAR-style/"5-minute-timer-style" skills to coexist with "tortoise skills," with slow patterns of self-observation and of bringing things slowly to consciousness, and with whatever relationships and slow projects you care about.
There will also be nature walks, a chance to chill around a fire pit, and other unhurried time to just hang out.
You might like to come if any of these are true:
These rationality workshops are not for everyone. In particular:
We want the workshop fees to cover the marginal cost to CFAR of running these workshops, and a little bit also of the “standing costs” of running an organization (trying out curriculum beforehand on volunteers so we can refine it, etc). We are therefore charging:
If you can’t afford $2k and you believe you’ll bring a lot to the workshop, you’re welcome to apply for financial aid and we’ll see what we can do. Likewise, if you really don’t want to put in the amount the sliding scale would demand, and your presence would add substantial value, you’re welcome to apply for financial aid and we will consider it.
The above includes room and board. Running and developing CFAR workshops costs us quite a bit; charging at this level should allow us to roughly break even, so we can keep doing this sustainably. I don’t necessarily claim our classes will be worth it to you, although I do think some will get much value from coming. (If you come and, two weeks after returning home, you think your experiences at the workshop haven’t digested into something you find worth it, you can request a refund if you like – CFAR offered this historically, and we intend to keep that part.)
(We are working with an all-very-part-time staff, and plan to keep doing it this way, as I now suspect that doing very-part-time curriculum development and teaching for CFAR can be healthy, but needs to be mixed with other stuff. Eliezer said this first, but I didn't believe him. This decreases total costs some, but it's still expensive.)
Historical CFAR (2012-2020) ran about sixty retreats of four or more days, of various kinds, and did its best to improve them by gradient descent. We also thought hard, tried things informally in smaller settings, read stuff from others who’d tried stuff, and learned especially from Eliezer’s Sequences/R:AZ.
These latest workshops came from that heritage, plus my having something of an existential crisis in 2020[2] (and reading Hayek, Christopher Alexander, and others, and playing around), and other instructors having their own experiences. Different ones of us have been experimenting with these new things in different contexts, but much less so far than with the old material – more like CFAR workshops of 2012/2013 in that way.
We here at CFAR believe in goal factoring (sometimes).
If your reason for considering coming to a workshop is that you’d like to boost a “rationality movement” in some form, you might also consider:
If your reason for considering coming is that you’d like a retreat-style break from your daily life, or a chance to reflect, you might also consider:
If your reason is that you’d like to get better at forming true beliefs, or achieving stuff, you might consider:
I think the CFAR retreat is on the Pareto frontier for this kind of thing, from my POV. But of course, opinions vary.
One of the healthiest things about Burning Man, IMO, is that at the same time that people are messing around with personal identity and sex and drugs (not necessarily healthy), many of them are also trying to, e.g., repair complicated electronics for art pieces in the middle of the desert without spare parts (healthy; it exposes their new mental postures to many “is this working?” checks that are grounded in the physical world).
At CFAR workshops, people often become conscious of new ways their minds can work, and new things they can try. But we don’t have enough “and now I’ll try to repair my beautiful electronic sculpture, which I need to do right now because the windstorm just blew it all apart, and which will incidentally give me a bunch of real-world grounding” mixed in.
I’d love suggestions here.
There are several features of “humans in human-traditional contexts, who haven’t tried to mess with their functioning with ‘techniques’” that I admire and would love to help people boost (if I knew how, and if people wanted this), or that I’d at least like to avoid eroding much.
Among these:
This one is general. But practical suggestions for what to ask people about (or what data to otherwise collect) so as to discern how they’re doing, what impact we’re having, etc. are appreciated.
Thanks for reading!
Briefly: I became worried that strategies for getting lots of people to sync up informationally (like the Democrats’ strategy for how people should sync up) were predictably certain kinds of useless, and that there was too much of that in my efforts with CFAR and with recruitment for MIRI. I made an attempt to write about this in Narrative Syncing and in My low-quality thoughts on why CFAR didn’t get farther, although I’m not satisfied with either piece. (I also think fear and urgency helped create tricky dynamics; from my POV I addressed some of this in What should you change in response to an "emergency"? And AI risk.)