Long. Mostly quite positive, though it does spend a little while rolling its eyes at the Eliezer/MIRI connection and the craziness of taking things like cryonics and polyamory seriously.


Sweet! We made the New York Times!

The eye-rolling:

People tend to hear about the group from co-workers (usually at tech companies) or through a blog called LessWrong, associated with the artificial-intelligence researcher Eliezer Yudkowsky, who is also the author of the popular fan-fiction novel ‘‘Harry Potter and the Methods of Rationality.’’ (Yudkowsky founded the Machine Intelligence Research Institute (MIRI), which provided the original funding for CFAR; the two groups share an office space in Berkeley.) Yudkowsky is a controversial figure. Mostly self-taught — he left school after eighth grade — he has written openly about polyamory and blogged at length about the threat of a civilization-ending A.I. Despite this, CFAR’s sessions have become popular.

I think this is a fair description, and the eye-rolling is really just the "despite this," which is understandable. Stronger is this:

Compulsive and rather grandiose, Yudkowsky is known for proclaiming the imminence of the A.I. apocalypse (‘‘I wouldn’t be surprised if tomorrow was the Final Dawn, the last sunrise before the earth and sun are reshaped into computing elements’’) and his own role as savior (‘‘I think my efforts could spell the difference between life and death for most of humanity’’).

but backs it up with quotes.

Note that these quotes are from 2001 or earlier.

Also, new pet peeve: quotes databases that don't provide sources. This one even tells you how you can cite itself, as if that gives it any authority.

Yup, EY brought enough rope, no reason to spin more.

I read the comments at the NYT-- part of the issue is people pattern-matching to est and part of it was sticker shock at the price. I don't know whether there's anything to be done about the similarity to est (basically that it's a very intense workshop, even though there's no upselling). I'm curious about whether offering it as a series of six-hour one-day workshops would be a bad idea.

There were also a bunch of people who said it wasn't different from CBT or somesuch. I think the price is less than a year's worth of therapy, and I wonder how the results compare.

EST-- a human potential system with expensive, intense workshops.

How quickly things change-- EST was very well-known in its time, but you're not the only person I've talked with who'd never heard of it.

Might also be a cultural thing - the Wikipedia article gives me the impression it was better known in the US than in Europe. There's only one non-English version of the article.

It might be cultural, but the other person who hadn't heard of it is American, and only about 15 years younger than I am.

When I excused myself from one conversation, my interlocutor said, ‘‘I will allow you to disengage,’’ then gave a courtly bow. The only older attendee, a man in his 50s who described himself as polyamorous and ‘‘part Vulcan,’’ ghosted through the workshop, padding silently around the house in shorts and a polo shirt.

This made me laugh out loud.

CFAR should post more stats about success rates of any kind, and maintain those stats across all cohorts.

Also stats about how many people asked for their money back.