Unreal

Comment reply: my low-quality thoughts on why CFAR didn't get farther with a "real/efficacious art of rationality"

No, it's definitely not about being depressed. That's very far from it. But I also don't want to argue about the claims here. Seems maybe beside the point.

I think I could reword my original argument in a way that wouldn't be a problem. I just wasn't careful in my wording, but I personally think it's fine. I think you might be reading a lot into my use of the word "so".

Comment reply: my low-quality thoughts on why CFAR didn't get farther with a "real/efficacious art of rationality"

I dunno if I was clear enough here about what it means to feel persecuted. 

So the way I'm using that phrase, 'feeling persecuted' is not desirable whether you are actually being persecuted or not. 

'Feeling persecuted' means feeling helpless, powerless, or otherwise victimized. Feeling like the universe is against you or your tribe, and that things are (in some sense) inherently bad and may forever be bad, and that nothing can be done. 

If, indeed, you are part of a group that has fewer rights and privileges than the dominant groups, you can acknowledge to yourself "my people don't have the same rights as other people" but you don't have to feel any sense of persecution around that. You can just see that it is true and happening, without feeling helpless and like something is inherently broken or that you are inherently broken. 

Seeing through the egregore would help a person realize: 'Oh, there is an egregore feeding on my beliefs about being persecuted, but it's not actually a fundamental truth about the world; things can actually be different; and I'm not defined by my victimhood. Maybe I should stop feeding this egregore with these thoughts and feelings that don't actually help anything or anyone and aren't really an accurate representation of reality anyway.'

Comment reply: my low-quality thoughts on why CFAR didn't get farther with a "real/efficacious art of rationality"

"Learning to run workshops where people often "wake up" and are more conscious/alive/able-to-reflect-and-choose, for at least ~4 days or so and often also for a several-month aftermath to a lesser extent" 

I permanently upgraded my sense of agency as a result of CFAR workshops. Wouldn't be surprised if this happened to others too. Would be surprised if it happened to most CFAR participants. 

//

I think CFAR's effects are pretty difficult to see and measure. I think this is the case for most interventions? 

I feel like the best things CFAR did were more like... fertilizing the soil and creating an environment where lots of plants could start growing. What plants? CFAR didn't need to pre-determine that part. CFAR just needed to create a program, have some infrastructure, put out a particular call into the world, and wait for what shows up as a result of that particular call. And then we showed up. And things happened. And CFAR responded. And more things happened. Etc. 

CFAR can take partial credit for my life starting from 2015 and onwards, into the future. I'm not sure which parts of it. Shrug. 

Maybe I think most people try to slice the cause-effect pie in weird, false ways, and I'm objecting to that here.

Comment reply: my low-quality thoughts on why CFAR didn't get farther with a "real/efficacious art of rationality"

Right. 

I think a careful and non-naive reading of your post would avoid the issues I was trying to address. 

But I think a naive reading of your post might come across as something like, "Oh CFAR was just not that good at stuff I guess" / "These issues seem easy to resolve." 

So I felt it was important to acknowledge the magnitude of the ambition of CFAR and that such projects are actually quite difficult to pull off, especially in the post-modern information age. 

//

I wish I could say I was speaking from an interest in tackling the puzzle. I'm not coming from there. 

Comment reply: my low-quality thoughts on why CFAR didn't get farther with a "real/efficacious art of rationality"

The main ones are: 

  • modern capitalism / the global economy
    • So if we look at the egregore as having a flavor of agency and intention... this egregore demands constant extraction of resources from the earth. It demands people want things it doesn't need (consumer culture). It disempowers or destroys anything that manages to avoid it or escape it (e.g. self-sufficient villages, cultures that don't participate) - there's an extinction of hunter-gatherer lifestyles going on; there's legally mandated taking of children from villages in order to indoctrinate them into civilization (in Malaysia anyway; China is doing a 'nicer' version). There's energy-company goons that go into rainforests and chase out tribes from their homes in order to take their land. This egregore does not care about life or the planet. 
    • You are welcome to disagree of course, this is just one perspective. 
  • I dunno what to call this one, but it's got Marxist roots
    • There's an egregore that feeds off class division. Right now, there are a bunch of these divisions going on at once. The following are crudely defined, and I don't mean them super literally; I'm just trying to point at some of the dividing lines, as examples: Feminists vs privileged white men. Poor blacks vs white cops. The 99% vs the 1%. Rural vs urban. This egregore wants everyone to feel persecuted. All these different class divisions feed into the same egregore.
    • Do the rationalists feel persecuted / victimized? Oh yeah. Like, not literally all of them, but I'd say a significant chunk of them. Maybe most of them. So they haven't successfully seen through this one.
  • power-granting religion, broadly construed
    • Christianity is historically the main example of a religious egregore. But a newer contender is 'scientism'. Scientism is not the true art of science and doesn't resemble it at all. Scientism has ordained priests who have special access to journals (knowledge) and special privileges that give them the ability to publish in those esoteric texts. Governments, corporations, and the egregores mentioned above want control over these priests, sometimes buying their own.
    • Obviously this egregore doesn't benefit from ordinary people having critical thinking skills and the ability to evaluate the truth for themselves. It dissuades people from trying by creating high barriers to entry and making its texts hard or time-consuming to comprehend. It gets away with a lot of shit by having a strong brand. The integrity behind that brand has significantly degraded, over the decades. 

These three egregores benefit from people feeling powerless, worthless, or apathetic (malware). Basically the opposite of heroic, worthy, and compassionate (liberated, loving sovereignty). Helping to start uninstalling the malware is, like, one of the things CFAR has to do in order to even start having conversations about AI with most people. 

And, unfortunately... like... often, buying into one of these egregores (usually this would be unconsciously done) actually makes a person more effective. Sometimes quite 'successful' according to the egregore's standards (rich, powerful, well-respected, etc). The egregores know how to churn out 'effective' people. But these people are 'effective' in service to the egregore. They're not necessarily effective outside of that context. 

So, any sincere and earnest movement has to contend with this eternal temptation: 

  • Do we sell out? By how much? 

The egregore tempts you with its multitude of resources. To some extent, I think you have to engage. Since you're trying to ultimately change the direction of history, right? 

Still, ahhh, tough. Tough call. Tricky. 

Comment reply: my low-quality thoughts on why CFAR didn't get farther with a "real/efficacious art of rationality"

I probably don't have the kinds of concepts you're interested in, but... 

Some significant conceptual pieces in my opinion are:

  • "As above, so below." Everything that happens in the world can be seen as a direct, fractal-like reflection of 'the mind' that is operating (both individual and collective). Basically, things like 'colonialism' and 'fascism' and all that are external representations of the internal. (So, when some organization is having 'a crisis' of some kind, this is like the Shakespeare play happening on stage... playing out something that's going on internal to the org, both at the group level and the individual level.) Egregores, therefore, are also linked inextricably to 'the mind', broadly construed. They're 'emergent' and not 'fixed'. (So whatever this 'rationality' thing is, could be important in a fundamental way, if it changes 'the mind'.) Circling makes this tangible on a small scale.
  • My teacher gave a talk on "AI" where he listed four kinds of processes (or algorithms, you could say) that all fit onto a spectrum: Artificial Intelligence > Culture > Emotions / Thoughts > Sense perception. Each of these 'algorithms' has 'agendas' or 'functions'. And these functions are not necessarily in service of truth. ('Sense perception' clearly evolved from natural selection, which is keyed into survival and reproduction, not truth-seeking aims. In other words, it's 'not aligned'.) Humans 'buy in' to these algorithms and deeply believe they're serving our betterment, but 'fitness' (the ability to survive and reproduce) does not necessarily follow from being 'more truth-aligned' or 'goodness-aligned'. So... a deeper investigation may be needed to discern what's trustworthy. Why do we believe what we believe? Why do we believe the results of AI processes... and then why do we believe in our cultural ideologies? And why do I buy into my thoughts and feelings? Being able to see the nature of all four of these processes, and to see how they're the same phenomenon on different scales / using different mediums, is useful.
  • Different people have different 'roles' with respect to the egregores. The obvious role I see is something like 'fundamentalist priest'? Rationality has 'fundamentalist priests' too. They use their religion as a tool for controlling others. "Wow, you don't believe X? You must be stupid or insane." To be more charitable, though, some people just 'want to move on' from debating things that they've already 'resolved' as 'true'. And so they reify certain doctrines as 'true doctrine' and then create platforms, organizations, and institutions where those doctrines are 'established truth'. From THERE, it becomes much easier to coordinate. And coordination is power. By aligning groups using doctrines, these groups 'get a lot done'. "Getting a lot done" here includes taking stuff over... ideological conquest, among other forms of conquest. This is the pattern that has played out for thousands of years. We have not broken free of this at all, and rationality (maybe more so EA) has played right into it. And now there's a lot of incentive to maintain and prop up these 'doctrines' because a lot has been built on top of them.
  • Why do humans keep getting captured? Well, we're really easy to manipulate. I think the Sequences cover a lot of this... but also, fears like 'fear of death, illness, and loss of livelihood' are pretty reliable traps humans fall into. People readily give away their power when faced with these fears. See: COVID-19.
  • Because we are afraid of various forms of loss, we desperately build and maintain castles on top of propped up, false doctrines... so yeah, we're scheduling our own collapse. That shit is not gonna hold. Everything we see happening in this world, we ourselves created the conditions for. 
Comment reply: my low-quality thoughts on why CFAR didn't get farther with a "real/efficacious art of rationality"

The hypotheses listed mostly focus on the internal aspects of CFAR.

This may be somewhat misleading to a naive reader. (I am speaking mainly to this hypothetical naive reader, not to Anna, who is non-naive.) 

What CFAR was trying to do was extremely ambitious, and it was very likely going to 'fail' in some way. It's good FOR CFAR to consider what the org could improve on (which is where its leverage is), but for a big-picture view, you should also think about the overall landscape and circumstances surrounding CFAR. Some of this was probably not obvious at the outset (at the beginning of its existence), so CFAR may have had to discover where certain major roadblocks were as it tried to drive forward. This post doesn't seem to touch on those roadblocks in particular, maybe because they're not as interesting as considering the potential leverage points.

But if you're going to be realistic about this and want the big-picture sense, you should consider the following:

  • OK, so CFAR's mission under Pete's leadership was to find/train people who could be effective responders to x-risk, particularly AI risk. 
  • There is the possibility that most of the relevant 'action' on CFAR's part is on 'finding' the right people, with the right starting ingredients, whatever those may be. But maybe there just weren't that many good starting ingredients to be found. That limiting factor, if indeed it was a limiting factor, would have hampered CFAR's ability to succeed in its mission. 
  • Hard problems around this whole thing also include: How do you know what the right starting ingredients even are? What do these 'right people' even look like? Are they going to be very similar to each other or very different? How much is the training supposed to be customized for the individual? What parts of the curriculum should be standardized? 
  • Additional possibility: Maybe the CFAR training wouldn't bear edible fruit for another ten years after that person's initial exposure to CFAR? (And like, I'm leaning on this being somewhat true?) If this is the case, you're just stuck with slow feedback loops. (Additionally, consider the possibility that people who seem to be progressing 'quickly' might be doing this in a misleading way or your criteria for judging are quite wrong, causing you to make changes to your training that lead you astray.) 
  • Less hard a problem, but it adds complexity: How do you deal with the fact that people in this culture, especially rationalists, get all sensitive around being evaluated? You need to evaluate people, in the end, because you don't have the ability to train everyone who wants it, and not everyone is ready or worth the investment. But then people tend to get all fidgety and triggered when you start putting them in different buckets, especially when you're in a culture that believes strongly in individualism ("I am special, I have something to offer") and equality ("Things should be fair, everyone should have the same opportunities"). And also you're working with people who were socialized from a young age to identify with their own intelligence as a major part of their self-worth, and then they come into your community, feeling like they've finally found their people, only to be told: "Sorry, you're not actually cut out for this work. It's not about you."

Also:

  • The egregores that are dominating mainstream culture and the global world situation are not just sitting passively around while people try to train themselves to break free of their deeply ingrained patterns of mind. I think people don't appreciate just how hard it is to uninstall the malware most of us are born with / educated into (and which block people from original thinking). These egregores have been functioning for hundreds of years. Is the ground fertile for the art of rationality? My sense is that the ground is dry and salted, and yet we still make attempts to grow the art out of that soil. 
  • IMO, the effects that have led us to our current human-created global crises are the same ones that make it difficult to train people in rationality. So, y'all are up against a strong and powerful foe.

Honestly my sense is that CFAR was significantly crippled by one or more of these egregores (partially due to its own cowardice). But that's a longer conversation, and I'm not going to have it out here. 

//

All of this is just to give a taste of how difficult the original problems were that CFAR was trying to resolve. We're not in a world that's like, "Oh yeah, with your hearts and minds in the right place, you'll make it through!" Or even "If you just have the best thoughts compared to all the other people, you'll win!" Or even "If you have the best thoughts, a slick and effective team, lots of money, and a lot of personal agency and ability, you'll definitely find the answers you seek." 

And so the list of hypotheses + analyses above may make it sound like if CFAR had its shit more 'together', it would have done a better job. Maybe? How much better though? Realistically? 

As we move forward on this wild journey, it just seems to become clearer how hard this whole situation really is. The more collective clarity we have on the "actual ground-level situation" (versus internal ideas, hopes, wishes, and fears coloring our perspective of reality), the more confronting it all honestly is. The more existentially horrifying. And just touching THAT is hard (impossible?) for most people.

(Which is partially why I'm training at a place like MAPLE. I seem to be saner now about x-risk. And I get that we're rapidly running out of time without feeling anxious about that fact and without needing to reframe it in a more hopeful way. I don't have much need for hope, it seems. And it doesn't stop me from wanting to help.)

Gracefully correcting uncalibrated shame

Just noting here that Elizabeth wasn't at one of MAPLE's retreats (from what I understand; I'd never set foot at MAPLE at the time of her visit). MAPLE hosts a silent meditation week about once a month. The rest of the weeks are called Responsibility Weeks. While the residents are expected to meditate throughout the day during these Weeks (though it's really hard to, because they have to use computers and such), guests are not expected to. Guests can just experience a different way of living and being.

MAPLE has a handful of 'jock hippies'. Jock hippies believe things turn out all right generally. Their visceral experience is embodied. They often experience pleasurable sensations. They're happy despite a lot of turmoil. They like walking barefoot through nature, doing vigorous forms of exercise, and interacting with strangers. 

Elizabeth was on the phone with one such person, who explained things to her in a way that failed to take into account a more typical rationalist way of experiencing the world.

But it was good of Elizabeth to come and teach MAPLE something new, and MAPLE is always learning how to better engage with their guests. There are heated debates about this where people get passionate about giving guests a more comfortable experience vs. giving guests a more monastic experience. There is always a tension here, but I do think it's worth MAPLE understanding how to treat different people and know where they're coming from. 

MAPLE's 'demographic' is one of the most diverse (culturally) that I have seen (for something that is super niche and not mainstream or well-funded), and it brings up a lot of complex scenarios. Each different cultural demographic uses language and communication in different ways, and so lots of communication errors are possible. I believe trial and error learning is needed to grow in this area. 

But it would be nice if there were a way to feel more resolution with Elizabeth in particular. I will consider it myself, but, Elizabeth, if you wanted to let me know what would be beneficial for making things right, that would also be helpful. 

Unreal's Shortform

I'm in favor of totally resolving human greed and hatred, but this doesn't seem tractable to me either. (It is literally possible to do it within an individual, through a particular path, but it's not the path most choose.) 

Instead it seems more tractable to create and promote systems and cultures that heavily incentivize moderating greed and hate. 

Unreal's Shortform

Yeah, it seems like ... the rationalization might be sort of a cover-story for certain bad habits or patterns that they don't want to fix in themselves. shrug. I'm not a huge fan. 