Trans rights! End all suffering!
Apparently the left-leaning stuff I wrote on here got censored and only the shit I now disagree with remains.
What the hell? It's just a more specific version of the point in Inadequate Equilibria, and don't you want to know if you can do something better?
Presumably the reason people are roleplaying everything in the first place is that stopping the roleplay gets you seen badly, and being seen badly hurts if you don't have enough emotional resilience. Here's my best attempt at how to break people out of this.
Man, most people are roleplaying everything. It's not fixable just by telling them what concrete stuff they're doing wrong, because they're still running on the algorithm of roleplaying things. Which is why rationality, an attempted account of how to not do stuff wrong, ended up as a social club: it didn't directly address that people are roleplaying everything anyway.
Nice, but the second paper is less on track: its idea is more "people, society, etc. coerce you into doing things you don't want" than "long- vs. short-term preferences".
Not something you'll see in papers, but the point of willpower is to limit the amount of time you spend doing stuff you don't want to do. So your community has some morality that isn't convenient for you? That's why it costs willpower to follow that morality. Your job is tiring? Maybe deep down you don't believe it's serving your interests.
If you have a false belief about what you want, e.g. "I actually want to keep this prestigious position because yay prestige, even though I get tired all the time at work", well, that's a thing a lot of people end up believing, because nobody told them to use "things that make me tired" as a proxy for "things I don't want".
Obviously this has nothing to do with e.g. blood glucose levels.
If you want to spend time predictably spinning in circles in your analysis because you can't bring yourself to believe someone is lying, be my guest.
As for the specific authors: the individual reports seem fine in themselves, and as for the geoengineering one, I know a guy who did a PhD under the author and says he's generally trustworthy (I recall Vaniver was in his PhD program too). What I'm saying is: the specific reports, e.g. Bickel's report on geoengineering, seem fine, but Lomborg's synthesis of them is shit, and you're obscuring that with your niceness-and-good-faith approach.
Because they did the analysis and then didn't rank shit in order.
> Further down the list, we find a very controversial project, that is geo-engineering to reduce the intensity of incoming solar radiation to counteract global warming. According to a background paper, such investments would give a return rate of about 1,000. In spite of this enormous return rate, this is given moderate priority, apparently because it is deemed rather uncertain if this will actually work as intended.
> The lowest ranking accepted project, project no. 16, is called "Borehole and public hand pump intervention". This has an estimated benefit-cost-ratio of less than 3.4.
> Next, we come to priority no. 17, the highest ranking not-accepted project. This is "Increased funding for green energy research and development". According to the authors of the background paper, this has benefit-cost-ratios of 10 or more if the time horizon is slightly more than 1 decade. It is therefore a bit strange that this is placed below a project with a clearly less favourable benefit-cost-ratio.
Do your own research if you disagree. But if you use "apparently because it is deemed rather uncertain if this will actually work as intended" as an excuse to rate something poorly because you wanted to anyway, rather than doing more research and updating the estimate, or even just making a guess, then wtf?
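The complaint above reduces to simple arithmetic: once you have benefit-cost ratios, a consistent priority list is just a descending sort, and any rejected project with a higher BCR than an accepted one is an inversion. A minimal sketch; only the 3.4 and ~10 figures and the project names come from the quoted review, everything else is hypothetical:

```python
# Sketch: a consistent ranking sorts projects by benefit-cost ratio (BCR),
# descending. An "inversion" is a rejected project whose BCR beats an
# accepted project's BCR. Figures 3.4 and 10.0 are from the quoted review;
# the data structure is made up for illustration.

projects = [
    # (name, BCR, accepted_by_panel)
    ("Borehole and public hand pump intervention", 3.4, True),
    ("Green energy R&D funding", 10.0, False),  # "10 or more" per the background paper
]

# What a consistent ranking would look like: highest BCR first.
consistent_order = sorted(projects, key=lambda p: p[1], reverse=True)

# Flag inversions: rejected projects that outscore accepted ones.
inversions = [
    (rejected[0], accepted[0])
    for rejected in projects if not rejected[2]
    for accepted in projects if accepted[2] and accepted[1] < rejected[1]
]
print(inversions)  # the green-energy/borehole pair is exactly such an inversion
```

This is the whole of the critique in two lines of logic: the panel produced the BCRs, then published a priority order that a plain sort would have rejected.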
We are not playing "is this plausibly defensible"; we are playing "what was this person's algorithm, and are they systematically lying?"
Responding to your Dehaene book review and IFS thoughts as well as this:
On Dehaene: I read the 2018 version of Dehaene's Consciousness and the Brain a while ago and would recommend it as a good intro to cognitive neuroscience; your summary looks correct.
On meditation: it's been said before, but >90% of people reading this are going to be high on "having models of how their brain works" and low on "having actually sat down and processed their emotions through meditation or IFS or whatevs". Doubly true for all the depressed Berkeley rationalists.
Oh, and fun thing: surely you've heard the idea that "pretty much all effective therapy and meditation and shit is just helping people sit down until they process their emotions instead of running from them like usual". Well, here's IFS being used exactly that way; see 4:51-5:32.
For the love of the spark, fucking don't. At least separate yourself from the social ladder of EA and learn the real version of rationality first.
Or: ignore that advice, but at least don't do the actual worldwide MCB (marine cloud brightening) implementation that costs a billion a year. Talk with the scientists who worked on it and figure out how MCB could be done most efficiently. Then get things to the point of having a written plan: "hey government, here's exactly how you can do MCB if you want; execute this plan as written if/when you choose". Do a test run over a small area, and iterate and improve on the technology. Governments or big NGOs are more likely to do it if it's fleshed out, e.g. it's lower risk from their POV.
Thanks! This all sounds right. "CCC has interesting heresies": was there stuff other than MCB and global poverty? It's an interesting parallel to EA: both have interesting heresies but are ultimately wrong about some key assumption (that there's room for more funding, and that MCB is sufficient to stop all climate change, respectively), and both have a fetish for working within systems rather than trying to change them at all.
Kind of a shame that leftists mostly aren't approaching the "how can we change the systems that will undo any progress we make" problem with an effectiveness mindset, though at least these people are.