Emiya · 2y · 230

My working theory is that Putin could be worried about some kind of internal threat to himself and his power.

He's betting a lot on his image as a strong, dangerous leader to stay afloat. However, the constant Russian propaganda that props up that image was becoming more widely recognised for what it is, and less effective.

Europe has also been trying for a while to rid itself of the Russian influence exerted through gas, and would likely have managed it in a few more years. Then it would have been free to be less accepting of his human rights abuses.

Ukraine joining NATO would have made him look extremely weak, and it would have made it easier to make him look weak in the future.

 

Once his strong image faded, he might have worried that reformist forces within Russia could manage to oust him from office through an actual election and mass protests, if the ball got rolling enough; or he might have worried about someone taking a more direct approach to eliminating him (he has killed enough people to be extremely worried about being murdered, I think).

 

So this is his extreme move to deny weakness. Better to be seen as the tyrant who's willing to do anything if provoked than as the former strongman who can be removed from office.

Emiya · 2y · 10

Indeed, including the people who willingly caused it. But profiting from a problem is not the same as fixing it.

Emiya · 3y · 30

Since I wrote my comment, I've had plenty of chances to prod at people's apathy about acting against imminent, horrible doom.

I do believe a large obstacle is that going "well, maybe I should do something about it, then. Let's actually do that" requires a sudden level of mental effort and responsibility that's... well, not quite as unlikely as oxygen turning into gold, but you shouldn't just expect people to do it (it took me a ridiculous amount of time before I started).

People are going to need a lot of prodding, or an environment where taking personal responsibility for a collective crisis is the social norm, to get moving. Ten million would count as a lot of prodding, yeah. 100k... eh, I'd guess lots of people would still jump at that, but not many of those who are already paid that amount or more.

So a calculation like "I can enjoy my life more by doing nothing; lots of other people can try to save the world in my place" might be involved, even if not explicitly. It's a mixture of the Tragedy of the Commons and of Bystander Apathy, two psychological mechanisms with plenty of literature behind them.

Emiya · 3y · 40

> She gave me the answer of someone who had recently stopped liking fritos through an act of will. Her answer went something like this: "Just start noticing how greasy they are, and how the grease gets all over your fingers and coats the inside of the bag. Notice that you don't want to eat things soaked in that much grease. Become repulsed by it, and then you won't like them either."
>
> This woman's technique stuck with me. She picked out a very specific property of a thing she wanted to stop enjoying and convinced herself that it repulsed her.

I completely stopped smoking four years ago with the exact same method. It's pretty powerful; I'm definitely making a technique out of this.

Emiya · 3y · 10

I think the outstanding progress I made last year in improving my rationality and starting to work on real problems was mostly due to megalomaniac beliefs that were somewhat compartmentalised, but that I could feel at a gut level each time I had to start working.

Lately, as a result of my progress, I've started slowing down. I came to terms with these megalomaniac beliefs and realised at a gut level that they weren't accurate, so a huge chunk of my drive faded, and my predictions about my goals updated to match what I felt I could achieve with the drive I actually had. This happened even though I knew that destroying those beliefs was a sign I had really improved and was learning how hard it actually is to do world-changing stuff...

I'll definitely give this a trial run, trying to chain down those beliefs and pull them out as fuel when I need to.

Emiya · 3y · 10

Hm... I guess "holy madman" is too vague a definition to have a rational debate about? I had interpreted it as "sacrifice everything that won't negatively affect your utility function later on". So the interpretation I imagined was someone who won't leave himself an inch of comfort beyond what's needed to keep the quality of his work constant.

I see slack as leaving yourself enough comfort that you're ready to use your free energy in ways you can't see at the moment, so I guess I was automatically assuming a "holy madman" would optimise for outputting his current best effort over the long term, rather than sacrificing some current effort to bet on future chances to improve his future output.

I'd define someone who leaves himself this level of slack as making a serious or full effort, but not as a holy madman; though I guess this doesn't mean much.

If I were to try to summarise my thoughts on what would happen in reality if someone tried each of these options... I think the slack one would work better in general, both by avoiding pitfalls and by better exploiting your potential for growth.

 

I still feel there's a lot of danger to oneself in trying to take ideas seriously, though. If you start acting as if it's your responsibility to solve a problem that's killing people, the moment you lose your grip on your thoughts is the moment you cut yourself badly, at least in my experience.

Lately I've managed to reduce the harm some recurrent thoughts were doing by focusing on the distinction between 1) legitimately wanting A and planning/acting to achieve A, and 2) my worries about not being able to get A, or my distress that things are currently not A. I tell myself that 2) doesn't help me get what I want in the least, and that I can still make a full effort toward 1), likely a better one, without paying much attention to 2).

(I'm afraid I've started to rant slightly from this point on. I'm leaving it in because I still feel it might be useful.)


This strategy worked for my gender transition.
I'm not sure how I'd react if I tried telling myself I shouldn't care/feel bad/worry when people die because I'm not managing to fix the problem. Even though I KNOW that worrying about people dying hinders my effort to fix it, since feeling sick and worried and tired doesn't in any way help with actually working on the problem, I still don't trust my corrupted hardware not to start running some guilt trip against me because I'm trying to be callous, in a sense that isn't utilitarian at all, by trying not to care/feel bad/worry about something like that.


Also, as a personal anecdote about possible pitfalls: trying to take personal responsibility for a global problem drained my resources in ways I couldn't easily have foreseen. When I got jumped by an unrelated problem about my gender, I found myself without the emotional resources to deal with both stresses at once, so some recurrent thoughts started blaming me for letting a personal problem, one that was in no way as bad as being dead and didn't register at all next to a large number of deaths, interfere with my attempt to work on something actually relevant. I realised immediately that this was a stupid and unhealthy thing to think, but that did little to stop it, and climbing out of that pit of stress and guilt took a while.

In short, my emotional hardware is stupid and buggy, and it irritates me to no end how it can just go ahead and ignore my attempts to think sanely about stuff.

I'm not sure if I'm just particularly bad at this, or if my expectations are too high. An outside view would likely tell me that it's ridiculous to expect to go from "lazy and detached" to "saving the world" (read: reducing X-risk) while effortlessly holding at bay emotional problems that would trip up most people. I'd certainly tell anyone else that. On the other hand, it just feels like a stupid thing to fail at.

(end of the rant)

 

> (in contrast to me; I'm closer to the standard 40 hours)

Can I ask if you have some sort of external force that makes you put in these hours? If not, any advice on how to manage it?

I come from a really long tradition of not doing any work whatsoever, and so far I'm struggling to meet my current goal of 24 hours (partly because the only deadlines are the ones I manage to set for myself... and for the reasons I guess I've explained above).

Getting to this point was a massive improvement, but again, I feel like I'm exceptionally bad at working hard.

Emiya · 3y · 130

I think the approaches based on being a holy madman greatly underestimate the difficulty of being a value maximiser running on corrupted, basic human hardware.

I'd be extremely skeptical of anyone who claims to have found a way to truly maximise their utility function, even if they claim to have avoided all the obvious pitfalls of burning out and so on.

It would be extremely hard to reconcile "put forth your full effort" with staying rational enough to notice that you're burning out, or that you're stuck on a suboptimal route because you aren't leaving yourself enough slack to spot better opportunities.

 

The detached academic seems to me an odd way to describe Scott Alexander, who seems to make a really effective effort to spread his values and live his life rationally. Most of the issues he talks about seem pretty practical and relevant to him, even if he often pursues whatever makes him curious and isn't dropping everything to work on AI (that is, to maximise the number of competent people who would work on it).

 

I'm currently nine months into an attempt to move from detached lazy academic to making an extraordinary effort.

So far, every attempt to accurately predict how much of a full effort I can make without suffering backlash that makes me worse off in the following period has failed.

Lots of my plans have failed, so going along with plans that required me to make sacrifices, as taking an idea Seriously would require, would have left me at a serious loss.

What worked best and produced the most results was keeping a curious attitude toward plans and subjects related to my goal, studying to increase my competence in related areas even when I don't see any immediate way they could help, and monitoring how much "weight" I'm putting on the activities that produce the results I need.

I feel I started out unbelievably bad at working seriously at anything, but in nine months I got more results than in my whole life before (in a broad sense, not just related to my goal), and I feel like I went up a couple of levels.

I try to avoid any state that resembles a "holy madman", for fear of crashing hard, and I notice that what I'm doing already makes me pass as one even to my most informed friends on related subjects, when I don't censor myself to look normally modest and uninterested.

 

It might just be that I'm at such a low level in the skill of "actually working" that anything that would work great for a functional adult with a good work ethic is deadly to me.

But I'd strongly advise anyone trying the holy madman path to actively pump for as much "anti-holy-madman-ness" as they can, since making a full effort to maximise for something seems to me the surest way for your ambition to burn through whatever defences your naive, optimistic plans think they have put in place to protect your rationality and your mental health.

 

Cults are bad; becoming a one-man cult is entirely possible and slightly worse.

Emiya · 3y · 30

The review seems pretty balanced and interesting; however, the bit about Bailey struck me as really misguided.

I'll try to explain why. I apologise if at times I come off as angry, but the whole issue of autogynephilia annoys me both on a personal level, as a trans person, and on a professional level, as a psychology graduate and scientist. Alice Dreger seems to have massively botched this part of her work.

> In 2006, Dreger decided to investigate the controversy around J. Michael Bailey's book The Man Who Would be Queen. The book is a popularized account of research on transgenderism, including a typology of transsexualism developed by Ray Blanchard. This typology differentiates between homosexual transsexuals, who are very feminine boys who grow up into gay men or straight trans women, and autogynephiles, men who are sexually aroused by imagining themselves as women and become transvestites or lesbian trans women.

> Bailey's position is that all transgender people deserve love and respect, and that sexual desire is as good a reason as any to transition. This position is so progressive that it could only cause outrage from self-proclaimed progressives.

Bailey's position caused outrage in nearly every trans woman who read the book or heard of the theory, and in many other trans people who felt delegitimised and misrepresented by its implications.

If you are transgender, you are suffering from gender dysphoria, and you aren't transitioning for sexual reasons at all, though your sexual health will often improve. You are doing what science shows to be the one thing that resolves the symptoms that are ruining your life and making you miserable.

But then someone who's not trans comes along and says "no, it's really a sex thing", based on a single paper that presented no evidence whatsoever.

This person, rather than rigorously testing the theory with careful research (which is what everyone should do, especially someone who doesn't feel what trans women feel and is therefore extremely clueless about the subject, since it's really easy to misunderstand a sensation your brain isn't capable of feeling), bases one of the book's two clusters mostly on a single case study: a trans woman whose sex life isn't remotely representative of the average trans woman's, but who provides a very vivid, very peculiar account of her sexual practices. The rest of the "evidence" is just unstructured observations and interviews.

The book doesn't talk at all about how most trans people, whether men, women, or non-binary, discover they are trans, and it doesn't describe their internal experience accurately at all. It instead presents all trans women as motivated by sex, and half of them by sexual tendencies that psychology depicts as pathological.

And then, somehow, this completely unfounded theory became one of the best-known theories about trans women.

So, if you are a trans woman, the best case is that your extremely progressive friends and family come to you and say "oh, we didn't know it was just a sex thing; you could have told us you had these very weird sexual tendencies rather than making up all that stuff about how your body, and society treating you like a man, makes you feel horrible. It's fine, we understand and love you anyway".

In the worse and more common case, your friends, family, work associates, and so on aren't extremely progressive. They still believe Blanchard's and Bailey's theory about you, though.

And then, when the trans community starts yelling more or less in unison "what the hell?!" at what Bailey wrote in his book, the best response he can come up with is to say that the trans women attacking him are in a narcissistic rage, being narcissists whose pride has been wounded by the truths he wrote, and that they are autogynephiles in denial.

 

> Bailey attracted the ire of three prominent transgender activists who proceeded to falsely accuse him of a whole slew of crimes and research ethics violations. The three also threatened and harassed anyone who would defend Bailey; this group included mostly a lot of trans women who were grateful for Bailey's work, and Alice Dreger.

I don't know whether some transgender women tried to defend the book, but "a lot of trans women" seems a more accurate description of the book's detractors than of its supporters.

I'm aware that the three activists mentioned went far beyond anything that could be justified. But presenting them as the only critics he received is completely wrong, because there was a huge number of wounded people who saw their lives get worse because of the book.

 

Autogynephilia was popularised as a theory mostly by Bailey's book, and trans-exclusionary radical feminist groups, which are currently doing huge damage to trans rights and healthcare, use it as one of their main arguments to delegitimise trans women, routinely attacking trans women with it. Even if Bailey's intentions were good, he failed miserably and produced far more harm than anything else.

Emiya · 3y · 10

I'm not 100% sure I understood the first paragraph; could you clarify it for me if I got it wrong?

Essentially, the "efficient-markets-as-high-status-authorities" mindset I was trying to describe seems to me to work as follows:

Given a problem A, say providing life-saving medicine to the maximum number of people, it assumes that letting agents motivated by profit act freely, unrestricted by regulations or policies (even ones aimed at fixing problem A), would provide that medicine to more people than an intentional government policy of providing it to the maximum number of people.

The market doesn't seem to have a utility function in this model, but every agent in the market (at least, every agent able to survive in it) is driven by a utility function that just maximises profit.

Part of the reason for the assumption that a "free market of agents motivated by profit" should be so good at producing solutions to problem A (saving lives with medicine) is that the "free market" is awesomely good at pricing actions and at finding ways to make profits, because many agents are each trying their best at different approaches to making a profit, and everything that works gets copied. (If anyone holds a roughly similar theory and feels I've butchered or misrepresented the reasoning involved, you're welcome to state it correctly; I'm genuinely interested.)

 

My main objection is that I fail to see how this differs from asking an unaligned AI, one that's not superintelligent but still a lot smarter than you, to get your mother out of a burning building so that you'll press the reward button the AI wants you to press.

 

If I understood your first paragraph correctly, we are both generally skeptical that a market of agents set on maximising profit would be, on average across many different possible cases, good at generating value other than profit.

 

Thank you for clarifying the distinction between unregulated and free.

I was aware that one doesn't lead to the other, but I'm now unsure how many of the people I've talked to about this had the distinction in mind.
I've seen a lot of arguments for deregulation in the political press that appeal to the idea of the "free market", so I think I usually assumed that someone arguing for one of these positions would take a free market to be an unregulated one and not foresee this obvious problem.
