The Fundamental Question

It has been claimed on this site that the fundamental question of rationality is "What do you believe, and why do you believe it?".

A good question it is, but I claim there is another of equal importance. I ask you, Less Wrong...

What are you doing?

And why are you doing it?

What am I doing?: Working at a regular job as a C++ programmer, and donating as much as possible to SIAI. And sometimes doing other useful things in my spare time.

Why am I doing it?: Because I want to make lots of money to pay for Friendly AI and existential risk research, and programming is what I'm good at.

Why do I want this?: Well, to be honest, the original reason, from several years ago, was "Because Eliezer told me to". Since then I've internalized most of Eliezer's reasons for recommending this, but this process still seems kinda backwards.

I guess the next question is "Why did I originally choose to follow Eliezer?": I started following him back when he still believed in the most basic form of utilitarianism: Maximize pleasure and minimize pain, don't bother keeping track of which entity is experiencing the pleasure or pain. Even back then, Eliezer wasn't certain that this was the value system he really wanted, but for me it seemed to perfectly fit my own values. And even after years of thinking about these topics, I still haven't found any other system that more closely matches what I actually believe. Not even Eliezer's current value system. And yes, I am aware that my value system means that an orgasmium shockwave is the best possible scenario for the future. And I still haven't found any logically consistent reason why I should consider that a bad thing, other than "but other people don't want that". I'm still very conflicted about this.

(off-topic: oh, and SPOILER: I found the "True Ending" to Three Worlds Collide severely disturbing. Destroying a whole planet full of people, just to KEEP the human ability to feel pain??? oh, and some other minor human values, which the superhappies made very clear were merely minor aesthetic preferences. That... really shook my "faith" in Eliezer's values...)

Anyway, the reason why I started following Eliezer was that even back then, he seemed like one of the smartest people on the planet, and he had a mission that I strongly believed in, and he was seriously working towards this mission, with more dedication than I had seen in anyone else. And he was seeking followers, though he made it very clear that he wasn't seeking followers in the traditional sense, but was seeking people to help him with his mission who were capable of thinking for themselves. And I desperately wanted a belief system that was better than the only other belief system I knew of at the time, which was christianity. And so I basically, um... converted directly from christianity to Singularitarianism. (yes, that's deliberate noncapitalization. somehow capitalizing the word "christianity" just feels wrong...)

And now the next question: "Why am I still following Eliezer?": Basically, because I still haven't found anyone to follow who I like better than Eliezer. And I don't dare to try to start my own competing branch of Singularitarianism, staying true to Eliezer's original vision, despite his repeated warnings about why this would be a bad idea... Though, um... if anyone else is interested in the idea... please contact me... preferably privately.

Another question is "What other options are worth considering?": Even if I do decide that it would be a good idea to stop following Eliezer, I definitely don't plan to stop being a transhumanist, and whatever I become instead will still be close enough to Singularitarianism that I might as well continue calling it Singularitarianism. And reducing existential risks would still be my main priority. So far the only reasons I know of to stop giving most of my income to SIAI are that maybe their mission to create Friendly AI really is hopeless, and maybe there's something else I should be doing instead. Or maybe I should be splitting my donations between SIAI and someplace else. But where? The Oxford Future of Humanity Institute? The Foresight Institute? The Lifeboat Foundation? No, definitely not the Venus Project or the Zeitgeist movement. A couple of times I asked SIAI about the idea of splitting my donations with some other group, and of course they said that donating all of the money to them would still be the most leveraged way for me to reduce existential risks. Looking at the list of projects they're currently working on, this does sound plausible, but somehow it still feels like a bad idea to give all of the money I can spare exclusively to SIAI.

Actually, there is one other place I plan to donate to, even if SIAI says that I should donate exclusively to SIAI. Armchair Revolutionary is awesome++. Everyone reading this who has any interest at all in having a positive effect on the future, please check out their website right now, and sign up for the beta. I'm having trouble describing it without triggering a reaction of aversion to cliches, or "this sounds too good to be true", but... ok, I won't worry about sounding cliched: They're harnessing the addictive power of social games, where you earn points, and badges, and stuff, to have a significant, positive impact on the future. They have a system that makes it easy, and possibly fun, to earn points by donating small amounts (99 cents) to one or more of several projects, or by helping in other ways: taking quizzes, doing some simple research, writing an email, making a phone call, uploading artwork, and more. And the system of limiting donations to 99 cents, and limiting it to one donation per person per project, provides a way to not feel guilty about not donating more. Personally, I find this extremely helpful. I can easily afford to donate the full amount to all of these projects, and spend some time on the other things I can do to earn points, and still have plenty of money and time left over to donate to SIAI. Oh, and so far it looks like donating small amounts to a wide variety of projects generates more warm fuzzies than donating large amounts to a single project. I like that.

It would be awesome if SIAI or LW or some of the other existential-risk-reducing groups could become partners of ArmRev, and get their own projects added to the list. Someone get on this ASAP. (What's that you say? Don't say "someone should", say "I will"? Ok, fine, I'll add it to my to-do list, with all of that other stuff that's really important but I don't feel at all qualified to do. But please, I would really appreciate it if someone else could help with this, or take charge of this. Preferably someone who's actually in charge at SIAI, or LW, or one of the other groups.)

Anyway, there's probably lots more I could write on these topics, but I guess I had better stop writing now. This post is already long enough.

Nothing at all against SIAI but

A couple of times I asked SIAI about the idea of splitting my donations with some other group, and of course they said that donating all of the money to them would still be the most leveraged way for me to reduce existential risks.

If you're in doubt and seeking expert advice, you should pick an expert who lacks really obvious institutional incentives to give one answer over others.

Regarding the rest of the comment, I found it kind of weird and something freaked me out about it, though I'm not sure quite what. That doesn't mean you're doing anything wrong; I might just have biases or assumptions that make what you're doing seem weird to me. I think it has something to do with your lack of skepticism or cynicism and the focus on looking for someone to follow that MatthewB mentioned. I guess your comment pattern-matches to things a very religious person would say; I'm just not sure if that means you're doing something wrong or if I'm having an averse reaction to a reasonable set of behaviors because I have irrationally averse reactions to things that look religious.

Yeah, I realized that it was silly for me to ask SIAI what they thought about the idea of giving SIAI less money, but I didn't know who else to ask, and I still didn't have enough confidence in my own sanity to try to make this decision on my own. And I was kinda hoping that the people at SIAI were rational enough to give an accurate and reasonably unbiased answer, despite the institutional incentives. SIAI has a very real and very important mission, and I would have hoped that its members would be able to rationally think about what is best for the mission, rather than what is best for the group. And the possibility remains that they did, in fact, give a rational and mostly unbiased answer.

The answer they gave was that donating exclusively to SIAI was the most leveraged way to reduce existential risks. Yes, there are other groups that are doing important work, but SIAI is more critically underfunded than they are, and the projects that we (yes, I said "we", even though I'm "just" a donor) are working on this year are critical for figuring out what the optimal strategies would be for humanity/transhumanity to maximize its probability of surviving into a post-Singularity future.

heh, one of these projects they're finally getting around to working on this year is writing a research paper examining how much existential-risk reduction you get for each dollar donated to SIAI. That's something I've really been wanting to know, and had actually been feeling kinda guilty about not making more of an effort to try to figure out on my own, or at least to try to get a vague estimate, to within a few orders of magnitude. And I had also been really annoyed that no one more qualified than me had already done this. But now they're finally working on it. yay :)

Someone from SIAI, please correct me if I'm wrong about any of this.

And yes, my original comment seemed weird to me too, and kinda freaked me out. But I think it would have been a bad idea to deliberately avoid saying it, just because it sounds weird. If what I'm doing is a bad idea, then I need to figure this out, and find what I should be doing instead. And posting comments like this might help with that. Anyway, I realize that my way of thinking sounds weird to most people, and I don't make any claim that this is a healthy way to think, and I'm working on fixing this.

And as I mentioned in another comment, it would just feel wrong to deliberately not say this stuff, just because it sounds weird and might make SIAI look bad. But that kind of thinking belongs to the Dark Arts, and is probably just a bad habit I had left over from christianity, and isn't something that SIAI actually endorses, afaik.

And I do, in fact, have lots of skepticism and cynicism about SIAI, and their mission, and the people involved. This skepticism probably would have caused me to abandon them and their mission long ago... if I had had somewhere better to go instead, or a more important mission. But after years of looking, I haven't found any cause more important than existential risk reduction, and I haven't found any group working towards this cause more effectively than SIAI, except possibly for some of the other groups I mentioned, but a preliminary analysis shows that they're not actually doing any better than SIAI. And starting my own group still looks like a really silly idea.

And yes, I'm aware that I still seem to talk and think like a religious person. I was raised as a christian, and I took christianity very seriously. Seriously enough to realize that it was no good, and that I needed to get out. And so I tried to replace my religious fanaticism with what's supposed to be an entirely non-religious and non-fanatical cause, but I still tend to think and act both religiously and fanatically. I'm working on that.

I also have an averse reaction to things that look religious. This is one of the many things causing me to have trouble with self-hatred. Anyway, I'm working on that.

Oh, and one more comment about cynicism: I currently think that 1% is an optimistic estimate of the probability that humanity/transhumanity will survive into a positive post-Singularity future, but it's been a while since I reviewed why I believe this. Another thing to add to my to-do list.

somehow it still feels like a bad idea to give all of the money I can spare exclusively to SIAI.

If you were Bill Gates, that might be a valid concern. (The "exclusively" part, not the "SIAI" part.)

Otherwise, it's most efficient to donate to just one cause. Especially if you itemize deductions.
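To spell out the arithmetic behind that advice, here is a minimal sketch under one big assumption: an individual donation is too small to change any charity's marginal impact per dollar, so returns are locally linear. The charity names, impact figures, and budget below are made-up placeholders, not estimates from anyone in this thread.

    # A minimal illustrative sketch (not anything SIAI or the commenters wrote);
    # the charity names, impact numbers, and budget are made up.
    # Assumption: an individual donation is too small to change each charity's
    # marginal impact per dollar, so returns are (locally) linear.

    def total_impact(allocation, impact_per_dollar):
        """Expected impact from splitting a budget across charities."""
        return sum(dollars * impact_per_dollar[name]
                   for name, dollars in allocation.items())

    impact_per_dollar = {"charity_A": 3.0, "charity_B": 2.5}  # hypothetical units per dollar
    budget = 1000.0

    split = {"charity_A": 500.0, "charity_B": 500.0}
    concentrated = {"charity_A": budget, "charity_B": 0.0}

    print(total_impact(split, impact_per_dollar))         # 2750.0
    print(total_impact(concentrated, impact_per_dollar))  # 3000.0

    # With linear returns, putting the whole budget into the single charity with
    # the highest marginal impact per dollar is always at least as good as any
    # split; only a donor big enough to exhaust a charity's funding gap (the
    # "Bill Gates" case above) needs to worry about diminishing returns.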

It may just be me, but why do you need to find someone to follow?

I have always found forging my own path through the wilderness to be far more enjoyable, and to yield far greater rewards, than following a path, no matter how small or large that path may be.

Well, one reason why I feel that I need someone to follow is... severe underconfidence in my ability to make decisions on my own. I'm still working on that. Choosing a person to follow, and then following them, feels a whole lot easier than forging my own path.

I should mention again that I'm not actually "following" Eliezer in the traditional sense. I used his value system to bootstrap my own value system, greatly simplifying the process of recovering from christianity. But now that I've mostly finished with that (or maybe I'm still far from finished?), I am, in fact, starting to think independently. It's taking a long time for me to do this, but I am constantly looking for things that I'm doing or believing just because someone else told me to, and then reconsidering whether these things are a good idea, according to my current values and beliefs. And yes, there are some things I disagree with Eliezer about (the "true ending" to TWC, for example), and things that I disagree with SIAI about ("we're the only place worth donating to", for example). I'll probably start writing more about this, now that I'm starting to get over my irrational fear of posting comments here.

Though part of me is still worried about making SIAI look bad. And I'm still worried that the stuff I've already posted may end up harming SIAI's mission (and my mission) more than it could possibly have helped. Though of course it would be a bad idea to try to hide problems that need to be examined and dealt with. And the idea of deliberately trying to hide information just feels wrong. It feels like Dark Arts. I should also mention that the idea of deliberately not saying things, in order to avoid making the group look bad, isn't actually something I was told by anyone from SIAI; I think it was a bad habit I brought with me from christianity.

And the idea of deliberately trying to hide information just feels wrong. It feels like Dark Arts.

If by 'dark arts' you mean 'non-rational methods of persuasion', such things may be ethically questionable (in general; not volunteering information you aren't obligated to provide almost certainly isn't) but are not (categorically) wrong. Rational agents win.

I like the way steven0461 put it:

...promoting less than maximally accurate beliefs is an act of sabotage. Don’t do it to anyone unless you’d also slash their tires, because they’re Nazis or whatever. Specifically, don’t do it to yourself.

I think I agree with both khafra and Nick.

I like this quote, and I've used it before in conversations with other people.

I think it's worth distinguishing between "underconfidence" and "lack of confidence" - the former implies the latter (although not absolutely), but under some circumstances you are justified in questioning your competence. Either way, it sounds like you're working on both ends of that balance, which is good.

Though part of me is still worried about making SIAI look bad. And I'm still worried that the stuff I've already posted may end up harming SIAI's mission (and my mission) more than it could possibly have helped. Though of course it would be a bad idea to try to hide problems that need to be examined and dealt with. And the idea of deliberately trying to hide information just feels wrong. It feels like Dark Arts. I should also mention that the idea of deliberately not saying things, in order to avoid making the group look bad, isn't actually something I was told by anyone from SIAI; I think it was a bad habit I brought with me from christianity.

I think this is good thinking.

good point about underconfidence versus lack of confidence, thanks

That puts it into an understandable context... I can't quite relate to having to shake off Christian beliefs. I was raised by a tremendously religious mother, but at about the age of 6 I began to question her beliefs, and by 14 I was sure that she was stark raving mad to believe what she did. So, I managed to keep from being brainwashed to begin with.

I've seen the results of people who have been brainwashed and who have not managed to break completely free from their old beliefs. Most of them swung back and forth between the extremes of bad belief systems (From born-again Christian to Satanist, and back, many times)... So, what you are doing is probably best for the time being, until you learn the tools needed to step off into the wilderness by yourself.

In my case, I knew pretty much from the beginning that something was seriously wrong. But since every single person I had ever met was a christian (with a couple of exceptions I didn't realize until later), I assumed that the problem was with me. The most obvious problem, at least for me, was that none of the so-called christians was able to clearly explain what a christian is, and what it is that I needed to do in order to not go to hell. And the people who came closest to giving a clear explanation all contradicted each other, and the answer changed depending on which questions I asked. So I guess I was... partly brainwashed. I knew that there was something really important I was supposed to do, and that people's souls were at stake (a matter of infinite utility/anti-utility!), but no one was able to clearly explain what it was that I was supposed to do. But they expected me to do it anyway, and made it sound like there was something wrong with me for not instinctively knowing what it was that I was supposed to do. There's lots more I could complain about, but I guess I had better stop now.

So it was pretty obvious that I wasn't going to be able to save anyone's soul by talking them into converting to christianity. And I was also similarly unqualified for most of the other things that christians are supposed to do. But there was still one thing I saw that I could do: live as cheaply as possible, and donate as much money as possible to the church, so that the people who claim to actually know what they're doing can just get on with doing it. And just be generally helpful when there was some simple everyday thing I could be helpful with.

Anyway, it wasn't until I went to university that I actually met any atheists who openly admitted to being atheists. Before then, I had heard that there was such a thing as an atheist, and that these were the people whose souls we were supposed to save by converting them to christianity, but Pascal's Wager prevented me from seriously considering becoming an atheist myself. Even if I assigned a really tiny probability to christianity being true, converting to atheism seemed like an action with an expected utility of negative infinity. But then I overheard a conversation in the Computer Science students' lounge. That-guy-who-isn't-all-that-smart-but-likes-to-sound-smart-by-quoting-really-smart-people was quoting Eliezer Yudkowsky. Almost immediately after that conversation, I googled the things he was talking about. I discovered Singularitarianism. An atheistic belief system, based entirely on a rational, scientific worldview, to which Pascal's Wager could be applied. (there is an unknown probability that this universe can support an infinite amount of computation; therefore there is an unknown probability that actions can have infinite positive or negative utility.) I immediately realized that I wanted to convert to this belief system. But it took me a few weeks of swinging back and forth before I finally settled on Singularitarianism. And since then I haven't had any desire at all to switch back to christianity. Though I was afraid that, because of my inability to stand up to authority figures, someone might end up convincing me to convert back to christianity against my will. Even now, years later, in scary situations when dealing with an authority figure who is a christian, part of me still sometimes thinks "OMG maybe I really was wrong about all this!"

Anyway, I'm still noticing bad habits from christianity that I'm still doing, and I'm still working on fixing this. Also, I might be oversensitive to noticing things that are similar between christianity and Singularitarianism. For example, the expected utility of "converting" someone to Singularitarianism. Though in this case you're not guaranteeing that one soul is saved, you're slightly increasing the probability that everyone gets "saved", because there is now one more person helping the efforts to help us achieve a positive Singularity.

Oh, and now, after reading LW, I realize what's wrong with Pascal's Wager, and even if I found out for certain that this universe isn't capable of supporting an infinite amount of computation, I still wouldn't be tempted to convert back to christianity.
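For anyone who wants the expected-utility version of that reasoning laid out, here is a minimal sketch; the probabilities and payoffs are made-up placeholders, and this is my own illustration rather than anything from the original comments. It shows how any nonzero probability of an infinite payoff swamps every finite term, and how a rival hypothesis with an opposite infinite payoff leaves the comparison undefined, which is one standard way the wager falls apart.

    # A minimal sketch of the expected-utility version of Pascal's Wager, and of
    # the standard "rival infinities" objection. The probabilities and payoffs
    # are made-up placeholders, not anyone's actual estimates.

    INF = float("inf")

    def expected_utility(outcomes):
        """Sum of probability * utility over mutually exclusive outcomes."""
        return sum(p * u for p, u in outcomes)

    # Naive wager: any nonzero probability of an infinite payoff swamps every
    # finite consideration, no matter how small that probability is.
    believe    = expected_utility([(0.001, INF),  (0.999, -10.0)])   # inf
    disbelieve = expected_utility([(0.001, -INF), (0.999, +10.0)])   # -inf
    print(believe, disbelieve)

    # Once a rival hypothesis also promises an infinite payoff for the opposite
    # choice, the sum contains inf + (-inf), which is undefined (nan), so the
    # wager no longer tells you what to do.
    believe_with_rival = expected_utility([(0.001, INF), (0.001, -INF), (0.998, -10.0)])
    print(believe_with_rival)  # nan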

Random trivia: I sometimes have dreams where a demon, or some entirely natural thing that for some reason is trying to look like a demon, is trying to trick or scare me into converting back to christianity. And then I discover that the "demon" was somehow sent by someone I know, and end up not falling for it. I find this amusingly ironic.

As usual, there's lots more I could write about, but I guess I had better stop writing for now.

But it took me a few weeks of swinging back and forth before I finally settled on Singularitarianism.

Here's a quote from an old revision of Wikipedia's entry on The True Believer that may be relevant here:

A core principle in the book is Hoffer's insight that mass movements are interchangeable; he notes fanatical Nazis later becoming fanatical Communists, fanatical Communists later becoming fanatical anti-Communists, and Saul, persecutor of Christians, becoming Paul, a fanatical Christian. For the true believer the substance of the mass movement isn't so important as that he or she is part of that movement.

And from the current revision of the same article:

Hoffer quotes extensively from leaders of the Nazi and communist parties in the early part of the 20th Century, to demonstrate, among other things, that they were competing for adherents from the same pool of people predisposed to support mass movements. Despite the two parties' fierce antagonism, they were more likely to gain recruits from their opposing party than from moderates with no affiliation to either.

Can't recommend this book enough, by the way.

Thanks for the link, and the summary. Somehow I don't find that at all surprising... but I still haven't found any other cause that I consider worth converting to.

At the time I converted, Singularitarianism was nowhere near a mass movement. It consisted almost entirely of the few of us in the SL4 mailing list. But maybe the size of the movement doesn't actually matter.

And it's not "being part of a movement" that I value, it's actually accomplishing something important. There is a difference between a general pool of people who want to be fanatical about a cause, just for the emotional high, and the people who are seriously dedicated to the cause itself, even if the emotions they get from their involvement are mostly negative. This second group is capable of seriously examining their own beliefs, and if they realize that they were wrong, they will change their beliefs. Though as you just explained, the first group is also capable of changing their minds, but only if they have another group to switch to, and they do this mostly for social reasons.

Seriously though, the emotions I had towards christianity were mostly negative. I just didn't fit in with the other christians. Or with anyone else, for that matter. And when I converted to Singularitarianism, I didn't exactly get a warm welcome. And when I converted, I earned the disapproval of all the christians I know. Which is pretty much everyone I have ever met in person. I still have not met any Singularitarian, or even any transhumanist, in person. And I've only met a few atheists. I didn't even have much online interaction with other transhumanists or Singularitarians until very recently. I tried to hang out in the SL4 chatroom a few years ago, but they were openly hostile to the way I treated Singularitarianism as another belief system to convert to, another group to be part of, rather than... whatever it is that they thought they were doing instead. And they didn't seem to have a high opinion of social interaction in general. Or maybe I'm misremembering this.

Anyway, I spent my first approximately 7 years as a Singularitarian in almost complete isolation. I was afraid to request social interaction for the sake of social interaction, because somehow I got the idea that every other Singularitarian was so totally focused on the mission that they didn't have any time at all to spare to help me feel less lonely, and so I should either just put up with the loneliness or deal with it on my own, without bothering any of the other Singularitarians for help. The occasional attempt I made to contact some of the other Singularitarians only further confirmed this theory. I chose the option of just putting up with the loneliness. That may have been a bad decision.

And just a few weeks ago, I found out that I'm "a valued donor" to SIAI. Though I'm still not sure what this means. And I found out that other Singularitarians do, in fact, socialize just for the sake of socializing. And I found out that most of them spend several hours a day "goofing off". And that they spend a significant percentage of their budget on luxuries that technically they could do without, without having a significant effect on their productivity. And that most of them live generally happy, productive, and satisfying lives. And that it was silly of me to feel guilty for every second and every penny that I wasted on anything that wasn't optimally useful for the mission. In addition to the usual reasons why feeling guilty is counterproductive.

Anyway, things are finally starting to get better now, and I don't think I'll accomplish anything by complaining more.

Also, most of this was probably my own fault. It turns out that everyone living at the SIAI house was totally unaware of my situation. And this is mostly my fault, because I was deliberately avoiding contacting them, because I was afraid to waste their time. And wasting the time of someone who's trying to save the universe is a big no-no. I was also afraid that if I tried to contact them, then they would ask me to do things that I wasn't actually able to do, but wouldn't know for sure that I wasn't able to do, and would try anyway because I felt like giving up wasn't an option. And it turns out this is exactly what happened. A few months ago I contacted Michael Vassar, and he started giving me things to help with. I made a terrible mess out of trying to arrange the flights for the speakers at the 2009 Singularity Summit. And then I went back to avoiding any contact with SIAI. Until Adelene Dawner talked to them for me, without me asking her to. Thanks Ade :)

Um... one other thing I just realized... well, actually Adelene Dawner just mentioned it in Wave, where I was writing a draft of this post... the reason why I haven't been trying to socialize with people other than Singularitarians is... I was afraid that anyone who isn't a Singularitarian would just write off my fanaticism as general insanity, and therefore any attempt to socialize with non-Singularitarians would just end up making the Singularitarian movement look bad... I already wrote about how this is a bad habit I carried with me from christianity. It's strange that I hadn't actually spent much time thinking about this; I just somehow wrote off socializing with non-Singularitarians as not an option, and ended up not thinking about it after that. I still made a few careful attempts at socializing with non-Singularitarians, but the results of these experiments only confirmed my suspicions.

Oh, and another thing I just realized: Confirmation Bias. These experiments were mostly invalid, because they were set up to detect confirming evidence of my suspicions, but not set up to be able to falsify them. oops. I made the same mistake with my suspicions that normal people wouldn't be able to accept my fanatical Singularitarianism, my suspicions that the other Singularitarians are all so totally focused on the mission that they don't have any time at all for socializing, and also my suspicions that my parents wouldn't be able to accept my atheism. yeah, um, oops. So I guess it would be really silly of me to continue blaming this situation on other people. Yes, it may have been theoretically possible for someone else to notice and fix these problems, but I was deliberately taking actions that ended up preventing them from having a chance to do so.

There's probably more I could say, but I'll stop writing now.

um... after reviewing this comment, I realize that the stuff I wrote here doesn't actually count as evidence that I don't have True Believer Syndrome. Or at least not conclusive evidence.

oh, and did I mention yet that I also seem to have some form of Saviour Complex? Of course I don't actually believe that I'm saving the world through my own actions, but I seem to be assigning at least some probability to the possibility that my actions may end up making the difference between whether our efforts to achieve a positive Singularity succeed or fail.

but... if I didn't believe this, then I wouldn't bother donating, would I?

Do other people manage to believe that their actions might result in making the difference between whether the world is saved or not, without it becoming a Saviour Complex?

PeerInfinity, I don't know you personally and can't tell whether you have True Believer Syndrome. I'm very sorry for provoking so many painful thoughts... Still. Hoffer claims that the syndrome stems from lack of self-esteem. Judging from what you wrote, I'd advise you to value yourself more for yourself, not only for the faraway goals that you may someday help fulfill.

no need to apologise, and thanks for pointing out this potential problem.

(random trivia: I misread your comment three times, thinking it said "I know you personally can't tell whether you have True Believer Syndrome")

as for the painful thoughts... It was a relief to finally get them written down, and posted, and sanity-checked. I made a couple attempts before to write this stuff down, but it sounded way too angry, and I didn't dare post it. And it turns out that the problem was mostly my fault after all.

oh, and yeah, I am already well aware that I have dangerously low self-esteem. but if I try to ignore these faraway goals, then I have trouble seeing myself as anything more valuable than "just another person". Actually I often have trouble even recognizing that I qualify as a person...

also, an obvious question: are we sure that True Believer Syndrome is a bad thing? or that a Saviour Complex is a bad thing?

random trivia: now that I've been using the City of Lights technique for so long, I have trouble remembering not to use a plural first-person pronoun when I'm talking about introspective stuff... I caught myself doing that again as I checked over this comment.

also, an obvious question: are we sure that True Believer Syndrome is a bad thing? or that a Saviour Complex is a bad thing?

I'm pretty sure of that. Not because of what it does to your goals, but because of what it does to you.

Please forgive my ignorance, or possibly my deliberate forgetfulness, but... can you please remind me what you think it does to me?

Several comments above you wrote that both Christianity and Singularitarianism drained you of the resources you could've spent on having fun. As far as I can understand, neither ideology gave you anything back.

At first I misread what you said and was about to reply with this paragraph:

oh. that's mostly because I was Doing It Wrong. I was pushing myself harder than I could actually sustain in the long term, and that ended up being counterproductive to singularitarianism. ( and also counterproductive to fun, though I still don't consider fun to be of any significant inherent value, compared to the value of the mission)

But then I noticed that when I read your comment, I was automatically adding the words "and this would be bad for the mission", which probably isn't what you meant.

and I might as well admit that as I was thinking about what else to say in reply, everything I thought of was phrased in terms of what mattered to singularitarianism. I was going to resist the suggestion that I should be paying any attention to what the ideology could give back. I was going to resist the suggestion that fun had any use other than helping me stay focused on the mission, if used in moderation.

And I'm still undecided about whether this reaction is a bad thing, because I'm still measuring good and bad according to singularitarian values, not according to selfish values. And I would still resist any attempt to change my values to anything that might conflict with singularitarianism, even in a small way.

ugh... even if everyone from SIAI told me to stop taking this so seriously, I would probably still resist. And I might even consider this as a reason to doubt how seriously they are taking the mission.

ok, so I guess it would be silly of me to claim that I don't have a true believer's complex, or a saviour complex, or just fanaticism in general.

though I still need to taboo the word "fanaticism"... I'm still undecided about whether I'm using it as if it means "so sincerely dedicated that the dedication is counterproductive", or "so sincerely dedicated that anyone who hasn't tried to hack their own mind into being completely selfless would say that I'm taking this way too far".

By the first definition, I would of course consider my fanaticism to be counterproductive and harmful. But I would naturally treat the second definition as an example of other people not taking the mission seriously enough.

And now I'm worrying that all this stuff I'm saying is actually not true, and is really just an attempt to signal how serious and dedicated I am to the mission. Actually, yeah, I would be really surprised if there wasn't any empty signalling going on, and if the signalling wasn't causing my explanations to be inaccurate.

In other news, I'm really tired at the moment, but I'm pushing myself to type this anyway, because it feels really important and urgent.

I think there was more I wanted to say, but whatever it was, I forget it now, and this comment is already long, and I'm tired, so I'll stop writing for now.

also, an obvious question: are we sure that True Believer Syndrome is a bad thing?

Say it was the case that promoting a singularity was a bad idea and that, in particular, SIAI did more harm than good. If someone had compelling evidence of this and presented it to you, would you be capable of altering your beliefs and behavior in accordance with this new data? I take it the True Believer would not, and that we can all agree that would be a bad thing.