The Wannabe Rational

I have a terrifying confession to make: I believe in God.

This post has three prongs:

First: This is a tad meta for a full post, but do I have a place in this community? The abstract, non-religious aspect of this question can be phrased, "If someone holds a belief that is irrational, should they be fully ousted from the community?" I can see a handful of answers to this question and a few of them are discussed below.

Second: I have nothing to say about the rationality of religious beliefs. What I do want to say is that the rationality of a particular irrational person is not fully settled once their irrationality has been exposed. They may be underneath the sanity waterline, but there are multiple levels of rationality hell. Some are deeper than others. This part discusses one way to view irrational people in a manner that encourages growth.

Third: Is it possible to make the irrational rational? Is it possible to take those close to the sanity waterline and raise them above? Or, more personally, is there hope for me? I assume there is. What is my responsibility as an aspiring rationalist? Specifically, when the community complains about a belief, how should I respond?

My Place in This Community

So, yeah. I believe in God. I figure my particular beliefs are a little irrelevant at this point. This isn't to say that my beliefs aren't open for discussion, but here and now I think there are better things to discuss.  Namely, whether talking to people like me is within the purpose of LessWrong. Relevant questions have to do with my status and position at LessWrong. The short list:

  1. Should I have kept this to myself? What does an irrational person gain by confessing their irrationality? (Is this even possible? Is this post an attempted ploy?) I somewhat expect this post and the ensuing discussion to completely wreck my credibility as a commentator and participant.
  2. Presumably, there is a level of entry to LessWrong that is enforced. Does this level include filtering out certain beliefs and belief systems? Or is the system merit-based via karma and community voting? My karma is well above the level needed to post and my comments generally do better than worse. A merit-based system would prevent me from posting anything about religion or other irrational things, but is there a deeper problem? (More discussion below.) Should LessWrong /kick people who fail at rationality? Who makes the decision? Who draws the sanity waterline?
  3. Being religious, I assume I am far below the sanity waterline that the community desires. How did I manage to scrape up over 500 karma? What have I demonstrated that would be good for other people to demonstrate? Have I acted appropriately as a religious person curious about rationality? Is there a problem with the system that lets someone like me get so far?
  4. Where do I go from here? In the future, how should I act? Do I need to change my behavior as a result of this post? I am not calling out for any responses to my beliefs in particular, nor am I calling to other religious people at LessWrong to identify themselves. I am asking the community what they want me to do. Leave? Keep posting? Comment but don't post? Convert? Read everything posted and come back later?


The Wannabe Sanity Waterline

This post has little to do with actual beliefs. I get the feeling that most discussions about the beliefs themselves are not going to be terribly useful. I originally titled this post, "The Religious Rational" but figured the opening line was inflammatory enough and as I began editing I realized that the religious aspect is merely an example of a greater group of irrationals. I could have admitted to chasing UFOs or buying lottery tickets.  What I wanted to talk about is the same.

That being said, I fully accept all criticisms offered about whatever you feel is appropriate. Even if the criticism is just ignoring me or an admin deleting the post and banning me. I am not trying to dodge the subject of my religious beliefs; I offered myself as an example for convenience and to make the conversation more interesting. I have something relevant and useful to discuss with regard to the overall topic of rationalistic communities that applies to the act of spawning rationalists from within fields other than rationalism. Whether it directly applies to LessWrong is for you to decide.

How do you approach someone below the sanity waterline? Do you ignore them and look for people above the line? Do you teach them until they drop their irrational deadweight? How do you know which ones are worth pursuing and which are a complete waste of time? Is there a better answer than generalizing at the waterline and turning away everyone who gets wet? The easiest response to these people is to put the burden of rationality on their shoulders. Let them teach themselves. I think there is a better way. I think some people are closer to the waterline than others, and deciding to group everyone below the line together makes the job of teaching rationalism harder.

I, for example, can look at my fellow theists and immediately draw up a shortlist of people I consider relatively rationalistic. Compared to the given sanity waterline, all of us are deep underwater due to certain beliefs. But compared to the people on the bottom of the ocean, we're doing great. This leads into the question: "Are there different levels of irrationality?" And also, "Do you approach people differently depending on how far below the waterline they are?"

More discretely, is it useful to make a distinction between two types of theists? Is it possible to create a sanity waterline for the religious? They may be way off on a particular subject but otherwise their basic worldview is consistent and intact. Is there a religious sanity waterline? Are there rational religious? Is a Wannabe Rational a good place to start?

The reason I ask these questions is not to excuse any particular belief while feeling good about everything else in my belief system. If there is a theist struggling to verify all beliefs but those that involve God, then they are no true rationalist. But if said theist really, really wanted to become a rationalist, it makes sense for them to drop the sacred, most treasured beliefs last. Can rationalism work on a smaller scale?

Quoting from Outside the Laboratory (emphasis not mine):

Now what are we to think of a scientist who seems competent inside the laboratory, but who, outside the laboratory, believes in a spirit world? We ask why, and the scientist says something along the lines of: "Well, no one really knows, and I admit that I don't have any evidence - it's a religious belief, it can't be disproven one way or another by observation." I cannot but conclude that this person literally doesn't know why you have to look at things.

A certain difference between myself and this spirit-believing scientist is that my beliefs are from a younger time and I have things I would rather do than gallop through that area of the territory checking my accuracy. Namely, I am still trying to discover what the correct map-making tools are.

Also, admittedly, I am unjustifiably attached to that area of my map. It's going to take a while to figure out why I am so attached and what I can do about it. I am not fully convinced that rationalism is the silver bullet that will solve Life, the Universe, and Everything. I am not letting this new thing near something I hold precious. This is a selfish act and will get in the way of my learning, but that sacrifice is something I am willing to make. Hence my place below the LessWrong waterline. Hence my being a Wannabe Rational.

Instead, what I have done is take my basic worldview and chased down the dogma. Given the set of beliefs I would rather not think about right now, where do they lead? While this is pure anathema to the true rationalist, I am not a true rationalist. I have little idea about what I am doing. I am young in your ways and have much to learn and unlearn. I am not starting at the top of my system; I am starting at the bottom. I consider myself a quasi-rational theist not because I am rational compared to the community of LessWrong. I am a quasi-rational theist because I am rational compared to other theists.

To return to the underlying question: Is this distinction valid? If it is valid, is it useful or self-defeating? As a community, does a distinction between levels of irrationality help or hinder? I think it helps. Obviously, I would like to consider myself more rational than not. I would also like to think that I can slowly adapt and change into something even more rational. Asking you, the community, is a good way to find out if I am merely deluding myself.

There may be a wall that I hit and cannot cross. There may be an upper-bound on my rationalism. Right now, there is a cap due to my theism. Unless that cap is removed, there will likely be a limit to how well I integrate with LessWrong. Until then, rationalism has open season on other areas of my map. It has produced excellent results and, as it gains my trust, its tools gain more and more access to my map. As such, I consider myself below the LessWrong sanity waterline and above the religious sanity waterline. I am a Wannabe Rational.

Why This Helps

The advantage of a distinction between different sanity waterlines is that it allows you to compare individuals within groups of people when scanning for potential rationalists. A particular group may all drop below the waterline but, given their particular irrational map, some of them may be remarkably accurate for being irrational. After accounting for dumb luck, does anyone show a talent for reading territory outside of their too-obviously-irrational-for-excuses belief?

Note that this is completely different from questioning where the waterline is actually drawn. This is talking about people clearly below the line. But an irrational map can have rational areas. The more rational areas in the map, the more evidence there is that some of the mapmaker's tools and tactics are working well. Therefore, this mapmaker is above the sanity waterline for that particular group of irrational mapmakers. In other words, this mapmaker is worth conversing with as long as the conversation doesn't drift into the irrational areas of the map.

This allows you to give people below the waterline an attractive target to hit. Walking up to a theist and telling them they are below the waterline is depressing. They do need to hear it, which is why the waterline exists in the first place, and their level of sanity is too low for them to achieve a particular status. But after the chastising you can tell them that other areas of their map are good enough to become more rational in those areas. They don't need to throw everything away to become a Wannabe Rational. They will still be considered irrational but at least their map is more accurate than it was. It is at this point that someone begins their journey to rationalism.

If we have any good reason to help others become more rational, it seems as though this would count toward that goal.


This last bit is short. Taking myself as an example, what should I be doing to make my map more accurate? My process right now is something like this:

  1. Look at the map. What are my beliefs? What areas are marked in the ink of science, evidence, rationalism, and logic? What areas aren't and what ink is being used there?
  2. Look at the territory. Beliefs are great, but which ones are working? I quickly notice that certain inks work better. Why am I not using those inks elsewhere? Some inks work better for certain areas, obviously, but some don't seem to be useful at all.
  3. Find the right ink. Contrasting and comparing the new mapmaking methods with the old ones should produce a clear winner. Keep adding stuff to the toolbox once you find a use for it. Take stuff out of the toolbox when it is replaced by a better, more accurate tool. Inks such as "My elders said so" and "Well, it sounds right" are significantly less useful. Sometimes we have the right ink but we use it incorrectly. Sometimes we find a new way to use an old ink.
  4. Revisit old territory. When I throw out an old ink, I examine the areas of the map where that ink was used and revisit the territory with my new tools handy. Some territory is too hard to access now (beliefs about my childhood) and some areas on my map don't have corresponding territories (beliefs about the gender of God).

These things, in my opinion, are learning the ways of rationality. I have a few areas of my map marked, "Do this part later." I have a few inks labeled, "Favorite colors." These are what keep me below the sanity waterline. As time moves forward I pick up new favorite colors and eventually I will come to the areas saved for later. Maybe then I will rise above the waterline. Maybe then I will be a true rationalist.

296 comments

MrHen leaned back in his chair.

It had taken hours to write, but it was flawless. Everything was there: complete deference to the community’s beliefs, politely asking permission to join, admission of guilt. With one post, the tenor of LessWrong had been changed. Religion would join politics and picking up women as forbidden topics.

It would only be later that they would realize what had happened. When rationality became restricted by politeness, that would be when he would begin offering arguments that weakened atheist resolve. And he would have defenders, primed by this pitch-perfect post. Once he was made an honorary member of the “in” group, there would be much greater leeway. They had already mentally committed to defending him here; the later details would be immaterial.

After the first online conversion, there would be anger. But at least some would defend him, harkening back to this one post. “It’s okay to be irrational,” they would say, “we’re all irrational about some things.” Oh, the luminaries would never fall. Eliezer, Robin, YVain, Gavin—they were far too strong. But there were those who longed to go back to the warm embrace of belief. Those just emerging from their shells, into the harsh glare of the real. And MrHen, with his equivocating, his rational irrationality—he would lead the way back. Always with the proper respect. A little flattery, a little bowing and scraping, these things go further than one might think in the “rational” world.

Once he was finally banned, and the conversions halted, the citizens of LessWrong would wonder what had driven him. Was it simply his own religious fervor? Or perhaps he had been sent by the old churches to weaken the growing rationalist community from within—perhaps he was in the employ of the Vatican or Salt Lake City, sent to curb a threat. But perhaps it was more sinister still. Perhaps, with his mission complete, MrHen would return to report to his masters at the ‘chan, on the most epic trolling of all time.

They would never know, not for certain.

Oh man. I had already clicked downvote for excessive paranoia before I read the penultimate sentence. Needless to say, I reversed my judgment immediately.

With one post, the tenor of LessWrong had been changed. Religion would join politics and picking up women as forbidden topics.

I don't understand. This post doesn't suggest that we forbid talking about religion.

It's a half-joke, or a half-sarcastic joke -- I hold such humor in the highest regard and describe it as hyper-cynicism, from my own garbling of this. Basically, the poster is mostly joking, as given away by the last two sentences, but he wouldn't have made the post if he didn't think there were elements of truth behind it.

the poster is mostly joking, as given away by the last two sentences, but he wouldn't have made the post if he didn't think elements of truth behind it existed.

I understand that, but the "joke" was basically that the original post was a conspiracy to make this site more accepting of religion and ban discussion of it, when the post doesn't suggest anything like that; quite the opposite, in fact.

(Brief foreword: You really should read much more of the sequences. In particular How to Actually Change Your Mind, but there are also blog posts on Religion. I hope that one thing that comes out of this discussion is a rapid growth of those links on your wiki info page...)

What are the requirements to be a member of the LessWrong community? If we upvote your comments, then we value them and on average we hope you stay. If we downvote them, we don't value them and we hope either that they improve or you leave. Your karma is pretty positive, so stay.

You seem to be expecting a different shape of answer, about certain criteria you have to meet, about being an aspiring rationalist, or being above the sanity waterline, or some such. Those things will likely correlate with how your comments are received, but you need not reach for such proxies when asking whether you should stay when you have more direct data. From the other side, we need not feel bound by some sort of transparent criteria we propose to set out in order to be seen to be fair in the decisions we make about this; we all make our own judgement calls on what comments we value with the vote buttons.

I think you're led to expect a different sort of answer because you're coming at this from the point of view of what Eliezer calls Traditional Rationality - rationality as a set of social rules. So your question is, am I allowed this belief? If challenged, can I defend it such that those who hear it acknowledge I've met the challenge? Or can I argue that it should not be required to meet these challenges?

This of course is an entirely bogus question. The primary question that should occupy you is whether your beliefs are accurate, and how to make them more accurate. This community should not be about "how can I be seen to be a goodthinking person" but "how can I be less wrong?"

Also, it seems very much as if you already know how things are going to swing when you subject your theistic beliefs to critical examination. That being so, it's hard to know whether you actually believe in God, or just believe that you believe in God. I hope you will decide that more accurate beliefs are better in all areas of study for several reasons, but one is that I doubt that you are maximizing your own happiness. You are currently in a state of limbo on the subject of religion, where you openly admit that you daren't really think about it. I think that you will indeed find the process of really thinking about it painful, but it will be just as painful next year as it will be now, and if you do it now you'll avoid a year of limbo, a year of feeling bad about yourself for not goodthinking, and a year of being mistaken about something very important.

This of course is an entirely bogus question. The primary question that should occupy you is whether your beliefs are accurate, and how to make them more accurate. This community should not be about "how can I be seen to be a goodthinking person" but "how can I be less wrong?"

I like this. This clarifies a lot for me.

This seems along similar lines to my initial reaction: "belief in God" is an undefined statement, since "God" is undefined (or, alternatively, has so many possible definitions that one is left with more noise than signal), and therefore such a statement does not automatically have any particular implications for your level of rationality. Given without any further context (real-world implications, specific definition of "God", etc.) it is more a social tag (statement of identification with the set of people who say they "believe in God") than anything else.

Are there any implications of this belief which affect how you treat other people? Do any of those implications put you at odds with beliefs which are also reasonable if one does not believe in the existence of [your definition of] God?