Byrnema, you talk extensively in this post about the LW community having a (dominant) ideology, without ever really explicitly stating what you think this ideology consists of.
I'd be interested to know what, from your perspective, the key aspects of this ideology are. I think this would have two benefits:
(More generally, I think this is a great idea.)
My main dissent is that we seem more interested in esoteric situations than in the obvious improvements that would have the biggest impact if adopted on a wide scale.
Overall I think my views are pretty orthodox for LW/OB. But (and this is just my own impression) it seems like the LW/OB community generally considers utilitarian values to be fundamentally rational. My own view is that our goal values are truly subjective, so there isn't a set of objectively rational goal values, although I personally prefer utilitarianism.
I have two proposals (which happen to be somewhat contradictory), so I will make them in separate posts.
The second is that many participants here seem to see LW as being about more than helping each other eliminate errors in our thinking. Rather, they see a material probability that LW could become the core of a world-changing rationalist movement. This then motivates a higher degree of participation than would be justified without the prospect of such influence.
To the extent that this (perhaps false) hope may be underlying the motivations of community members, it would be good if we discussed it openly and tried to realistically assess its probability.
Where do you think Less Wrong is most wrong?
That it's not aimed at being "more right" -- which is not at all the same as being less wrong.
To be more right often requires you to first be more wrong. Whether you try something new or try to formulate a model or hypothesis, you must at minimum be prepared for the result to be more wrong at first.
In contrast, you can be "less wrong" just by doing nothing, or by being a critic of those who do something. But in the real world (and even in science), you can never win BIG -- and it's often hard to win at all -- if you never place any bets.
This is perhaps a useful distinction:
When it comes to knowledge of the world you want to be more right.
But when it comes to reasoning, I do think it is more about being less wrong... there are so many traps you can fall into, and learning how to avoid them is such a large part of being able to reason effectively.
For a while I tutored middle school students in algebra. Very frequently, I heard things like this from my students:
"I'm terrible at math."
"I hate math class."
"I'm just dumb."
That attitude had to go. All of my students successfully learned algebra; not one of them learned algebra before she came to believe herself good at math. One strategy I used to convince them otherwise was giving out easy homework assignments--very small inferential gaps, no "trick questions".
Now, the "I'm terrible at math" attitude was, in some sense, correct. You could look at their grades and their standardized test scores and see that they were in the lowest quartile of their class. But when my students started seeing A's on their homework papers--when they started to believe that maybe they were good at math, after all--the difference in their confidence and effort was night and day. It was the false belief that enabled them to "take the first steps."
Hypothetical (and I may expand on this in another post):
You've been shot. Fortunately, there's a well-equipped doctor on hand who can remove the bullet and stitch you up. Unfortunately, he's got everything he needs except any kind of pain killer. The only effect of the painkiller is going to be on your (subjective) experience of pain.
A. He can say: "Look, I don't have any painkiller, but I'm going to have to operate anyhow."
B. He can take some opaque saline (or otherwise totally inert) IV, tell you it's morphine, and administer it to you.
Which do you prefer he does? Knowing what I know about the placebo effect, I'd have to admit I'd rather be deceived. Is this unwise? Why?
Admittedly, I haven't arrived at this false conclusion through faulty epistemology: it's probably wise to generally trust doctors when they tell you what they're administering. So it seems possible to want to have a false belief, even while wanting to maintain an efficient epistemology. This might not generalize to Pjeby's various theories, but it seems that we can think of at least one case where we would desire having a false belief. Admittedly, this might not be a decision we could make, i.e. "Lie to me about what's in that IV!" might not help. (Though there is some evidence of placebos working even when people were made fully aware they were placebos.)
On the other hand, I'm not sure I can think of an example of where we desire to have a belief that we know to be false, which may be the real issue.
The word "ideology" sounds wrong. One of the aspects of x-rationality is hoarding general correct-ideas-recognition power, as opposed to autonomously adhering to a certain set of ideas.
It's the difference between an atheist fanatic who has a blind conviction in the nonexistence of God and participates in anti-theistic color politics, and a person who has a solid understanding of the natural world and, from this understanding, concludes that a certain set of beliefs is ridiculous.
I have two proposals (which happen to be somewhat contradictory), so I will make them in separate posts.
The first is that the real purpose of this site is to create minions and funding for Eliezer's mad scheme to take over the world. There should be more recognition and consciousness of this underlying agenda.
This is an interesting and worthwhile idea, though TBH I'm not sure I agree with the premise.
The whole "rationality" thing provides more of a framework that a status quo. People who make posts like "Well, I'm a rationalist and a theist, so there! Ha!" do tend to get voted down (when they lack evidence/argument), but I hardly see a problem with this. This community strongly encourages people to provide supporting evidence or argumentation and (interestingly) seems to have no objections to extremely long posts/replies.I have yet to see a ...
I don't know if this actually counts as a dissenting opinion, since there seems to be a consensus around here that a little irrationality is okay. But I published a post about the virtues of irrationality (modeled after Yudkowsky's twelve virtues of rationality), found here:
http://antisingularity.wordpress.com/2009/06/05/twelve-virtues-of-irrationality/
I suppose my attempt is to provide a more rational view by including irrationality, but that is merely my opinion. I believe that there are good irrational things in the universe, and I think that is a dissent...
I would say the direction in which I most dissent from Less Wrong is that I don't think 'rationality' is inherently anything worth having. It's not that I doubt its relevance for developing more accurate information, nor its potential efficacy in solving various problems, but if I have a rationalistic bent, that is mainly because I'm just that sort of person - being irrational isn't 'bad', it's just - irrational.
I would say the sort of terms and arguments I most reject are those with normative-moral content, since (depending on your definition) I either do not beli...
I'm continually surprised that so many people here take various ideas about morality seriously. For me, rationality is very closely associated with moral skepticism, and this view seems to be shared by almost all the rationalist type people I meet IRL here in northern Europe. Perhaps it has something to do with secularization having come further in Europe than in the US?
The rise of rationality in history has undermined not only religion, but at the same time and for the same reasons, all forms of morality. As I see it, one of the main challenges for people...
One thing that came to mind just this morning: why is expected utility maximization the most rational thing to do? As I understand it (and I'm a CS, not an Econ, major), prospect theory and the utility-function weighting used in it are usually accepted as describing how most "irrational" people make their decisions. But this might not be because they are irrational, but rather because our utility functions actually do behave that way, in which case we should abandon EU and just try to maximize well-being with all the quirks PT introduces (such as loss being more...
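To make the contrast concrete, here is a minimal sketch of the two models applied to the same gamble. The specific gamble, the parameter values (the commonly cited Tversky-Kahneman estimates), and the simplified non-cumulative weighting are my own illustrative assumptions, not something from the comment above:

```python
# Illustrative (hypothetical) comparison of expected utility (EU) vs. a
# simplified, non-cumulative prospect theory (PT) on one gamble:
# 50% chance to win $100, 50% chance to lose $80.
# Parameter values are the commonly cited Tversky-Kahneman (1992) estimates,
# used here only as assumptions for illustration.

def expected_utility(outcomes):
    """Risk-neutral EU: sum of probability * outcome."""
    return sum(p * x for p, x in outcomes)

def pt_value(x, alpha=0.88, lam=2.25):
    """PT value function: diminishing sensitivity; losses loom ~2.25x larger."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

def pt_weight(p, gamma=0.61):
    """PT probability weighting: overweights small p, underweights large p."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

def prospect_value(outcomes):
    """Simplified PT score: weighted value summed over outcomes."""
    return sum(pt_weight(p) * pt_value(x) for p, x in outcomes)

gamble = [(0.5, 100), (0.5, -80)]
print(expected_utility(gamble))  # +10.0 -> risk-neutral EU says take the bet
print(prospect_value(gamble))    # roughly -21 -> loss aversion says refuse it
```

The point is just that the two rules can disagree about the very same gamble; whether that shows the agents are irrational, or shows that EU mis-describes what they actually value, is the question the comment raises.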
"Where do you think Less Wrong is most wrong?"
I don't know where Less Wrong is most "wrong" - I don't have a reliable conclusion about this, and moreover I don't think the Less Wrong community accepts any set of statements without exception - but I can certainly say this: some posts (and sometimes comments) introduce jargon (e.g. Kullback-Leibler distance, utility function, priors, etc.) for not very substantial reasons. I think sometimes people have a little urge to show off and reveal to the world how smart they are. Just relax, okay? We all kno...
I think the group focusses too much on epistemic rationality - and not enough on reason.
Epistemic rationality is one type of short-term goal among many - whereas reason is the foundation-stone of rationality. So: I would like to see less about the former and more about the latter.
I like this idea. I don't really have anything to contribute to this thread at the moment, though.
Seems along the same lines as the "closet thread" but better.
I read LW for a few months but I haven't commented yet. This looks like a good place to start.
There are two points in the LW community that seem to gravitate towards ideology, IMHO:
Anti-religion. Some people hold quite rational religious beliefs, which seems to be a big no-no here.
Pro-singularity. Some other people consider the Singularity merely a "sci-fi fantasy", and I have the impression that such views, if expressed here, would make this community irrationally defensive.
I may be completely wrong though :)
Okay... first, "shut up and do the impossible" may sound like it has a nice ring to you, but there's something specific I mean by it - a specific place in the hierarchy of enthusiasm, tsuyoku naritai, isshokenmei, make an extraordinary effort, and shut up and do the impossible. You're talking enthusiasm or tsuyoku naritai. "Shut up and do the impossible" is for "reduce qualia to atoms" or "build a Friendly AI based on rigorous decision theory before anyone manages to throw the first non-rigorous one together". It is not for testing P. J. Eby's theories of willpower. That would come under isshokenmei at the highest and sounds more like ordinary enthusiasm to me.
Second, there are, literally, more than ten million people giving advice about akrasia on the Internet. I have no reason to pay attention to your advice in particular at its present level of rigor; if I'm interested in making another try at these things, I'll go looking at such papers as have been written in the field. You, I'm sure, have lots of clients and these clients are selected to be enthusiastic about you; keeping a sense of perspective in the face of that large selection effect would be an advanced rationalist sort of discipline wherein knowledge of an abstract statistical fact overcame large social sensory inputs, and you arrived very late in the OBLW sequence and haven't caught up on your reading. I can understand why you don't understand why people are paying little attention to you here, when all the feedback on your blog suggests that you are a tremendously intelligent person whose techniques work great. But to me it just sounds like standard self-help with no deeper understanding. "Just try my things!" you say, but there are a thousand others to whom I would rather allocate my effort than you. You are not the only person in the universe ever to write about productivity, and I have other people to whom I would turn for advice well before you, if I was going to make another effort.
It is your failure to understand why the achievements of others are important - why a science paper reporting the result of one experiment on willpower has higher priority for examination by me than you and all your brilliant ideas and all your enthusiasm about them and all the anecdotal evidence about how it worked for your clients - that is your failure to understand the different standards this community lives by, and your failure to understand why science works, and why it is not just pointless formality-masturbation but necessary. Yes, there's a lot of statistical masturbation out there. But conducting a controlled experiment and quantifying the result, instead of just going by anecdotal evidence about what worked for who, really is necessary. This is not generally appreciated by human beings, and appreciating that fact - that it is counterintuitively necessary to do science, that it is not obvious but it really is necessary - is one of the entrance passes to the secret siblinghood of rationalists. This is perhaps something I should write about in more detail, because it's one of those things so basic that I tend to take it for granted instead of writing about it.
As for your idea that others' failure to pay attention to you in particular indicates a willpower failure on their part... that's what we call "egocentric biases in availability", namely, you think you are a much larger part of others' mental universe than in fact you are. So much credibility as to try your suggestion instead of a million other suggestions is something that has to be earned. You haven't earned it, only berated people for not listening to you. There are communities where that works, like self-help, where people are used to being berated, but in the vaster outside universe it will get you nowhere. You have to see the universe as others see it in order to get them to listen to you, and this involves understanding that they do not see you the way you see yourself. To me you are simply one voice among millions.
But conducting a controlled experiment and quantifying the result, instead of just going by anecdotal evidence about what worked for who, really is necessary.
Necessary for determining true theories, yes. Necessary for one individual to improve their own condition, no. If a mechanic uses the controlled experiment in place of his or her own observation and testing, that is a major fail.
"Just try my things!" you say,
I've been saying to try something. Anything. Just test something. Yes, I've suggested some ways for testing things, and so...
Occasionally, concerns have been expressed from within Less Wrong that the community is too homogeneous. Certainly the observation of homogeneity is true to the extent that the community shares common views that are minority views in the general population.
Maintaining a High Signal-to-Noise Ratio
The Less Wrong community shares an ideology that it is calling ‘rationality’ (despite some attempts to rename it, this is what it is). A burgeoning ideology needs a lot of faithful support in order to develop true to itself. By this, I mean that the ideology needs a chance to define itself as it would define itself, without a lot of competing influences watering it down, adding impure elements, distorting it. In other words, you want to cultivate a high signal-to-noise ratio.
For the most part, Less Wrong is remarkably successful at cultivating this high signal-to-noise ratio. A common ideology attracts people to Less Wrong, and then karma is used to maintain fidelity. It protects Less Wrong from the influence of outsiders who just don't "get it". It is also used to guide and teach people who are reasonably near the ideology but need some training in rationality. Thus, karma is awarded for views that align especially well with the ideology, that align reasonably well, or that align with one of the directions in which the ideology is reasonably evolving.
Rationality is not a religion – Or is it?
Therefore, on Less Wrong, a person earns karma by expressing views from within the ideology. Wayward comments are discouraged with down-votes. Sometimes, even, an ideological toe is stepped on, and the disapproval is more explicit. I’ve been told, here and there, one way or another, that expressing extremely dissenting views is: stomping on flowers, showing disrespect, not playing along, being inconsiderate.
So it turns out: the conditions necessary for the faithful support of an ideology are not that different from the conditions sufficient for developing a cult.
But Less Wrong isn't a religion or a cult. It wants to identify and uproot illusion, not create a safe place to cultivate it. Somewhere, Less Wrong must be able to challenge its basic assumptions and see how they hold up against new and existing evidence. You have to allow brave dissent.
Outsiders who insist on hanging around can help by pointing to assumptions that are thought to be self-evident by those who "get it", but that aren’t obviously true. And which may be wrong.
It’s not necessarily the case that someone challenging a significant assumption doesn’t get it and doesn’t belong here. Maybe, occasionally, someone with a dissenting view may be representing the ideology more than the status quo.
Shouldn’t there be a place where people who think they are more rational (or better than rational) can say, “hey, this is wrong!”?
A Solution
I am creating this top-level post for people to express dissenting views that are simply too far from the main ideology to be expressed in other posts. If successful, it would serve two purposes. First, it would move extreme dissent away from the other posts, thus maintaining fidelity there. People who want to play at the “rationality” ideology can play without other, irrelevant points of view spoiling the fun. Second, it would allow dissent for those in the community who are interested in not being a cult, in challenging first assumptions, and in suggesting ideas for improving Less Wrong without being traitorous. (By the way, karma must still work the same, or the discussion loses its value relative to the rest of Less Wrong. Be prepared to lose karma.)
Thus I encourage anyone (outsiders and insiders) to use this post, “Dissenting Views”, to answer the question: Where do you think Less Wrong is most wrong?