Practicing what you preach

by TwistingFingers
23rd Oct 2011
2 min read

LessWrongers as a group are often accused of talking about rationality without putting it into practice (for an elaborated discussion of this, see Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality). This behavior is particularly insidious because it is self-reinforcing: it will attract more armchair rationalists to LessWrong, who will in turn reinforce the trend in an affective death spiral until LessWrong is a community of utilitarian apologists akin to the internet communities of anorexics who congratulate each other on their weight loss. It will be a community where instead of discussing practical ways to "overcome bias" (the original intent of the sequences) we discuss arcane decision theories, who gets to be in our CEV, and the most rational birthday presents (sound familiar?).

A recent attempt to counter this trend, or at least make us feel better about it, was a series of discussions on "leveling up": accomplishing a set of practical, well-defined goals to increment your rationalist "level". It's hard to see how these goals fit into a long-term plan to achieve anything besides self-improvement for its own sake. Indeed, the article begins by priming us with a renaissance-man-inspired quote and stands in stark contrast to articles emphasizing practical altruism such as "efficient charity".

So what's the solution? I don't know. However, I can tell you a few things about the solution, whatever it may be:

  • It won't feel like the right thing to do; your moral intuitions (being designed to operate in a small community of hunter-gatherers) are unlikely to suggest to you anything near the optimal task.
  • It will be something you can start working on right now, immediately.
  • It will disregard arbitrary self-limitations like abstaining from politics or keeping yourself aligned with a community of family and friends.
  • Speaking about it would undermine your reputation through signaling. A true rationalist has no need for humility, sentimental empathy, or the absurdity heuristic.

Whatever you may decide to do, be sure it follows these principles. If none of your plans align with these guidelines, then construct a new one, on the spot, immediately. Just do something: every moment you sit, hundreds of thousands are dying and billions are suffering. Under your judgement, your plan can self-modify in the future to overcome its flaws. Become an optimization process; shut up and calculate.

I declare Crocker's rules on the writing style of this post.

298 comments, sorted by top scoring
[-]fiddlemath14y100

It is probably simply structural that the LessWrong community tends to be about armchair philosophy, science, and math. If there are people who have read through Less Wrong, absorbed its worldview, and gone out to "just do something", then they probably aren't spending their time bragging about it here. If it looks like no one here is doing any useful work, that could really just be sampling bias.

Even so, I expect that most posters here are more interested in reading, learning, and chatting than in thoroughly changing who they are and what they do. Reading, learning, and chatting is fun! Thorough self-modification is scary.

Thorough and rapid self-modification, on the basis of things you've read on a website rather than things you've seen tested and proven in combination, is downright dangerous. Try things, but try them gradually.

And now, refutation!

So what's the solution?

To, um, what, exactly? I think the question whose solution you're describing is "What ought one do?" Of these, you say:

It wont feel like the right thing to do; your moral intuitions (being designed to operate in a small community of hunter gatherers) are unlikely to suggest to you anything near t

...
[-]anotheruser14y80

Speaking about it would undermine your reputation through signaling. A true rationalist has no need for humility, sentimental empathy, or the absurdity heuristic.

Depending on your goal (rationality is always dependent on a goal, after all), I might disagree. Rational behaviour is whatever makes you win. If you view your endeavour as a purely theoretical undertaking, I agree, but if you consider reality as a whole you have to take into account how your behaviour comes across. There are many forms of behaviour that would be rational but would make you look like an ass if you don't at least take the time to explain the reasons for your behaviour to those who can affect your everyday life.

0Logos0114y
Rational behavior is whatever conforms to the principles of reason. Instrumentally rational behavior is whatever is the most rational behavior that achieves the expected agenda. You could call that latter form "winning" but that's an error, in my opinion. It seems related to the notion that since "winning" makes you "feel good", ultimately all agendas are hedonistic. It screams "fake utility function" to me. Sometimes there isn't a path to optimization; only to mitigation of anti-utility.
4pedanterrific14y
If some particular ritual of cognition—even one that you have long cherished as "rational"—systematically gives poorer results relative to some alternative, it is not rational to cling to it. The rational algorithm is to do what works, to get the actual answer—in short, to win, whatever the method, whatever the means.
-6Logos0114y
[-]Luke_A_Somers14y70

If nothing else, the assertion that the right and rational thing will not feel like the right thing to do really needs support. Our moral intuitions may not be perfect, but there are definite parallels between small communities of hunter-gatherers and modern society that make a fair portion of those intuitions applicable presently.

That it was just laid out without even a reference to back it up… come on, here.

[-][anonymous]14y70

I checked TwistingFingers's post history, and I noticed that he is also the author of a post entitled LessWrong gaming community.

Choice quote: "Many of us enjoy expressing ourselves through electronic games."

Quite how this squares with his aspiration to become an optimization process is beyond me. Optimizing for lulz, maybe.

[-]orthonormal14y110

This is DH1.

(I also see the OP as more signal than noise. But the norm for rebuttal here should usually be DH4 or higher.)

9wedrifid14y
Not everything need be a rebuttal. Incidentally, people constrained to DH4 or higher are gameable by common social practice.
0fiddlemath14y
Certainly, not every reply needs to be a rebuttal. But it usually is, here. On the other hand, if you're going to rebut, and you think the other party is trying to argue honestly, your lower bound really should be around DH4 (counterargument) in a setting with many speakers. In a private setting, simply disagreeing (DH3) can be useful to just explain internal state. "I disagree with X, but I'm not sure why. Hm..." But it's logically rude to state simple disagreement as if it were an actual argument. :)
2FiftyTwo14y
(I rather like this system of using DH shorthands for diagnosing the problems with people's comments. Possibly we can develop similar systems for other logical issues.)
2[anonymous]14y
It wasn't intended as a rebuttal; I have already provided that in another lengthy comment. I was merely identifying TwistingFingers as a blatant troll. Just for fun: Juxtapose that with "Just do something: every moment you sit hundreds of thousands are dying and billions are suffering" written less than one month later. Applause light/ more claims without evidence. An utterly ludicrous implication. This sounds like Chomskybot applied to Lesswrong jargon. Can you really not see that this guy is taking the Mickey?
6orthonormal14y
Another plausible interpretation of TF's flip-flopping is that a month ago, xe was here because xe thought it was a fun community, and then xe got "converted" into an earnestly zealous and quite naive Singularitarian. Much of TF's vitriol, then, would implicitly target xer lackadaisical past self in order to (consciously or unconsciously) distance xer current self from the pre-conversion self. Mind you, I'm not checking TF's history myself, so this might be a bad guess. I'm just pointing out a pretty plausible alternate hypothesis.
6Prismattic14y
I realize that this is a trivial issue, but if you care about inferential distance, I thought you should know that I had to look this expression up, and I suspect a lot of other non-UK readers would as well.
3ahartell14y
For those who don't know, Urban Dictionary says that "taking the Mickey" means "joking, or doing something without intent".
0[anonymous]14y
Even Yudkowsky says he disagrees with much of his earlier writing. I have been so transformed by reading the sequences that I have made that much progress in so little time.
[-][anonymous]14y50

LessWrongers as a group are often accused of talking about rationality without putting it into practice

Evidence? Who accuses them of this? One post (on Less Wrong itself!) is not evidence enough for this claim.

who gets to be in our CEV

Since this barb is directed at me, I should respond. When I come across a superb intellect like Yudkowsky, I first shut up and read the bulk of what he has to say (in Yudkowsky's case, this is helpfully packaged in the sequences). Then I apply my modest intellect to exploring the areas of his thinking that I do not fin...
3TheOtherDave14y
For both subjects, if discussing them doesn't make someone better able to do something worth doing, then discussing it is not worthwhile. If it does make someone better able to do something worth doing, discussing it might be worthwhile. It seems plausible to me that my reading, writing, and thinking about cognitive biases can noticeably help improve my understanding of, and ability to recognize, such biases. It seems plausible to me that such improvement can help me better achieve my goals. Ditto for other people. So I conclude that such discussion might be worthwhile. It doesn't seem plausible to me that my reading, writing and thinking about CEV can noticeably help improve anyone's ability to do anything.
-6[anonymous]14y
[-]pedanterrific14y40

I declare Crocker's rules on the writing style of this post.

Okay then:

in an affective death spiral

I think a more appropriate buzzword might be evaporative cooling of group beliefs. It's not immediately clear how "armchair rationalists" would be more predisposed to affective death spirals than instrumental rationalists.

altruism such as "efficient charity"

altruism such as Efficient Charity. (Note the period.)

It wont feel

It won't feel

being designed to

having evolved to

0Logos0114y
If you take it as axiomatic that instrumental rationalists are putting labor and effort into the material manifestations of instrumental rationality whereas 'armchair' rationalists merely discuss these ideas, then it becomes a necessity that the former be 'more rational' than the latter. And moreover, relegating the topic to a point of discourse without instantiation can be a form of affective death spiral. Not that I necessarily agree with anything else in this post or thread -- just commenting on that point.
[-]Bobertron14y40

This behavior is particularly insidious because it is self-reinforcing

As I understand your post, the behavior you mean is talking about rationality without putting it into practice. But the way it is written sounds to me like you mean accusing LW of talking about rationality without putting it into practice.

A recent attempt to counter this trend or at least make us feel better about it was a series of discussions on "leveling up": [...] stands in stark contrast to articles emphasizing practical altruism such as "efficient charity"

...
[-]gjm14y100

I think you are doing it wrong.

My reading of TwistingFingers's words was that s/he did mean "please feel free to be harsh about me", not "I wish to be free to be harsh about others". I don't see what other interpretation is possible, given "on the writing style of this post".

4ahartell14y
I think your interpretation is correct, and that's how I interpreted it, but I can understand Bobertron's interpretation as well. He thought TwistingFingers was declaring Crocker's rules as a sort of apology for the accusatory "writing style of [the] post", which would as Bobertron suggests be using the declaration in the wrong direction. I only say this because you wrote:
[-][anonymous]14y20

Speaking about it would undermine your reputation through signaling. A true rationalist has no need for humility, sentimental empathy, or the absurdity heuristic.

Under your judgement your plan can self-modify in the future to overcome its flaws. Become an optimization process; shut up and calculate.

Considering the problems you bring up, I think Less Wrong may benefit from increased categorization of thought by adding new levels other than Main and Discussion. And considering your advice, I'll try not to be overly humble/nice about it.

How wide of a net d...
[-]FiftyTwo14y20

Meta-comment: most replies at time of posting seem to be questioning whether a problem exists and quibbling with the style of the post, rather than proposing solutions. This doesn't seem like a good sign.

Proposed solution: If we consider rationality as using the best methods to achieve your goals (whatever they may be) then there are direct ways the Less Wrong material can help.

Firstly, define your goals and be sure that they are truly your goals.

Secondly, when pursuing your goals, retrieve information as needed that helps you make better decisions and hence...
2[anonymous]14y
Objection: it is highly irrational to propose solutions to non-existent problems. Insofar as someone considers the OP to have failed to raise a genuine problem, there is every reason for them not to start proposing solutions. Furthermore, as another commenter has pointed out, it is an act of generosity to interpret him as having coherently stated any particular problem at all.
2FiftyTwo14y
My interpretation of the original post was that they were identifying the problem that LW posters are 'talking about rationality without putting it into practice.' I then attempted to give an example of how one could instrumentally use the rationality techniques discussed on the site to achieve one's goals. Whether or not it is the case that LW is failing to apply rationality techniques enough is an empirical question that I agree the OP hasn't proven. However, whether or not that is the case, demonstrations of how instrumental rationality might work still seem to be a useful exercise. My top comment was semi-flippantly pointing out that commenters are doing what the OP accused them of by discussing the post rather than undertaking what seems the more useful task of proposing solutions. Possibly I am interpreting the OP generously in the problem they are presenting, but I don't understand why this is a bad thing. When meaning is uncertain, surely it is best to assume the most creditable interpretation in order to move discussion forward? (And it contributes to general norms of politeness.)
[-]antigonus14y10

I don't really understand what the problem you're diagnosing is supposed to be or what it is you're asking for.

[-]Armok_GoB14y00

First silly thing coming to mind: "Use rationality to determine an end goal, and a rational authority to trust. Then condition yourself to follow both blindly and without exception. Then stop caring about whether you're being rational or not."

Yea, it's silly. No, I'm not endorsing it or even saying it's any less silly than it sounds. But it DOES fulfil your criteria.

[-]MrMind14y00

A true rationalist has no need for humility, sentimental empathy, or the absurdity heuristic.

Surely if those things go against the Grand Maximally Efficient Thing To Do, they should be shed. But in general, if they are not an obstacle, they make our life a little more pleasant. Ah, but can a true human rationalist really do without humility, sentimental empathy, or the absurdity heuristic? Are those things something humans can do without, if they want to?

And more: how do you know that the Solution is the correct Solution?

[-][anonymous]14y00

I'm aware of the typos. I am not allowed to edit for 6 more minutes. Please don't respond to this thread as I will delete it once I have edited the OP.

[This comment is no longer endorsed by its author]