Democracy was in crisis. Politicians peddled demagogic bullshit. Reason, a small candle in the dark, faced existential threats.

It was the 5th century BC. If Athens divided, then its neighbors would destroy it. They would massacre the men, enslave the women, and burn the library.

A traitor appeared in the city square. He wrote memetic viruses designed to tear down the social infrastructure holding Athens together.

The traitor was brought to a lawful, legal trial. He was offered merciful exile, so that he could go disrupt the bad guys instead. The traitor chose death.

Through his blood sacrifice, the rationalist Western intellectual tradition was born.


What is the most fundamental optimization target of rationality?

  • To be less wrong?
  • To be more right?
  • To save the world?
  • To take a heretical stand and state "E Pur Si Muove"?

Mu. The most basic rationalist precept is to not forcibly impose your values onto another mind.


If I said [to him] "You're wrong; [X] is true", then, even if he listened to me, he wouldn't have thought it through. He'd just be copying my ideas. Which isn't what I want. I don't want people to copy my beliefs. I want people to think sensibly.

For a long time, when I'm walking people through this, they won't even know what my beliefs are. Usually they'll think I agree with them. But if they think about it hard, they'll realize that they have no idea what I believe, because I have to clean up their reasoning before that is even relevant.

Debt Forgiveness: A Case Study in Persuasion


6 comments
Razied · 7mo

The most basic rationalist precept is to not forcibly impose your values onto another mind.

What? How does that make any sense at all? The most basic precept of rationality is to take actions which achieve future world states that rank highly under your preference ordering. Being less wrong, more right, being bayesian, saving the world, not imposing your values on others, etc. are all deductions that follow from that most basic principle: Act and Think Such That You Win.

I find it useful to distinguish between epistemic and instrumental rationality. You're talking about instrumental rationality – and it could be instrumentally useful to convince someone of your beliefs, to teach them to think clearly, or to actively mislead them. 
Epistemic rationality, on the other hand, means trying to have true beliefs, and in this case it's better to teach someone to fish than to force them to accept your fish.

If I said [to him] "You're wrong; [X] is true", then, even if he listened to me, he wouldn't have thought it through. He'd just be copying my ideas. Which isn't what I want. I don't want people to copy my beliefs. I want people to think sensibly.

For a long time, when I'm walking people through this, they won't even know what my beliefs are. Usually they'll think I agree with them. But if they think about it hard, they'll realize that they have no idea what I believe, because I have to clean up their reasoning before that is even relevant.


I watched the rest of that video (the person was telling their story of asking pointed questions to a "socialist" about their policy of funding college education). I think the problem with this is that, while it might marginally move a person closer to actually thinking, it is unclear to me to what degree this is a symmetric weapon.

Asking the "right" open-ended questions seems pretty powerful in leading someone astray.

Asking the "right" open-ended questions seems pretty powerful in leading someone astray.

It is somewhat more difficult than leading them the right way. Also, you may start a habit of thinking in them, which may continue after you stop asking the questions, so it no longer goes in a direction you control.

But yes, it is a question of degree. You can mislead people by:

  • bringing their attention to some things (privileging a hypothesis), and simultaneously taking it away from other things (because attention is a limited resource);
  • leveraging their existing incorrect beliefs to make them conclude wrong things even from correct data (rather than examine those beliefs);
  • and of course, it is never just asking questions, but also subtly making assumptions, etc.

Mu. The most basic rationalist precept is to not forcibly impose your values onto another mind.

It is? Last I heard, the two most basic precepts of rationality were:

  1. Epistemic rationality: systematically improving the accuracy of your beliefs.
  2. Instrumental rationality: systematically achieving your values.

(Typically with a note saying "ultimately, when at odds, the latter trumps the former")

You have a youtube channel? Subscribed.