Status: Endorsed, but ready to be updated as I learn local UI and (re)learn local culture.

What if everyone actually were a 'perfectly rational actor'? It's obviously not true in every sense, or the term wouldn't have been standardized. But it's also not an idea that's obviously well-known in the Rationalist community, which hasn't done a terribly good job of making centrally-planned updates to its canonical texts.

So, working from the assumption that you haven't really considered this yet, I feel I ought to explain myself. I'm still something of a stranger here, so pardon me if the connections aren't obvious.

'Smart' has always felt like a fake sort of praise to me. Partly that's because my parents were very careful to impress on me that "you're bad" is an unacceptable way to label someone, while "you've done wrong" is entirely reasonable - and I've always been very appreciative of symmetry.

But I don't think that praise, when I received it, was dishonest - I just didn't think it was actionable. You see, I also had a strong distrust of authority figures who didn't explain their reasoning when they told me things, no matter how otherwise benevolent they were.

And now we come to the nub of it: I think minds are [mostly/entirely] made out of heuristics, including the parts that edit heuristics. Changing how you think of a word, or a story, or a memory, is isomorphic to changing how much you trust its parts.

A big part of this boils down to interferometry - or, to use a more direct term, inference. See, I'm pretty sure bodies are built for efficiency, and that means reusing architecture as much as possible. Brains build dedicated circuits for faster, cheaper calculation; muscle tension works as a variably-obvious signalling mechanism, to yourself as well as to others; and we offload both processing and memory storage to any source we trust with the job.

Including, of course, our own internal reasoning processes.
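
If a software analogy helps (and it is only an analogy - mine, not anything from neuroscience, with every name below invented for illustration): this kind of reuse looks a lot like caching. The first encounter with a situation runs slow, general-purpose deliberation; repeat encounters hit a cheap stored answer, which is roughly what I mean by a 'dedicated circuit'.

```python
import functools
import time

# Loose analogy only: 'deliberate' stands in for slow, general-purpose
# reasoning, and the cache plays the role of a cheap dedicated circuit
# built out of past experience.

@functools.lru_cache(maxsize=None)
def deliberate(situation: str) -> str:
    """Expensive general reasoning, run in full only on novel situations."""
    time.sleep(0.1)  # stand-in for costly step-by-step inference
    return "approach" if "food" in situation else "avoid"

deliberate("food nearby")  # slow path: full deliberation
deliberate("food nearby")  # fast path: the cached, heuristic answer
```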

'Rationality' is a measure of how wise we are, in the sense of 'having good heuristics for evaluating input', multiplied by how experienced we are, in the sense of 'having lots of relevant data'. Actually, no, the general word for that is 'expertise'. Rationality is expertise with the universe we live in.

And expertise is a function of Interest and Experience.
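
Compressed into symbols, purely as informal shorthand (none of these quantities are measurable as written, and the function f is deliberately left unspecified):

```latex
\text{Expertise} = \text{Wisdom} \times \text{Experience}
                 = f(\text{Interest},\ \text{Experience}),
\qquad
\text{Rationality} = \text{Expertise}\big|_{\text{domain}\,=\,\text{the universe we live in}}
```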

15 comments

You seem to still want feedback on this, so I want to expand on why it's incoherent.

You seem to be making several arguments, jumping from one to the next with no obvious link between them:

  1. A thought experiment about people being perfectly rational (I think? you might be suggesting the opposite and I can't tell) that doesn't go anywhere
  2. An aside about not liking certain types of praise/criticism that doesn't go anywhere
  3. An argument that our brains mostly work with heuristics which I think most people here would agree with
  4. A new definition of rationality that inserts experience out of nowhere

Note: Your introduction also kind of implies that the Rationalist community is stupid, which probably didn't win you a lot of points (especially since you don't justify why our canonical texts should say that everyone is perfectly rational and/or use your definition of rationality).

Overall, the problems are that (1) there's a lot of text that doesn't go anywhere, and (2) it's not clear to me what I'm supposed to get out of this. Why should I use your definition of rationality? Is there a reason other people should adopt your opinions about praise/criticism? What does the thought experiment about perfectly rational people have to do with any of it?

There are lots of different ideas out there, and not all of them have been explored by everyone. There are plenty of ideas people talk about on LessWrong that haven't appeared in the Sequences.

TAG · 3y · 40

...otherwise, you're on the heuristics side of bias-versus-heuristics. It's an old idea in the mainstream, but widely ignored here.

Ah, language difficulties. Sorry. Will update for 'extremely literate crowd, even for nerds' then.

Why do you think it's ignored here? Is it just so widely accepted as to be invisible, or is it something else?

TAG · 3y · 10

No. The Gigerenzer approach was never promoted by Yudkowsky, so it isn't "there" as far as his followers are concerned.

Hm. Gotta say, I'm disappointed with how inarticulate the criticism has been here. Perhaps it's because Karma is supposed to be defined elsewhere, but if so, that seems like an issue with the Karma system.

Perhaps the problem is that my writing style is kind of intense, and thus reads as persuasion-coded rather than explanation-coded to this audience? I'd be happy to fix that, but I'll need a guide if it's going to happen any time soon.

(Relatedly, does anyone have a good guide for 'subtext in general'? Connotation varies even more than denotation, of course, but something that really dives into exploring connotations would be nice. Twig was valuable to me for that reason, even if it's hard to recommend on other counts.)

We like intense. Persuasion is bad, but that's not the problem. The problem is that your post is incoherent. It lacks a central thesis.

[Meta: I wrote this comment because you specifically requested more negative feedback.]

It doesn't read as persuasion-coded to me. In fact it reads as stream-of-consciousness musing that defeats its own opening point.

What if everyone actually were a 'perfectly rational actor'?

[...]

Rationality is expertise with the universe we live in.

You're wondering what if everyone has perfect expertise with the universe we live in? Furthermore this is somehow linked to fake praise, your strong distrust for authority figures who tell you things without explaining their reasoning, and the idea that muscle-tension works as a variably-obvious signalling mechanism to yourself as well as to others?

Well maybe this makes internal sense to you, but it looks incoherent to me.

I was going for a style of writing I'm familiar with for explaining useful things - question to conclusion, I don't know the formal name for it - and then I wrote the title last, so that people who already knew the core point wouldn't need to waste time wading through the evidence.

I'm not sure excluding this style from LessWrong actually improves LessWrong's efficiency at sharing useful knowledge? Tagging it better would be good, of course.

I was going for a style of writing I'm familiar with for explaining useful things - question to conclusion, I don't know the formal name for it

So the style is based around making assertions, and if people don't think the point is obvious, they ask for evidence/ask what you mean?

Do you have any (other) examples of the style?

Seconding that this post seems incoherent, kind of like a half-baked shower thought.

I can see where you're coming from, but tracing every connection is very difficult, because beliefs/heuristics are based on whole networks of data, which I think are stored as smaller heuristics. Efficiency demands that I not explain more than I must to get my point across - and not just on my end, either. This is why a target audience is useful.

...Thinking about it, I should see if I can optimize the site-intro stuff. A proper style guide for posting and reading seems like it'd have big advantages, although it would obviously need justification.

Efficiency demands that you actually get your point across; otherwise, your efficiency is zero points-got-across per thousand words.