Un-untitled.
Do you like LW?
Anonymous

plain-dealing-villain:

antinegationism:

plain-dealing-villain:

antinegationism:

Eh. I like Less Wrong in the same way I like logic puzzles, but with regard to drinking its koolaid I have a few reservations.

The first is that the preoccupation with eliminating cognitive biases strikes me as naive and dangerous. Cognitive biases were naturally selected for, and to decide you should do away with them to the best of your ability without thinking really really really REEEEEAALLLYYY long and hard about why they are there, simply because they might occasionally convince you that buying a lottery ticket is a good idea, seems like the sort of proposition liable to lead you massively astray on either a personal, interpersonal, group, or inter-group level. (Pro tip: put a lot of weight on trusting any bias that, for reasons you can’t quite put your finger on, just gives you a bad feeling about someone, and avoid that someone accordingly. Hobbyist tip: don’t do it blindly when you can put your finger on it. It’s not a free pass to be racist, it’s a free pass to avoid sociopaths.) Probably healthier than eliminating as many known biases as you can is to take stock of which biases stand to do the most harm, and do your best to avoid letting them do harm on a case by case basis. If you feel like humanity should generally be rid of some bias, I suggest not leading by example, but instead voting with your balls or uterus by mating with a partner that has less of a natural inclination for that bias. Let future generations sort out the relative viability of a reduction in this bias. 

If you don’t intend to mate because you can’t see any rational reason to – then no worries, you’re already selecting yourself out and the whole “your biases are there for a reason” argument is beside the point. 

(Amusingly, Less Wrong has (had? At least last I checked?) a whole thematic dance with Bayesian-ism and Bayesian Networks and Artificial General Intelligence, and I’m going to giggle when the first A-quote-G-endquote-I to pass the Turing test turns out to be stuffed with so many cognitive biases that it might quite plausibly seek therapy.)

Less Wrong really likes arguing and playing devil’s advocate, so probably the community has already advocated for some if not all of the devils above and possibly come up with some bullet-proof argument(s) against them, or otherwise softened goals to accommodate. My opinion is that that general realm of critique is too gigantic a thing to not be pointing out constantly. 

Here’s a fun graph:

[Graph: commune longevity vs. number of costly requirements, for religious and secular communes]

The above shows how long communes last (vertical axis) versus the number of costly requirements that commune demands of its members (horizontal axis).

The secular communes, lacking any rational reason to abide costly requirements, tend to show an inverse correlation between the number of costly requirements and commune longevity, while the religious communes show a strongly positive correlation between costly requirements and longevity. One might of course argue that at 0-2 sacrifices the difference is within measurement error, but even so, communes with few to no costly requirements (irrespective of religiosity) are massively outlasted by communes with more than 11 costly requirements.
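(If it helps make the comparison concrete, here’s a minimal sketch of the correlation claim. The numbers are invented to mimic the qualitative shape of the graph only; they are not the actual dataset.)

```python
# Illustrative only: made-up (requirements, longevity) pairs that mimic the
# qualitative pattern described above, NOT the real commune data.

def pearson(xs, ys):
    """Pearson correlation coefficient, standard library only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# (number of costly requirements, commune longevity in years) -- hypothetical
secular = [(0, 10), (2, 9), (4, 6), (6, 5), (8, 3), (11, 2)]
religious = [(0, 8), (2, 12), (4, 20), (6, 30), (8, 45), (11, 60)]

for label, data in (("secular", secular), ("religious", religious)):
    reqs, years = zip(*data)
    print(f"{label}: r = {pearson(reqs, years):+.2f}")

# Expected shape of the output: a negative r for the secular series,
# a strongly positive r for the religious one.
```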

So here we see two really weird irrational biases (spirituality and Sunk Cost fallacy) working in tandem to be massively beneficial at a group level.

Of course, one could argue that once scientists invent magic and we achieve transhumanist VR immortality, all of those biases that have served us so well evolutionarily will become totally irrelevant. Which, okay, maybe. Or maybe they will become more relevant. Or maybe a whole new set of biases even more ridiculous than the ones we’re currently cursed with will be necessary. Seems like the sort of bridge one shouldn’t cross before arriving at. 

My other reservation basically just amounts to the first chapter of Beyond Good and Evil.

But past that, I totally agree that inductive logic puzzles are fun and dishonest arguments and rhetoric are bad (except maybe at a group selection level, but ffs people, don’t shit where you eat).

I welcome any and all ire this post may cause, and reserve the right to ignore it until such time as I can justify further procrastination.

>So here we see two really weird irrational biases (spirituality and Sunk Cost fallacy) working in tandem to be massively beneficial at a group level.

Is that a benefit, though? It could just be an indicator that the communes stick together long after they’re net-negative for everyone in them. My intuitive understanding of how those biases work would definitely indicate that they would stick together even if strongly net-negative, even if that’s not the modal outcome.

Historically, the alternative to sticking together was bears. 

In other words, people sticking together even when it sucks balls, and then very slowly over multiple generations proliferating in the resulting potentially emotionally net-negative (but procreatively net-positive) context, while taking advantage of the relative stability this affords to figure out rules to minimize the quantity of balls contextually sucking, is how societies are formed.

An argument could be made that, now that there is an abundance of developed secular societies from which to choose (each with its own longstanding and stable social institutions of concrete and rationally justifiable purpose), humans can finally afford to mature past the irrational biases that enabled the formation of those societies. An argument could even be made, albeit at a stretch, that the biases which once served as boons now serve only as dead weight in this brave new cosmopolitan context.

A counterargument could also be made that a lot of people REALLY like to watch sports. And almost everyone (in the US) gets REALLY into presidential elections (despite not having a clue what their local representative’s name is). And almost everyone (in the UK, and for some reason also the US) is REALLY into royal weddings. And can you even believe the latest Taylor Swift / Kanye drama?

The magnitude with which people care about these things is absurdly disproportionate to the minimal bearing any of these things actually has on their lives. And yet, there are (to my knowledge) no large-scale societies devoid of analogous social or political phenomena.

Maybe such a rational cosmopolitan society could conceivably exist (if only things were done “right”). Obviously, the fact that something is one way doesn’t mean it can’t be another. But it’s worth noting just how much the social institutions of concrete and rationally justifiable purpose which are the hallmarks of cosmopolitan societies regularly and purposely leverage irrational group tendencies in order to secure funding or standing. I think this is very strong evidence that the same or similar irrational biases which were at work on communal scales are (if not exactly necessary, then at least) really extremely useful for the unity and predictability that allows for the rational secular institutions which enable cosmopolitan scales.

If I’m honest, this seems almost obvious to me. People have limited cognitive and emotional bandwidth. It’s difficult to fit all of the knowledge required to consider all of the consequences of a given proposition into your head, let alone to bring yourself to gauge the probabilities of any of the huge number of potential outcomes of any given change in law, or even to determine which of those outcomes you value. And then to care enough to even vote on them once all of the pros and cons have been considered. Much more viable to just pick whichever douche you think is the most charming (as determined by letting yourself get caught up in the charm), and then hope s/he charms all of the other slightly less charming douches into navigating any potentially terrible idea so that it doesn’t turn out intolerably bad. And the charming douche and sub-douches will have some natural incentive to do this, because a lot of people will consider the douchiness more than the charm in the event of a socio-economic / geopolitical train-wreck.

That does not seem like the obvious conclusion to me. Obviously the heuristics and biases exist, and we can infer that in the evolved ancestral environment they were more useful than the absence of them, or at least equally good, from a ‘maximize inclusive fitness’ standpoint.

Similarly, it is obvious that people do care about the British royals, American celebrities, sports rivalries, political rivalries, etc. That doesn’t really have any bearing on whether they should care, beyond that it clearly isn’t quickly lethal at an individual or population level, or we’d have already stopped doing it.

Given that those biases exist, organizations which want to self-propagate and/or self-perpetuate do it most efficiently by taking advantage of them. They benefit from doing this even at the expense of their primary goals. And we do see this; I can’t remember the name of the charity that gets shamrock and turkey stickers stuck up on the windows of local Massachusetts supermarkets every year, but I do remember that for every dollar they collect 90¢ goes to the salaries of their professional fundraisers. And that the military expands its budget even at the cost of wartime effectiveness (see the saga of the Bradley Fighting Vehicle).

We agree that these biases exist, are used, and were at some point useful. That seems obvious from the facts on the ground. And I think we also agree that ceasing to be biased is very difficult, maybe impossible, and reducing bias enough to matter is difficult, maybe very difficult. I don’t think, though, that you’ve really presented an argument that addresses the disagreement, which is “is it a good thing to be biased in these ways?”

My short argument of why I think the answer is ‘no’ is roughly “Past performance is no guarantee of future results.” Biases that served us well in the past have no reason to continue serving us well in the future. 

I don’t have any opinion as to whether or not they are ‘good.’ I note only the lack of known viable alternatives. 

  1. chillybois reblogged this from antinegationism
  2. antinegationism said: We have plenty of evidence to suggest that they are useful in the modern world though. Like, the fact that they are often *used*. Just ask a politician. Or anyone in marketing.
  3. jiskblr reblogged this from antinegationism and added:
    In the absence of reason to think they’re helpful in the modern world, defaulting to truth-seeking seems like a better...
  4. antihumanism reblogged this from tetraspace-west
  5. antinegationism reblogged this from jiskblr and added:
    I don’t have any opinion as to whether or not they are ‘good.’ I note only the lack of known viable alternatives.
  6. another-normal-anomaly reblogged this from cyborgbutterflies
  7. cyborgbutterflies reblogged this from another-normal-anomaly and added:
    I’m not a rationalist but I did read some of Thinking, Fast & Slow ages ago and as far as I can tell the following ideas...