Wiki Contributions


The ecological rationality of the bad old fallacies


But is jealousy pathological? Or anger? Or fear?

I was arguing that the nerves in the skin are only approximations of thermometers; likewise, the eyes are only poor measuring instruments. By the way, there are 'evolutionary' biases: we perceive a ravine as deeper when we look down into it and, conversely, from the bottom looking up it doesn't seem as tall (see also auditory looming). Their function is quite transparent once you think about organisms rather than measuring instruments.

The ecological rationality of the bad old fallacies

It seems to me that this is really the discussion about optimizing versus satisficing.

If Intel builds a CPU to do division, but finds a way to approximate the results so that the chip can simulate, say, a nuclear explosion, it should say so. But in our case, we would need God to tell us that the nerves in the skin are thermometers, the eyes height-measuring tools, and so on. The only utility function of organisms that we know for sure is that the code that built them has to make it into the next generation; we can argue about different strategies, but they depend on, sometimes, too many other things.

The ecological rationality of the bad old fallacies

I think any preoccupation, if it exists long enough, results in great refinements. There are people good at rare African languages, mineral water, all sorts of (noble!) sports, torture; why shouldn't people get better at something as common as argumentation?

But we're advocating a look the other way around, at the more basic processes; they may say something about how humans work. And indeed, it would be easier with less sophisticated arguers.

The ecological rationality of the bad old fallacies

I have to admit that the text is a bit long! We sort of did say everything you are saying, which means that the way I summarized the text here was a bit misleading.

There must be conditions under which a heuristic like "follow the majority opinion" is triggered in our heads: perhaps something is recognized. There is selection pressure to detect social-exchange violations, but also to be ingenious in persuasion. Some of this already has experimental support. Anyway, we think that what we today call fallacies are not accidents, like the blind spot. They are good inference rules for a relatively stable environment, but they cannot predict far into the future and cannot judge complex new problems. That may be why we don't spot the fallacies of small talk, of experts in domains with genuine expertise, or of domains for which we already have intuitions.
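The "good in a stable environment, bad in a changed one" claim about the majority heuristic can be sketched with a Condorcet-style calculation. This is a minimal illustration, not from the original text: assume voters judge independently, and that an environmental change flips a once-reliable cue (individual accuracy drops from 0.6 to 0.4). The specific numbers are illustrative assumptions.

```python
import math

def majority_correct(p, n):
    """Probability that a simple majority of n independent judges,
    each correct with probability p, reaches the correct answer (n odd)."""
    k_needed = n // 2 + 1
    return sum(math.comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(k_needed, n + 1))

# Stable environment: a cue that is individually right 60% of the time;
# "follow the majority" amplifies this into near-certainty.
print(f"p=0.6, n=101 -> majority correct: {majority_correct(0.6, 101):.3f}")

# Changed environment: the same cue now misleads (right 40% of the time);
# the same heuristic amplifies the error instead of correcting it.
print(f"p=0.4, n=101 -> majority correct: {majority_correct(0.4, 101):.3f}")
```

The same aggregation rule that is close to infallible when the cue is valid becomes close to infallibly wrong when the environment shifts, which is the sense in which the heuristic is ecologically rather than universally rational.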

That would imply that a bad decision today is not necessarily the product of a cognitive illusion, but that we have built a bad interface between the actual human mind and the modern world (a car would be lighter and faster if it didn't have to accommodate humans). Reference class forecasting or presenting probabilities as frequencies are just technologies, interfaces. The science is about the function, and the fallacies are interesting precisely because, presumably, they are repetitive behavior. They may help in our effort to reverse-engineer ourselves.

The ecological rationality of the bad old fallacies

You are correct; but the Argument from fallacy is still pretty uninformative.

Best of Rationality Quotes, 2013 Edition

I think it's Daniel Dennett (said to Hofstadter).

The Cognitive Science of Rationality

You might be right, as I never saw one, but the project didn't start with a plan to build a spectacular flying sculpture. So they fell first to the planning fallacy (which may not be so much a psychological cognitive bias as the very structure of the possible outcomes of anything: the peak of the frequency distribution lies to the right of the predicted "arrival" time), then to sunk costs, which were later half acknowledged, making them highly suspect of trying to resolve a cognitive dissonance (rationalization).

One has to take the original prediction into account to make a probabilistic interpretation...
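The structural reading of the planning fallacy above can be sketched with a small simulation. This is my illustration, not the original author's model: assume completion times follow a right-skewed (here lognormal) distribution, since delays compound multiplicatively while there is a hard floor on how fast work can go, and assume the planner forecasts the single most likely duration (the mode). All numbers are illustrative.

```python
import math
import random
import statistics

random.seed(0)

# Hypothetical right-skewed distribution of completion times.
mu, sigma = math.log(12.0), 0.5
durations = [random.lognormvariate(mu, sigma) for _ in range(100_000)]

# The single most likely duration (mode of the lognormal): a planner
# who forecasts "the most likely case" announces this number.
mode = math.exp(mu - sigma ** 2)
plan = mode

overrun = sum(d > plan for d in durations) / len(durations)
print(f"planned (mode): {plan:5.1f} months")
print(f"median:         {statistics.median(durations):5.1f} months")
print(f"mean:           {statistics.mean(durations):5.1f} months")
print(f"P(overrun):     {overrun:.2f}")
```

Because the distribution's peak sits to the left of most of its probability mass, a forecast anchored on the most likely outcome is exceeded far more often than not, with no individual cognitive illusion required.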

The Dilemma: Science or Bayes?

The science world, as much as the rest of the "worlds" made up of people who share something they all cherish, has to have the status quo bias. (The enigmatic add-on: one cannot escape the feeling that there is such a thing as time.)

SotW: Check Consequentialism

Kahneman suggests such an exercise for groups, after pointing out that organizations generally act more rationally than individuals: the devil's advocate role, and thinking about the worst possible outcome. We don't always have the luxury of having others nearby to check our thoughts. But we often have imaginary conversations with friends or parents, so it shouldn't be very difficult to assign the devil's advocate position to an imaginary voice. That should put the way we feel about the subject in perspective. It is a basic means of delaying the strong coherence of the first good narrative.

Maybe it would be great to have an imaginary Bayesian friend...
