jscn
MixedNuts's comment reminded me of a good resource for such techniques, and, indeed, for generally improving one's effectiveness at reading: How To Read A Book
It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so.
-- Mark Twain
Clearly Dennett has his sources all mixed up.
Voted up mainly for the Greg Egan recommendations.
But the problem is worse than that because "Sometimes, crows caw" actually does allow you to make predictions in the way "electricity!" does not.
The problem is even worse than that, because "Sometimes, crows caw" predicts both the hearing of a caw and the non-hearing of a caw. So it does not explain either (at least, based on the default model of scientific explanation).
If we go with "Crows always caw and only crows caw" (along with your extra premises regarding lungs, sound and ears etc), then we might end up wit...
Huh, I thought there was a fair bit of evidence showing that people perform basically just as badly on tests which exploit cognitive biases after being told about those biases as they do in a state of ignorance.
I found Drive Yourself Sane useful for similar reasons.
I've been meaning to take a stab at Korzybski's Science and Sanity (available on the interwebs, I believe) for a while, but I've heard it's fairly impenetrable.
It's a wonderful thing to be clever, and you should never think otherwise, and you should never stop being that way. But what you learn, as you get older, is that there are a few million other people in the world all trying to be clever at the same time, and whatever you do with your life will certainly be lost - swallowed up in the ocean - unless you are doing it with like-minded people who will remember your contributions and carry them forward. That is why the world is divided into tribes.
-- Neal Stephenson, The Diamond Age
I neglected to record from which character the quote came.
Rationality is highly correlated with intelligence
According to the research of K. E. Stanovich, this is not the case:
Intelligence tests measure important things, but they do not assess the extent of rational thought. This might not be such a grave omission if intelligence were a strong predictor of rational thinking. But my research group found just the opposite: it is a mild predictor at best, and some rational thinking skills are totally dissociated from intelligence.
The classic example of riding a bicycle comes to mind. No amount of propositional knowledge will allow you to use a bike successfully on the first go. Theory about gyroscopic effects of wheels and so forth all comes to nothing until you hop on and try (and fail, repeatedly) to ride the damn thing.
Conversely, most people never acquire the propositional knowledge that in order to steer the bike to the left, you must (at least initially, and at high speed) turn the handlebars to the right. But they do it unconsciously nonetheless.
But once procedural knowledge is acquired, it also incorporates things like body memory and pure automatic habit, which, when observed in oneself, are just as likely to be rationalized after the fact as they are to be antecedently planned for sound reasons. It's also easy to forget the initial propositions about a mastered procedure.
I've also noticed this kind of thing in my martial arts training.
For instance, oftentimes high-level black belts will be incredibly successful at a particular technique but unable to explain the procedure they use (or at least,...
This tendency can be used for good, though. As long as you're aware of the weakness, why not take advantage of it? Intentional self-priming, anchoring, rituals of all kinds can be repurposed.
Most of these bad Philosophers were encountered during the few classes I took to get a Philosophy minor.
Initially I thought you were talking about professional Philosophers, not students. This clears that up, but it would be better to refer to them as Philosophy students. Most people wouldn't call Science undergrads "Scientists".
My experience with Philosophy has been the opposite. Almost all the original writing we've read has been focused on how and why the original authors were wrong, and how modern theories address their errors. Admittedly,...
I would guess that it's because comments are shorter and tend to express a single idea. Posts tend to have a series of ideas, which means a voter is less likely to think all of them are good/worthy of an upvote.
Thirded. I completed half of my degree in CS before switching to Philosophy. I'm finding it significantly more stimulating. I don't think I learned anything in my CS classes that I couldn't easily have taught myself (and had more fun doing so).
According to this post, doing so would be "against blog guidelines". The suggested approach is to do top-level book review posts. I haven't seen any of these yet, though.
That sorted it, thanks.
Having recently received a couple of Amazon gift certificates, I'm looking for recommendations of 'rationalist' books to buy. (It's a little difficult to separate the wheat from the chaff.)
I'm looking mainly for non-fiction that would be helpful on the road to rationality. Anything from general introductory type texts to more technical or math oriented stuff. I found this OB thread which has some recommendations, but I thought that:
Nothing terrible will happen to Wednesday if she deconverts
The terrible thing has already happened at this stage. Telling your children that lies are true (i.e., that Mormonism is true), when they have no better way of discerning the truth than simply believing what you say, is abusive and anti-moralistic. It is fundamentally destructive of a person's ability to cope with reality.
I have never heard a story of deconversion that was painless. Everyone I know who has deconverted from a religious upbringing has undergone large amounts of internal (and often...