All of Student_UK's Comments + Replies

Rewriting the sequences?

So, is he defending one of these positions, arguing against them all, or saying the whole debate is pointless?

From what I read he seems to be suggesting that truth is independent of what we believe, but I'm not sure what else he is saying, or what his argument is.

9Liron10yHere are the main points I understood:

  • The only way you can be sure your mental map accurately represents reality is by allowing a reality-controlled process to draw your mental map.
  • A sheep-activated pebble-tosser is a reality-controlled process that makes accurate bucket numbers.
  • The human eye is a reality-controlled process that makes accurate visual cortex images.
  • Natural human patterns of thought like essentialism and magical thinking are NOT reality-controlled processes, and they don't draw accurate mental maps.
  • Each part of your mental map is called a "belief". The parts of your mental map that portray reality accurately are called "true beliefs".

Q: How do you know there is such a thing as "reality", and your mental map isn't all there is? A: Because sometimes your mental map leads you to make confident predictions, and they still get violated, and the prediction-violating thingy deserves its own name: reality.
Rewriting the sequences?

By the way,

I still haven't heard an explanation of what "The Simple Truth" is about. Maybe that requires a whole separate post.

1MrMind10yHere [http://lesswrong.com/lw/nv/replace_the_symbol_with_the_substance/] Eliezer said:
0[anonymous]10yYou may need to read and understand something like the Truth entry at the Stanford Encyclopedia of Philosophy [http://plato.stanford.edu/entries/truth/] to grok the context and meaning of The Simple Truth.
Rewriting the sequences?

That helps explain a bit more why they are the way they are. But it suggests to me that they shouldn't play such a prominent role on the site, because they haven't been designed for the purpose they are now being used for.

Rewriting the sequences?

Thanks, this is a great suggestion, I think this would be more helpful.

What does your web of beliefs look like, as of today?

"The normativity of logic is: “If you want to be speaking the same language as everyone else, don’t say things like ‘The ball is all green and all blue at the same time in the same way.’”"

You surely don't mean this: everyone else is logical, so why not me?

For a start, is everyone else logical? And even if they are, is that the best justification we have for logic?

2lukeprog10yI don't understand your question. Logic begins with a chosen set of axioms, and they're not the only axioms you could choose as the basis of a formal system. If you reject the axioms, I can't condemn you for failing a categorical imperative. Instead, I'll just note that you're not talking the same language as the rest of us are.
What does your web of beliefs look like, as of today?

"But philosophers still argue about ... theism ... as if these weren’t settled questions."

If this is really what you think, then why do you continue with your blog?

9lukeprog10yFor outreach, mostly. And for occasional curiosity. But really, philosophy of religion is thunderously boring to me now. I have to force myself to churn out the occasional post on religion to avoid losing 95% of my audience, but if I stopped caring about audience size, you'd probably see me write one post on religion every 6 months. :) Already I usually post on religion only once a week, whereas in the past it was usually 5 times a week.
Do biases matter?

You only have no time to think if your main priority is winning the prize. If you are interested in holding true beliefs, then you can take longer. However, our current system tends to reward those who get there first, not those who maximize their chances of being correct.

0prase10yDepends on the situation, and overall your description of choosing between bias and success sounds like a false dilemma. Your scenario assumes that:

  1. Reading the clues, you find out the location of the prize sooner than if you just waited until someone finds it.
  2. After you have found out where the prize is, you would have no time to get it physically.
  3. You want to get the prize.
  4. You simultaneously want to learn where the prize is as soon as possible.

I wonder how frequent such situations are (I don't think it applies to science), but nevertheless 3 and 4 are in conflict given 1 and 2. Conflict of priorities is hardly something unexpected, and the rational advice is straightforward: decide whether you want 3 or 4 more; if 3, take one door and run; if 4, study the clues. Are you suggesting that the decision about your priorities needs bias, or that prioritising 3 is bias? In either case, you would be using the word in a non-standard way. The LW wiki defines bias as a specific, predictable error pattern in the human mind. The word "error" is important. Bias isn't a catch-all category which includes everything that retards knowledge.
Do biases matter?

Ok. Clearly you only read the title, and not my actual post. I didn't say no biases matter, just that they might not always be a bad thing.

0Manfred10yAh, sorry, I'd assumed that, though you talked about other things in your post, you still wanted attempted answers to the question in the title. EDIT: also note that this provides a nice test of the value of heuristics with biases. If everyone on earth had them instead of didn't, would it be more valuable than a few million Africans?
0Desrtopa10yBiases may not always be a bad thing, but you can't tell whether they're good in any specific case without comparing them objectively to an unbiased position. You can't skip straight to second order rationality without employing first order rationality first [http://lesswrong.com/lw/je/doublethink_choosing_to_be_biased/]. If biases are bad on average, then as a rule you're generally better off assuming that it's better not to preserve your bias.
Do biases matter?

Of course. Most of it will be in the wrong direction, that's the point. It might not be best for you, but maybe it will be the best thing for the group.

3falenas10810ySorry, should have been clearer. There are examples where scientists have had incorrect theories that science has accepted, which has set back scientific progress for decades. This may not be due to running with it, maybe they did give their ideas a great deal of thought before writing about them, so your point may still be valid.
Do biases matter?

Sure. In reality it is still going to require some narrowing down. But once you have reduced it to a few cases the best thing might be to just guess.

Settled questions in philosophy

I think it is doubtful that any of the examples that you give have been solved/settled in any meaningful sense (particularly the last one).

If they were settled then those who disagree would have to be one of the following:

  • Unknowledgeable about the subject matter
  • Confused about the subject matter
  • Lying
  • Irrational

However, there seem to be plenty of people who disagree with you who don't fall into the above categories.

From what you say it just sounds like you are saying that these are the issues that you are convinced about and which you cannot imagine be...

Rational = true?

"I think the rational is the closest to true you can possibly get from where you are."

The truth is the closest you can get to the truth. Suppose Rob reads in the newspaper that City lost, but believes they won because they're his team and it would make him happy if they won. If City did in fact win, his belief would be closer to the truth, but it would not be rational.

1Alexandros10yThat can be patched with editing to "the rational is the closest to true you can justifiably get from where you are". I'm taking a page from the definition of knowledge as 'justified true belief'. The belief that his team won would be true but not justified. Just as a broken clock is right twice a day but that still doesn't make it a reliable time measurement apparatus.
Philosophers and seeking answers

That depends what the conclusion is supposed to be. If it is just that philosophers X, Y and Z are wrong, then you are right - he can simply bring forward arguments a, b and c to show this.

However, his claim is stronger than that. He is claiming that these philosophers (or at least many of them) are not in the truth business. His philosophical arguments may show that the other philosophers are wrong, but it won't follow that they are not sincere in trying to find answers and solve problems. For that he needs something like: they can't really be trying to find the truth, otherwise they would agree with me (at least on these "simple" matters).

Philosophers and seeking answers

I'm not sure what you're worried about. Just as you can't force people to move on from a problem that you think has been solved, so too they can't force you to wait while they work it out.

In the early modern period various thinkers were asking questions that would ultimately lead to the foundations of modern science (I'm thinking of Francis Bacon, Rene Descartes, John Locke and others). Philosophers have continued worrying about a lot of these issues (problem of induction, demarcation) while the scientists have moved on and made many fruitful discoveries. ...

1roystgnr10yIndeed, much of what we think of as "modern science" used to be called "natural philosophy". Even "logic" used to be considered the realm of philosophers rather than mathematicians. Philosophy may be maligned in part due to a linguistic selection bias where, as soon as we start to really understand a subject, we stop calling it "philosophy".
5Daniel_Burfoot10yThomas Kuhn agrees with you: from "The Structure of Scientific Revolutions"
Statistical Prediction Rules Out-Perform Expert Human Judgments

I have two concerns about the practical implementation of this sort of thing:

  1. It seems like there are cases where, if a rule is known to be in use, people could game it. For example, in job applications or admissions to medical schools. A better understanding of how the rule relates to what it predicts would be needed.

If X+Y predicts Z, does that mean enhancing X and Y will raise the probability of Z? Not necessarily; consider the example of happy marriages. Will having more sex make your relationship happier? Or does the rule work because happy couples tend to...

-4BillyOblivion10y"Will having more sex make your relationship happier?" Having more sex will make ME happier. If my wife finds out though...
1wedrifid10yYes. Almost certainly. But there are plenty of other examples you could pick from where there is not causality involved (and some for which causality is negative).
3cousin_it10yI think it's safe to say that having less sex will make the relationship less happy, so there is some causality involved.
7MatthewW10yYes, several of these models look like they're likely to run into trouble of the Goodhart's law type ("Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes").
9shokwave10yObviously, yes.
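The causal worry raised in this thread can be made concrete with a minimal sketch (not from the original posts, and the variable names are purely illustrative): when a hidden common cause drives both the predictor and the outcome, a statistical prediction rule works fine for prediction, yet "enhancing" the predictor does nothing to the outcome.

```python
# Hedged illustration: X predicts Z without being a causal lever for Z.
# A hypothetical hidden factor C (think "overall relationship quality")
# drives both X ("sex frequency") and Z ("reported happiness").
import random
import statistics

random.seed(0)

def correlation(xs, ys):
    """Pearson correlation, computed from scratch for self-containment."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (len(xs) * statistics.pstdev(xs) * statistics.pstdev(ys))

n = 10_000
c = [random.gauss(0, 1) for _ in range(n)]        # hidden common cause
x = [ci + random.gauss(0, 0.3) for ci in c]       # observed predictor
z = [ci + random.gauss(0, 0.3) for ci in c]       # outcome

print(round(correlation(x, z), 2))                # high: X predicts Z well

# "Intervene" on X: boost it without touching the common cause C.
# Z depends only on C in this toy model, so the boost changes nothing.
x_boosted = [xi + 2.0 for xi in x]
z_after = [ci + random.gauss(0, 0.3) for ci in c]
print(round(statistics.mean(z_after) - statistics.mean(z), 2))  # ~0: no effect
```

This is also why the Goodhart's-law reply above bites: once the rule's inputs are optimized directly (by applicants, couples, or institutions), the correlation the rule was fitted on can collapse.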
Note on Terminology: "Rationality", not "Rationalism"

Does the same reasoning apply to all -isms? Empiricism, materialism, internalism and externalism, to name a few.

The suffix -ism has a few different uses: it can indicate a body of principles (empiricism), a political movement (socialism), or a type of discrimination (sexism).

Your worry seems to be that "rationalism" looks like a political movement, but that sort of thing is more likely to be determined by how the people who apply the term to themselves act. And that problem does not go away by refusing to use certain words. If people who call themselves ratio...

0timtyler10yDarwinism, empiricism and materialism don't sound too dogmatic to me - probably because I am used to them. Established dogma can get away with being an -ism, I figure.