Thoughts on "Operation Make Less Wrong the single conversational locus", Month 1

Rationality is a combination of keeping your map of the world as correct as you can ("epistemic rationality", also known as "science" outside of LW)

I'm not sure that's what people usually mean by "science". And most of the questions we care about in everyday life ("will I be able to pay off this loan on time?") are not usually considered scientific ones.

Other than that minor nitpick, I agree.

Avoiding Your Belief's Real Weak Points

If there is a heaven and the killed firstborn went there, then killing them (or anyone else, for that matter) is quite harmless. And killing is wrong for people not because it causes harm, but because God forbids it. It's a strange view, but not an obviously inconsistent one. On the other hand, I've always shied away from moral attacks, because the counterargument of "So, God's not benevolent, now what? You still have to worship him for a few decades or you will literally burn for eternity" seemed so obvious. It seems as pointless as arguing that Dumbledore is evil when you're trying to prove he never existed.

Anti-reductionism as complementary, rather than contradictory

A lossless explanation is reductionist

Isn't that what people mean when they say reductionism is right?

Wrong however unnamed

I think it's not so much a sum of properties as a union of property sets. If a system has a property that isn't part of that union, then it's "more than the sum of its components". On the other hand, I find the notion of something being "more than the sum of its parts" about as annoying as the frequent ads with the "1 + 1 = 3: buy two, get one free!" equation. That is, very annoying.
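The union-of-property-sets reading can be sketched in a few lines of Python (the property names below are made up purely for illustration, not taken from any particular system):

```python
# Hypothetical property sets of two components of a system.
part_a = {"conducts electricity", "is solid"}
part_b = {"is transparent", "is solid"}

# "Sum of the parts" read as a union: every property some component has.
union_of_parts = part_a | part_b

# Suppose the assembled system exhibits these properties.
system = {"conducts electricity", "is solid", "is transparent",
          "filters UV light"}

# "More than the sum of its parts" = properties outside the union.
emergent = system - union_of_parts
print(emergent)  # {'filters UV light'}
```

On this reading, "emergence" is just a non-empty set difference between the system's properties and the union of its components' properties.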

Is Spirituality Irrational?

It seems interesting that a lot of spiritual experiences happen in non-normal situations. To induce them, people may deny themselves food or sleep, stay motionless in one place for a long time, work themselves to exhaustion, eat poisons, go to a place with different atmospheric pressure, or do something else they don't normally do. The whole process is suspiciously similar to program testing, where you run the program in situations its creator (evolution, in the case of humans) hasn't "thought" much about. And then sometimes there are bugs. And if you don't follow the protocols for already-discovered bugs, you risk either crashing something really important or getting nothing at all. Bugs are real and may give valuable information about the program's inner workings, but they're not "the final truth about the underlying reality".

The belief in the revelatory nature of spiritual experiences may be a result of a "just world" bias. When you finally get the reward you've been working toward for years, it's easier to believe you understood something profound about reality than that you've discovered an error in your brain. If that's the case, then "if you spin a lot, you'll get vertigo", "if you sit on your hand long enough, it will feel strange", and "look through an autostereogram picture to see it in 3D" could all count as spiritual experiences, but they're too easy and mundane for that.

My Kind of Moral Responsibility

It is possible to talk about utilitarian culpability, but there it's a question of "would blaming/punishing this (kind of) person lead to good results?". For instance, you usually shouldn't blame those who can't change their behavior in response to blame, unless they self-modified to be that way, or unless their being blameless would motivate others who can... That reminds me of Eight Short Studies On Excuses, where Yvain demonstrated an example of such an approach.

My Kind of Moral Responsibility

Isn't the question of whether someone is a good or a bad person a matter of virtue ethics in the first place? That is, for a utilitarian, the results of the bystander's and the murderer's actions were the same, and therefore the actions were equally bad; but that doesn't mean the bystander is as bad as the murderer, because that question isn't part of the utilitarian framework at all. Should we implement a policy of blaming or punishing them the same way? That is a question for utilitarianism. And the answer is probably "no".

Open thread, Apr. 18 - Apr. 24, 2016

But then the difference in intelligence would be almost entirely shared + non-shared environment, whereas twin studies suggest it's highly heritable. It also seems to be a polygenic trait, so there can be quite a lot of new mutations that haven't yet reached fixation, even if it's strongly selected for.
