I think a better way to frame this issue would be the following method.

  1. Present your philosophical thought-experiment.
  2. Ask your subject for their response and their justification.
  3. Ask your subject what would need to change for them to change their belief.

For example, suppose I respond to your question about the solitary traveler with "You shouldn't do it because of biological concerns." Accept the answer, and then ask: what would need to change in this situation for you to accept the killing of the traveler as moral?

I remember this method giving me deeper insight into the Happiness Box experiment.

Here is how the process works:

  1. There is a happiness box. Once you enter it, you will be completely happy through living in a virtual world. You will never leave the box. Would you enter it?
  2. Initial response. Yes, I would enter the box. Since my world is only made up of my perceptions of reality, there is no difference between the happiness box and the real world. Since I will be happier in the happiness box, I would enter.
  3. Reframing question. What would need to change so that you would not enter the box?
  4. My response: Well, if I had children or people depending on me, I could not enter.

Surprising conclusion! Aha! Then you do believe that there is a difference between a happiness box and the real world, namely your acceptance of the existence of other minds and the obligations those minds place on you.

That distinction was important to me, not only intellectually but in how I approached my life.

Hope this contributes to the conversation.


I believe that you have an unexamined assumption in your post: can you actually have any effect on what your child believes?

A book by Judith Rich Harris called "The Nurture Assumption" makes the case that it is not parents who shape a child's attitudes and beliefs, but the child's peers. Parents' impact on children tends to be primarily genetic and in the basics (no abuse, being well fed and clothed, and choosing the general environment where the child is raised). For a more detailed look at Harris's argument, see Malcolm Gladwell's article.

If you agree with Harris's argument, you might rethink your attitude towards your child's belief in God or Santa Claus. They will probably make up their own mind about it despite your best efforts.

Hope this contributes to the discussion.


This "trying to believe" tactic is much more explicitly used in areas where there is randomness or unpredictability.

My business is finance. As a financial advisor, I am constantly "trying to believe" in things like regression to the mean, the long-term performance of the market versus short-term volatility, and the efficacy of asset allocation.

But each day I am faced with evidence that causes me to doubt my rationally held beliefs about investing.

I think baseball players may have similar issues with batting. They may rationally know that only practice and talent improve their performance, but they still notice that when they wear the red underwear they hit better. So they may be "trying to believe" that the red underwear doesn't really affect their batting.

As with many of the issues we raise, this all boils down to having a brain made of multiple systems, each trying to do something a little different. We have a pattern-matching part of our brain, and we have our prefrontal cortex theorizing about the world.

Sometimes these systems can be in conflict.

Hope this contributes to the discussion.


The haunted rationalist is probably an example of a physiological response tied to a shared cultural delusion. In strange places, when we are alone, we can often feel nervous or fearful. I remember feeling this way when I was alone in our church growing up, or when I was alone in our own house for the first time as a teenager.

There's probably a physiological reason for this. Perhaps we produce more adrenaline when left alone after a period of close cohabitation with others. This would be a useful evolutionary trait allowing us to be more aware of our surroundings.

Tie this initial physiological reaction in with "shared cultural delusions" such as ghosts, and you may get a self-reinforcing feedback loop going. A small amount of additional adrenaline leads our body to be more attuned to "fight or flight" situations. We hear an unfamiliar noise. Our minds involuntarily tie it to the shared cultural delusion, and our bodies produce more adrenaline. Before you know it, you're running out of the mansion screaming.

In hotels, perhaps two things dampen this: 1) we don't feel alone, since we know other humans are nearby, and 2) we can easily explain unfamiliar noises, cutting off the feedback loop. Despite this, people have a much harder time sleeping in hotel rooms than at home. This may be evidence of the initial physiological reaction I mentioned before.

Interestingly enough, there's another area where this occurs: alien abductions and sleep paralysis.

Hope this helps the discussion.


I don't think that the "effort" distinction is banal at all.

The "lying" scenario provides us with much more information about the "liar" than the "keeping secrets" scenario provides about the "secret keeper". Let me go into this in more detail.

An individual assumes that others have mental states, but that individual has no direct access to those mental states. An individual can only infer mental states through the physical actions of another.

For now, let's assume that an individual who can more accurately infer others' mental states from their actions will be "happier" or "more successful" than an individual who cannot.

So, given this assumption, every individual has an incentive to constantly infer others' mental states, generalize them into some mental stance, and relate that mental state and mental stance back to themselves.

With these brief preliminaries out of the way, let's examine "lying" vs "secrets".

When a person gives you misinformation, the potential liar has taken an active role in conveying it. The range of potential mental states and mental stances consistent with this is relatively small. The person can have a mental stance of "looking out for your best interests" (let's call this mental stance "friendliness") and be mistaken, or the person can have a mental stance of "trying to manipulate you" and be lying. The pathway to determining whether a person is "mistaken" or "lying" is relatively straightforward (compared to secrets), and if we can determine "lying", we can take action to change our relationship with the other person.

When a person withholds information that may be helpful, however, we have a much stickier situation. The range of potential mental states is much broader. The person may be unsure of the accuracy of the information. The person may be unsure of the usefulness of the information to you. The person may be unsure of your willingness to receive the information. In other words, there are many reasons a person may refrain from giving you potentially helpful information and still have a mental stance of "friendliness".

And it would be hard to prove that the withholder of information actually has a mental stance of "eneminess".

Thus, when someone withholds information, our line of inquiry and our course of action are far less clear than when a person gives us misinformation.

So, in summary, the asymmetry between the two situations is an asymmetry of information. The fact that an individual makes the effort to "lie" to us gives us a great deal more information about that individual's mental stance towards us. The person who "keeps a secret", on the other hand, has not given us information about their mental stance towards us.

Hope this helps provoke discussion.