
"These are people whose utility function does not place a higher utility on 'dieing but not having to take my meds'."

Why are you making claims about their utility functions that the data does not back? Either people knowingly prefer less to more, or they are making rational decisions under ignorance, declining to violate their "ugh" field because doing so is costly for them.

How is that any different from a smoker being uncomfortable quitting smoking? (Here I recognize that smoking is obviously rational behavior for people who choose to smoke.)

You have it all wrong. The "ugh" field you describe should go into their utility function! Whether or not they invest the resources to overcome that "ugh" field and save their lives is endogenous to their situation!

You are making the case for rationality, it seems to me. Your point may be that people are emotional, but not that they are irrational! Indeed, this is what the GMU crowd calls "rational irrationality," which makes perfect sense--think about the perfectly rational decision to get drunk (and therefore be irrational). It has costs and benefits that you weigh before deciding that going with your emotions is preferable.

I see this comment as misunderstanding the definition of "rational" in economics, which is simply maximizing utility subject to costs such as incomplete information (with the amount of information itself endogenous), emotional constraints and costs, etc.
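To make the point concrete, here is a minimal expected-utility sketch in Python. All the numbers are invented for illustration; the only claim is structural: once the emotional cost of confronting the regimen enters the utility function, skipping the meds can be the utility-maximizing choice.

```python
# Hypothetical numbers, purely for illustration: a one-shot expected-utility
# comparison in which an emotional "ugh" cost makes skipping meds optimal.
P_DEATH_UNTREATED = 0.3   # assumed risk of death without the medication
VALUE_OF_LIVING = 100.0   # utility of surviving
UGH_COST = 80.0           # assumed emotional cost of confronting the regimen

# Take the meds: pay the "ugh" cost, survive for sure (a simplification).
u_take = VALUE_OF_LIVING - UGH_COST

# Skip the meds: avoid the "ugh" cost, survive only with some probability.
u_skip = (1 - P_DEATH_UNTREATED) * VALUE_OF_LIVING

print(f"U(take) = {u_take}, U(skip) = {u_skip}")
# With these numbers U(skip) = 70 > U(take) = 20: skipping maximizes
# expected utility, i.e. the behavior is "rational" in the economist's sense.
```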

Great idea. Very clever.

Perhaps someone has said this already, but it's worth noting that if you did this in the car dealer example, car dealers could sign similar contracts--your deal would not go through.

Then, negotiating with car dealers would have a game-theoretic hawk/dove (snowdrift) equilibrium. Similarly with potential wives: they could sign contracts agreeing that they will never sign prenups--another hawk/dove equilibrium.
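To illustrate the structure, here is a toy payoff matrix with invented numbers, where "Hawk" means signing the binding commitment contract and "Dove" means staying flexible. The sketch just enumerates pure-strategy Nash equilibria; it is not a model of any real negotiation.

```python
from itertools import product

# Toy hawk/dove payoffs (numbers invented). "Hawk" = sign a binding
# commitment contract; "Dove" = stay flexible. If both sides commit,
# the contracts are incompatible and no deal goes through.
PAYOFFS = {
    ("Hawk", "Hawk"): (-2, -2),  # incompatible commitments, deal collapses
    ("Hawk", "Dove"): (3, 1),    # the committed side captures the surplus
    ("Dove", "Hawk"): (1, 3),
    ("Dove", "Dove"): (2, 2),    # ordinary negotiation
}
STRATEGIES = ("Hawk", "Dove")

def is_nash(profile):
    """A profile is a pure Nash equilibrium if no player gains by deviating."""
    for player in (0, 1):
        for alt in STRATEGIES:
            deviated = list(profile)
            deviated[player] = alt
            if PAYOFFS[tuple(deviated)][player] > PAYOFFS[profile][player]:
                return False
    return True

print([p for p in product(STRATEGIES, repeat=2) if is_nash(p)])
# [('Hawk', 'Dove'), ('Dove', 'Hawk')]: the two asymmetric equilibria
# characteristic of hawk/dove -- commit only if the other side does not.
```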

Is Samuel Johnson's quote a valid or true statement? I understand your central thrust--the inability to do something personally (such as controlling one's sexual urges) and the disposition to encourage others to overcome that inability are not necessarily contradictory; indeed, they may fall together naturally.

However, in Samuel Johnson's world, and in the world where this "issue" comes up the most--politics--we might imagine that there exist two types of people: sociopathic individuals hungry for power, and individuals who are sincere.

If sociopathic individuals hungry for power are more often hypocrites, then we might, as an efficient rule of thumb (not being able to distinguish the two save through their observable actions!) condemn hypocrites because they are likely to be power-hungry individuals.

As a Bayesian update: in the world of politics, we expect that hypocrites are more likely to be power-hungry or sociopathic. I see Samuel Johnson's quote as potentially true, but as ignoring a world of imperfect information and signaling.
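The update itself is one line of Bayes' rule. Here is a sketch with invented priors and likelihoods; only the direction of the conclusion matters, not the particular numbers.

```python
# Bayes' rule with invented numbers: observing hypocrisy raises the
# probability that a politician is power-hungry rather than sincere.
p_power_hungry = 0.3        # assumed prior share of sociopathic types
p_hyp_given_ph = 0.8        # assumed: power-hungry types are often hypocrites
p_hyp_given_sincere = 0.2   # assumed: sincere types are rarely hypocrites

p_hypocrite = (p_hyp_given_ph * p_power_hungry
               + p_hyp_given_sincere * (1 - p_power_hungry))

posterior = p_hyp_given_ph * p_power_hungry / p_hypocrite
print(f"P(power-hungry | hypocrite) = {posterior:.2f}")
# 0.63, up from the 0.30 prior: condemning hypocrites is a defensible
# rule of thumb when types are otherwise unobservable.
```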

A reasonable approach for this and other problems that don't seem to suffer from ugly asymptotics would simply be to test them mechanically.

That is to say, it may be more efficient, requiring less brain power, to believe the results of repeated simulations. After walking people through the Monty Hall tree and the statistics, watching them fail to understand either and then end up believing the results of a simulation whose code is straightforward to read, I advocate this method: empirical verification over intuition or mathematics that is fallible (because you yourself are fallible in your understanding, not because it contains a contradiction).
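For the Monty Hall case, the simulation in question can be as short as the following sketch (a minimal version I'd write in Python; any language would do):

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """One round of Monty Hall; returns True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

n = 100_000
for switch in (False, True):
    wins = sum(monty_hall_trial(switch) for _ in range(n))
    print(f"switch={switch}: win rate = {wins / n:.3f}")
# Prints roughly 0.333 for staying and 0.667 for switching -- the code is
# short enough to audit even if the probability tree never clicked.
```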

I don't see this as a valid criticism, if it is intended as a dismissal. The addendum "beware this temptation" is worth highlighting. While this is a point worth making, the response "but someone would have noticed" is shorthand for "if your point were correct, others would likely believe it as well, and I do not see a subset of individuals who are also pointing this out."

Let's say there are ideas that are internally consistent, rational, or good, and ideas that are internally inconsistent, irrational, or bad (the latter, once recognized as such, are not propounded). Each idea comes as a draw from a bin of ideas, with some proportion that are good and some that are bad.

Further, each person has an imperfect signal about whether an idea is good. Finally, we only see ideas that people believe are good, setting the stage for sample selection.

Therefore, when someone is propounding an idea, the fact that you have not heard it before makes it more likely that it has been censored elsewhere--that is, more likely that others judged it bad internally and thus never suggested it. I suggest, as a Bayesian update, that an idea you have never heard before is more likely to be internally inconsistent/irrational/bad than one you hear constantly, the latter having passed many people's internal consistency checks.
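The selection effect can be made explicit with one more application of Bayes' rule. The numbers below are invented, and the checks are assumed independent (which real social endorsements are not), but the direction of the effect survives weaker assumptions:

```python
# Invented numbers: prior share of good ideas in the bin and the accuracy
# of each person's internal "is this idea good?" check. Checks are treated
# as independent, which overstates the effect but shows its direction.
P_GOOD = 0.2          # assumed base rate of good ideas
P_PASS_GOOD = 0.9     # assumed: a good idea passes one person's check
P_PASS_BAD = 0.2      # assumed: a bad idea still slips past one check

def p_good_given_k_checks(k: int) -> float:
    """Posterior that an idea is good after passing k independent checks."""
    num = P_GOOD * P_PASS_GOOD ** k
    den = num + (1 - P_GOOD) * P_PASS_BAD ** k
    return num / den

for k in (1, 2, 5):
    print(f"passed {k} check(s): P(good) = {p_good_given_k_checks(k):.3f}")
# 0.529, 0.835, 0.998: an idea you hear constantly (many endorsements)
# carries far better odds than one propounded by a single person.
```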