Followup to: Sunk Cost Fallacy
(expanded from my comment)
"The world is weary of the past—
O might it die or rest at last!"
— Percy Bysshe Shelley, from "Hellas"
Probability theory and decision theory push us in opposite directions. Induction demands that you cannot forget your past; the sunk cost fallacy demands that you must. Let me explain.
An important part of epistemic rationality is learning to be at home in a material universe. You are not a magical fount of originality and free will; you are a physical system: the same laws that bind the planets in their orbits, also bind you; the same sorts of regularities in these laws that govern the lives of rabbits or aphids, also govern human societies. Indeed, in the last analysis, free will as traditionally conceived is but a confusion—and bind and govern are misleading metaphors at best: what is bound as by ropes can be unbound with, say, a good knife; what is "bound" by "nature"—well, I can hardly finish the sentence, the phrasing being so absurd!
Epistemic rationality alone might be well enough for those of us who simply love truth (who love truthseeking, I mean; the truth itself is usually an abomination), but some of my friends tell me there should be some sort of payoff for all this work of inference. And indeed, there should be: if you know how something works, you might be able to make it work better. Enter instrumental rationality, the art of doing better. We all want to do better, and we all believe that we can do better...
But we should also all know that beliefs require evidence.
Suppose you're an employer interviewing a jobseeker for a position you have open. Examining the jobseeker's application, you see that she was expelled from four schools, was fired from her last three jobs, and was convicted of two felonies. You ask, "Given your record, I regret having let you enter the building. Why on Earth should I hire you?"
And the jobseeker replies, "But all those transgressions are in the past. Sunk costs can't play into my decision theory—it would hardly be helping for me to go sulk in a gutter somewhere. I can only seek to maximize expected utility now, and right now that means working ever so hard for you, O dearest future boss! Tsuyoku naritai!"
And you say, "Why should I believe you?"
Why should you believe yourself? You honestly swear that you're going to change, and this is great. But take the outside view. What good have these oaths done for all the other millions who have sworn them? You might very well be different, but in order to justifiably believe that you're different, you need to have some sort of evidence that you're different. It's not a special question, exempt from the usual demands of evidence; there has to be something about your brain that is different, whether or not you can easily communicate this evidence to others with present technology. What do you have besides the oath? Are you doing research, trying new things, keeping track of results, genuinely searching at long last for something that will actually work?
For if you do succeed, it won't have been a miracle: you should be able to pin down at least approximately the causal factors that got you to where you are. And it has to be a plausible story. You won't really be able to say, "Well, I read all these blogposts about rationality, and that's why I'm such an amazing person now." Compare: "I read the Bible, and that's why I'm such an amazing person now." The words are different, but translated into math, is it really a different story? It could be. But if it is, you should be able to explain further; there has to be some coherent sequence of events that could take place in a material universe, a continuous path through spacetime that took you from there to here. If the blog helped, how specifically did it help? What did it cause you to do that you would not otherwise have done?
This may prove more difficult than it now seems from within your current ignorance: the more you know about the forces that determine you, the less room there is for magical hopes. If you don't know about regression toward the mean, then after a really fantastic day, you're more likely to expect tomorrow to be just as fantastic.
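To make the regression effect concrete, here's a minimal simulation. The model (my illustrative assumption, not anything claimed above) treats each day's "score" as a fixed underlying ability plus independent daily noise; conditioning on a fantastic day, the next day tends to fall back toward the average.

```python
import random

random.seed(0)

# Assumed model: each day's score = fixed ability + independent daily noise.
ability = 0.0

def day_score():
    return ability + random.gauss(0, 1)

# Simulate many pairs of consecutive days.
pairs = [(day_score(), day_score()) for _ in range(100_000)]

# Among "fantastic" first days (score > 2), what does the next day look like?
great = [(a, b) for a, b in pairs if a > 2]
avg_great_day = sum(a for a, b in great) / len(great)
avg_next_day = sum(b for a, b in great) / len(great)

print(f"average fantastic day: {avg_great_day:.2f}")
print(f"average day after:     {avg_next_day:.2f}")  # much closer to the mean
```

Because the noise on consecutive days is independent, an extreme day carries no information about the next one: the day after a fantastic day averages out near the baseline, not near the fantastic score.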
I'm not trying to induce despair with this post; really, I'm not. It is possible to do better; I myself am doing better than I was this time last year. I just think it's important to understand exactly what doing better really involves.
I feel bad blogging about rationality, given that I'm so horribly, ludicrously bad at it. I'm also horribly, ludicrously bad at writing. But it would hardly be helping for me to just shut up in despair—to go sulk in a gutter somewhere. I can only seek to maximize expected utility now, and for now, that apparently means writing the occasional blogpost. Tsuyoku—