Kelly *is* (just) about logarithmic utility

A bettor who can make an unbounded number of positive-expected-value bets is going to outperform one who can make only finitely many.

(any number strictly between 0 and 1)^infinity = 0, i.e. over an infinite series of bets, a naive EV maximizer who stakes everything each time goes broke with probability 1. So the almost-sure outcome is losing your entire stake, i.e. -1x your bet size.
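A quick sketch of the point above (a hypothetical example, not from the comment): even with a 60% chance to double your stake each round, a bettor who goes all-in every time survives 50 rounds with probability 0.6^50 ≈ 8e-12, so ruin is all but certain.

```python
import random

random.seed(0)

def all_in_survives(p_win=0.6, rounds=50):
    """True if an all-in bettor survives `rounds` favorable bets.

    Betting the whole bankroll means a single loss is total ruin.
    """
    for _ in range(rounds):
        if random.random() >= p_win:  # one loss wipes out everything
            return False
    return True

trials = 10_000
survivors = sum(all_in_survives() for _ in range(trials))
print(f"survivors after 50 bets: {survivors} / {trials}")
```

With 10,000 trials the expected number of survivors is about 8e-8, so the printed count is essentially always 0 despite every individual bet being +EV.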

Kelly *is* (just) about logarithmic utility

You're leaving out the geometric compounding of successive bets. Kelly maximizes the expected geometric growth rate of wealth. Therefore, over enough bets, the Kelly bettor almost surely ends up ahead of any essentially different strategy — it maximizes typical (median) wealth, with no appeal to log utility needed.
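The growth-rate framing above can be made concrete with the standard Kelly formula for a simple b-to-1 bet (a textbook sketch, not something from the comment): stake the fraction f* = p − q/b of your bankroll, which maximizes the expected log growth per bet, g(f) = p·log(1 + f·b) + q·log(1 − f).

```python
import math

def kelly_fraction(p, b):
    """Optimal fraction of bankroll to stake on a bet won with
    probability p that pays b-to-1: f* = p - (1 - p) / b."""
    return p - (1 - p) / b

def growth_rate(f, p, b):
    """Expected log growth per bet when staking fraction f."""
    q = 1 - p
    return p * math.log(1 + f * b) + q * math.log(1 - f)

p, b = 0.6, 1.0                 # 60% win probability, even-money payout
f_star = kelly_fraction(p, b)   # 0.6 - 0.4/1 = 0.2
print(f"Kelly fraction:      {f_star:.2f}")
print(f"growth at Kelly:     {growth_rate(f_star, p, b):.4f}")
print(f"growth at half-Kelly:{growth_rate(f_star / 2, p, b):.4f}")
print(f"growth at 2x Kelly:  {growth_rate(2 * f_star, p, b):.4f}")
```

Note that staking twice the Kelly fraction gives a negative growth rate here — over-betting a favorable game compounds you toward ruin, which is exactly why fixed-fraction Kelly beats naive EV maximization in the long run.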

What's your best alternate history utopia?

How far back can I change the institutions? Once you change institutions enough you do change the technology and perhaps the psychology. Something like the Glorious Revolution happening in Rome or any sufficiently centralized ancient state and restoring the Republic on a more sustainable foundation probably results in an early Industrial Revolution. Technology takes off as soon as people see they can benefit from their investment instead of having it all confiscated by an autocratic state, or by bandits if the state is too weak.

Slavery never getting a foothold in the US probably does not change the technological frontier much, but results in the South being as rich and technologically advanced as the North throughout most of US history, in addition to obviously being freer. Also results in more democratic design of the federal government. Probably results in earlier elimination of things like malaria and hookworm from the South, and higher GDP per capita and more secure voting rights there today. More tentatively, makes the US a more appealing model for post-colonial states to design their governments around, so maybe more of them develop separation of powers and end up democratic and prosperous, and fewer ever become communist.

Contact with reality

I dunno about you but I am basically just a pants-wearing monke. I have an intuitive aversion to the experience machine but I think that is tied to my own experience of life and that my monke brain just does not fully understand the hypothetical being put forward and how radically different it is. In life thus far, and in any version of the scenario that seems plausible to me, there are real hedonic costs, or at least risks of costs, to being deceived. Maybe the machine company steals my organs or forgets to feed me or something while I am deluded and unable to care for myself in the real world. And then either I die, or experience much more pain than I would have. I think if I had perfect certainty that the machine would work as advertised I would take it, but my monke brain thinks it's foolish to have such certainty, and in any real-world situation it is probably right. 


I used to think I held truth as a terminal value but now I think that is not even a meaningful thing to state. To quote a famous movie guy, "YOU CAN'T HANDLE THE TRUTH!!!" Or more fundamentally, you can't even KNOW the truth. You can know a few limited truths, but without a brain enhancement so serious that it's questionable whether the resulting being is even you anymore, that knowledge is quite limited. We make fun of pigeons and goldfish for being dumb but we can't even see all the colors they see. Even your vision of the milk right in front of you is a simulation, and a pale one that leaves out details so basic they can be understood by a pigeon. Even limiting it to knowledge humans can actually know, the body of knowledge is too large for one individual to know it all, so you are limited to knowing just a few things that interest you.

While in a simulation, there really is some combination of particles forming the thing you see, so it is "real" in the sense that it is a thing that exists in the universe, even if not the thing your mind thinks it is. There is a sense in which the world outside the machine is more real, but one human's understanding of that world is so limited anyway that I am not sure the warping done by the machine matters, except insofar as it has external consequences.