# All of simonsimonsimon's Comments + Replies

it's not intuitive to me when it's reasonable to apply geometric rationality in an arbitrary context.

e.g. if i offered you a coin flip where i give you $0.01 with p=50% and $100 with q=50%, i get G = sqrt(0.01 × 100) = $1, which like, obviously you would go bankrupt really fast valuing things this way. in kelly logic, i'm instead supposed to take the geometric average of my entire wealth in each scenario, so if i start with $1000, I'm supposed to take sqrt(1000.01 × 1100) ≈ $1048.81, which does the nice, intuitive thing of penalizing me a little vs. l...
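a minimal sketch of the two calculations above (the helper function is mine, just to make the contrast concrete):

```python
import math

def geometric_mean(outcomes, probs):
    """Probability-weighted geometric mean: product of outcome^prob."""
    return math.prod(o ** p for o, p in zip(outcomes, probs))

probs = [0.5, 0.5]

# Valuing the payoffs alone: sqrt(0.01 * 100) = $1 -- you'd go broke
# treating a coin flip between $0.01 and $100 as worth only $1.
naive = geometric_mean([0.01, 100.0], probs)
print(round(naive, 2))    # → 1.0

# Kelly-style: take the geometric mean of total wealth ($1000 bankroll),
# i.e. sqrt(1000.01 * 1100).
wealth = geometric_mean([1000.01, 1100.0], probs)
print(round(wealth, 2))   # → 1048.81
```

the second number is slightly below the arithmetic expectation ($1050.005), which is the "penalizing me a little" part.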

quetzal_rainbow (10mo):
I think the rule is "you maximize your bank account, not the addition to it". I.e. the value of a deal depends on how much you already have.
Vivek Hebbar (10mo):
Another way of looking at this question:  Arithmetic rationality is shift invariant, so you don't have to know your total balance to calculate expected values of bets.  Whereas for geometric rationality, you need to know where the zero point is, since it's not shift invariant.
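the shift-invariance point can be checked directly (numbers are illustrative, not from the comment):

```python
import math

def arith_ev(outcomes, probs):
    """Ordinary expected value."""
    return sum(o * p for o, p in zip(outcomes, probs))

def geom_ev(outcomes, probs):
    """Probability-weighted geometric mean."""
    return math.prod(o ** p for o, p in zip(outcomes, probs))

outcomes, probs = [0.01, 100.0], [0.5, 0.5]
shift = 1000.0  # e.g. your existing bankroll

# Arithmetic EV is shift invariant: adding $1000 everywhere
# moves the EV by exactly $1000, so the bet's value doesn't
# depend on your balance.
a0 = arith_ev(outcomes, probs)
a1 = arith_ev([o + shift for o in outcomes], probs)
# a1 - a0 == 1000 (up to float rounding)

# Geometric EV is not: the same shift moves it by ~$1047.81,
# so where you put the zero point changes the bet's value.
g0 = geom_ev(outcomes, probs)
g1 = geom_ev([o + shift for o in outcomes], probs)
# g1 - g0 ≈ 1047.81, not 1000
```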

We need to align the performance of some large task, a 'pivotal act' that prevents other people from building an unaligned AGI that destroys the world.

What is the argument for why it's not worth pursuing a pivotal act without our own AGI? I certainly would not say it was likely that current human actors could pull it off, but if we are in a "dying with more dignity" context anyway, it doesn't seem like the odds are zero.

My idea, which I'll include more as a demonstration of what I mean than a real proposal, would be to develop a "cause area" fo...

On the off chance we spend some time in a regime where catastrophic actions might be attempted but are both preventable and detectable, it might be a good idea to somehow encourage the creation of a Giant Alarm that alerts previously skeptical experts that a catastrophe almost occurred, and hopefully freaks the right people out.

The steelman version of flailing, I think, is being willing to throw a "hail mary" when you're about to lose anyway. If the expected outcome is already that you die, sometimes an action with naively negative expected value but fat tails can improve your position.

If different hail mary options are mutually exclusive, you definitely want to coordinate to pick the right one and execute it the best you can, but you also need to be willing to go for it at some point.