Optimization Process


Optimization Process's Shortform

Clearly not all - the extreme version of this is betting on human extinction. It's hard to imagine a payout that would have any value after that comes to pass.

Agreed that post-extinction payouts are essentially worthless -- but doesn't the contract "For $90, I will sell you an IOU that pays out $100 in one year if humans aren't extinct" avoid that problem?
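If post-extinction dollars (and everything else) are worthless, the one-armed IOU's break-even point can be read off directly -- ignoring time-discounting for simplicity:

```latex
\$90 = (1 - p_{\text{ext}}) \cdot \$100
\quad\Longrightarrow\quad
p_{\text{ext}} = 0.10
```

That is, the $90 price is fair exactly when you assign a 10% probability to extinction, and nothing ever has to be paid out in the world where the payout would be worthless.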

Optimization Process's Shortform

Some wagers have the problem that their outcome correlates with the value of what's promised. For example, "I bet $90 against your $10 that the dollar will not undergo >1000% inflation in the next ten years": the apparent 9:1 odds don't correspond to the probability of hyperinflation at which you'd be indifferent to this bet, since the dollars you'd collect in the hyperinflation arm are nearly worthless.

For some (all?) of these problematic bets, you can mitigate the problem by making the money change hands in only one arm of the bet, reframing it as e.g. "For $90, I will sell you an IOU that pays out $100 in ten years if the dollar hasn't seen >1000% inflation." (Okay, you'll still need to tweak the numbers for time-discounting purposes, but it seems simpler now that we're conditioning on lack-of-hyperinflation.)
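A minimal sketch of why the one-armed version helps, assuming post-event dollars are worth nothing and ignoring time-discounting (the function names and parameters here are my own, purely for illustration):

```python
def indifference_p_symmetric(my_stake=90.0, their_stake=10.0, post_event_value=0.0):
    """Probability of the bad event at which the symmetric bet breaks even.

    If the event happens, the stake I forfeit is paid in post-event dollars,
    each worth `post_event_value` of a present dollar.
    Indifference: (1 - p) * their_stake == p * my_stake * post_event_value.
    """
    return their_stake / (their_stake + my_stake * post_event_value)


def indifference_p_iou(price=90.0, payout=100.0):
    """Probability of the bad event at which buying the IOU breaks even.

    I pay `price` in present dollars and receive `payout` only if the event
    doesn't happen: price == (1 - p) * payout.
    """
    return 1.0 - price / payout


print(indifference_p_symmetric())  # 1.0: with worthless post-event dollars, take the "bet" at any p
print(indifference_p_iou())        # 0.1: the IOU's price really does encode 9:1 odds
```

When `post_event_value` is 1.0 (the event doesn't touch the currency's value), the symmetric bet's break-even probability is 0.1 as well -- the two framings only come apart when the outcome correlates with the value of the promised payout.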

Does this seem correct in the weak case? ("some")

Does this seem correct in the strong case? ("all")

Heads I Win, Tails?—Never Heard of Her; Or, Selective Reporting and the Tragedy of the Green Rationalists

(Strong approval for this post. Figuring out how to deal with filtered evidence is close to my heart.)

Suppose that the facts relevant to making optimal decisions about an Issue are represented by nine rolls of the Reality die, and that the quality (utility) of Society's decision is proportional to the (base-two logarithm) entropy of the distribution of what facts get heard and discussed.

Sorry -- what distribution are we measuring the entropy of? When I hear "entropy of a distribution," I think $H(p) = -\sum_x p(x) \log_2 p(x)$ -- but it's not clear to me how to get from there to the quantities in the post.
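(For concreteness, here is the definition I have in mind, applied to a uniform distribution over the nine rolls -- my guess at the intended distribution, which gives $\log_2 9 \approx 3.17$ bits:)

```python
import math


def entropy_bits(probs):
    """Base-two Shannon entropy: H(p) = -sum_i p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)


# Uniform distribution over the nine Reality-die outcomes:
print(entropy_bits([1 / 9] * 9))  # log2(9) ≈ 3.1699
```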

Consider .

Goodhart Taxonomy

Very interesting! I like this formalization/categorization.

Hm... I'd have filed "Why the tails come apart" under "Extremal Goodhart": the ellipse image from that post is almost exactly what I was picturing while reading your abstract example for Extremal Goodhart. Is Extremal "just" a special case of Regressional, where that ellipse is a circle? Or am I missing something?