"A Technical Explanation of Technical Explanation" (http://yudkowsky.net/rational/technical) defines a proper rule for a betting game as one where the payoff is maximized by betting an amount proportional to the probability of success.


The first example rule given is that the payoff is one minus the squared error, so for example if you bet .3 on the winner, your payoff is 1-(1-.3)^2 = .51.


This doesn't seem like a good example. It works if there are only two options, but I don't think it works if there are three or more. For example, suppose P(red) = .5, P(blue) = .2, P(green) = .3. If I place bets of .5, .2, and .3 respectively, the expected return is .6. (edit: Fixed a mistake pointed out by Douglas Knight.)
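To make the arithmetic explicit, here's a small sketch (my own, not from the article) of the expected-return computation under the rule "the winning option pays 1 - (1 - bet)^2":

```python
# Expected return when the winning option pays 1 - (1 - bet)^2.
# Probabilities and bets are the ones from the example above.

def expected_return(bets, probs):
    """Sum over outcomes of P(outcome) * payoff for the bet placed on it."""
    return sum(p * (1 - (1 - b) ** 2) for b, p in zip(bets, probs))

probs = [0.5, 0.2, 0.3]          # P(red), P(blue), P(green)

print(round(expected_return([0.5, 0.2, 0.3], probs), 5))    # 0.6
print(round(expected_return([0.51, 0.19, 0.3], probs), 5))  # 0.60173
```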


However, if I place bets of .51, .19, .3, the expected return is .60173. I find that the condition for maximization is


(1-R)P(R) = (1-B)P(B) = (1-G)P(G),


which I got by taking partial derivatives of the expectation, subject to the constraint that the bets sum to 1, and setting them equal to each other. ("R" stands for the bet placed on red and "P(R)" for the probability of red, etc.) This is different from simply R = P(R), etc.
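Under the constraint that the bets sum to 1, that condition can be solved in closed form: (1-b_i)P_i = λ for each option, with λ fixed by the constraint. A quick numerical check (my own sketch, not from the thread) shows the maximizer is not the honest probabilities:

```python
probs = [0.5, 0.2, 0.3]

# The condition (1 - b_i) * p_i = lam for all i, with sum(b_i) = 1,
# gives lam = (n - 1) / sum(1/p_i).
lam = (len(probs) - 1) / sum(1 / p for p in probs)
bets = [1 - lam / p for p in probs]   # optimal bets, not equal to probs

def expected_return(bets, probs):
    return sum(p * (1 - (1 - b) ** 2) for b, p in zip(bets, probs))

print([round(b, 4) for b in bets])             # [0.6129, 0.0323, 0.3548]
print(round(expected_return(bets, probs), 4))  # 0.6129, beats honest 0.6
```

So under this payoff rule the optimal strategy overbets the favorite, which is exactly what makes the rule improper.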


So does the article have a mistake, or do I, or did I miss part of the context?

4 comments

The quadratic scoring rule given by EY in the article only works for binary events. He doesn't discuss multiclass quadratic scoring rules because he advocates the log scoring rule, which doesn't need to be modified when you add more events.
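As a quick check of that claim (my own sketch, not part of the comment): the expected log score, sum_i P_i * ln(q_i), is maximized by reporting q = P, for any number of events:

```python
import math

def expected_log_score(report, probs):
    """Expected log score: sum over outcomes of P(outcome) * ln(report)."""
    return sum(p * math.log(q) for p, q in zip(probs, report))

probs = [0.5, 0.2, 0.3]
honest = expected_log_score(probs, probs)

# Any dishonest report (still summing to 1) does strictly worse.
for report in ([0.51, 0.19, 0.3], [0.4, 0.3, 0.3], [0.6, 0.1, 0.3]):
    assert expected_log_score(report, probs) < honest
```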

In your three-event example, a quadratic score would be something like 3 - (y_r - R)^2 - (y_b - B)^2 - (y_g - G)^2, where y_i = 1 if that color shows up and 0 otherwise.
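Checking that formula numerically (my own sketch): with this score, honest bets do maximize the expectation in the three-color example:

```python
def quadratic_score(bets, winner):
    # 3 minus the sum of squared errors; y_i = 1 for the winner, else 0.
    return 3 - sum(((1.0 if i == winner else 0.0) - b) ** 2
                   for i, b in enumerate(bets))

def expected_score(bets, probs):
    return sum(p * quadratic_score(bets, w) for w, p in enumerate(probs))

probs = [0.5, 0.2, 0.3]
print(round(expected_score(probs, probs), 5))              # 2.38
print(round(expected_score([0.51, 0.19, 0.3], probs), 5))  # 2.3798
```

Unlike the winner-only payoff from the post, the honest report now comes out strictly ahead.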

That sounds better, thank you.

I agree with your calculus. In case EY was only talking about the binary case or made a mistake, I checked Wikipedia, which agrees* with your definition of the quadratic scoring rule and claims that it is strictly proper. It also suggests the spherical scoring rule, where the return for an outcome that you weighted p is p/sqrt(sum of p_i^2), which does seem to work analytically.

However, I don't get the same numbers as you. I get the expected return of .5,.2,.3 as 0.6 and the return of .51,.19,.3 as 0.60173. (but the sign is the same, so it makes little difference)

* Edit: no, actually it does not agree. That's the problem.
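Here's a quick numerical check of the spherical rule mentioned above (my own sketch): the return when outcome w occurs is q_w / sqrt(sum q_i^2), so the expectation is sum p_i q_i / ||q||, which by Cauchy-Schwarz is largest when the report q is proportional to p:

```python
import math

def expected_spherical(report, probs):
    """Expected return under the spherical rule: sum p_i * q_i / ||q||."""
    norm = math.sqrt(sum(q * q for q in report))
    return sum(p * q / norm for p, q in zip(probs, report))

probs = [0.5, 0.2, 0.3]
honest = expected_spherical(probs, probs)   # equals ||p|| = sqrt(0.38)

for report in ([0.51, 0.19, 0.3], [0.4, 0.3, 0.3], [0.6, 0.1, 0.3]):
    assert expected_spherical(report, probs) < honest
```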

Thanks. I realize now that when I calculated the numbers I cited, I left out the payoff from the .3 option since it wasn't changing, and then forgot to add it back in. It's strange that Wikipedia would say that, given this counterexample. If I have some time later I'll check through the sources linked in the article.