Let's say we're having a prediction contest where we want people to pre-register their probability estimates for the outcomes of a set of discrete events (e.g. the candidates for various races in an upcoming election). We have a fixed sum of, say, $100 that we must award to the winner(s) in some way. We are not allowed to award any more or less than that. How can we determine the award in such a way as to incentivize people to honestly report their estimates?

The naïve approach would be to do the following:

  1. Score everyone's predictions according to some proper scoring rule.
  2. Award the $100 to whoever has the highest score.

However, I'm pretty sure this doesn't work. A proper scoring rule incentivizes honest estimates when the goal is to maximize your expected score; but under this system, that isn't the goal, since only the top score wins anything. I'm therefore incentivized to make estimates whose expected score is suboptimal, as long as the score itself is high-variance, because that increases my chance of taking the top spot.
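To make the problem concrete, here's a minimal Monte Carlo sketch (all numbers made up: one binary event, a field of rivals who all report honestly, $100 to the top log score with ties split evenly). The extreme report has a worse expected log score but a far better expected prize:

```python
# Hypothetical setup illustrating the incentive problem with a
# winner-take-all prize: an extreme report beats an honest one
# even though its expected log score is worse.
import random

TRUE_P = 0.6       # everyone agrees "yes" has probability 0.6
N_RIVALS = 10      # rivals who all report honestly
PRIZE = 100.0
TRIALS = 100_000

def expected_winnings(my_report: float) -> float:
    """Average prize when $100 goes to the top log score (ties split)."""
    total = 0.0
    for _ in range(TRIALS):
        outcome_yes = random.random() < TRUE_P
        # Log score is monotonic in the probability assigned to what
        # happened, so we can compare probabilities directly.
        my_prob = my_report if outcome_yes else 1.0 - my_report
        rival_prob = TRUE_P if outcome_yes else 1.0 - TRUE_P
        if my_prob > rival_prob:
            total += PRIZE                      # I win outright
        elif my_prob == rival_prob:
            total += PRIZE / (N_RIVALS + 1)     # tie: split evenly
        # otherwise a rival wins and I get nothing
    return total / TRIALS

print(f"honest report 0.60  -> ${expected_winnings(0.60):.2f} expected")
print(f"extreme report 0.99 -> ${expected_winnings(0.99):.2f} expected")
# Honest reporting yields ~100/11 = $9.09; the extreme report yields
# ~0.6 * $100 = $60, despite its lower expected log score.
```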

Can you figure out a system that avoids this problem?

(Distributing the prize randomly or in partial shares is allowed. However, destroying money or creating extra money is not.)

OK, so if I understand this correctly, the proposed method is:

  1. For each question, determine the log score, i.e. the natural logarithm of the probability that was assigned to the outcome that ended up happening.
  2. Find the total score for each contestant.
  3. For each contestant, find e to the power of his/her total score.
  4. Give each contestant a fraction of the prize equal to that number divided by the sum of that number across all contestants.

(Edit: I suppose it's simpler to just multiply all of each contestant's probabilities together, and distribute the award proportional to that result.)
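If I've got that right, a sketch of the payout rule might look like the following (function names and numbers are my own). Each contestant's share is proportional to the product of the probabilities they assigned to the realized outcomes, which equals e to the power of the total log score:

```python
# Sketch of the proposed payout rule (assumed reading, not the
# poster's code): share is proportional to the product of the
# probabilities each contestant assigned to what actually happened.
import math

def payouts(assigned_probs: dict[str, list[float]],
            prize: float = 100.0) -> dict[str, float]:
    """assigned_probs maps contestant -> probabilities given to the realized outcomes."""
    weights = {name: math.prod(ps) for name, ps in assigned_probs.items()}
    total = sum(weights.values())
    return {name: prize * w / total for name, w in weights.items()}

# Two events; Alice assigned 0.8 and 0.5 to what happened, Bob 0.6 and 0.9.
print(payouts({"alice": [0.8, 0.5], "bob": [0.6, 0.9]}))
# alice: 100 * 0.40/0.94 = $42.55, bob: 100 * 0.54/0.94 = $57.45
```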

Throwaway2367: yes

Dagon

What do you want to reward?  Honest estimates or accurate predictions?  Remember that probability (at least at this level) is in the map, not the territory - there IS no definition of "correct probability" after the result is known.  In other words, things that happen are probability 1, things that don't are probability 0.

If you care about prediction ability, go with the naive mechanism: reward whoever's right. You can still recover an overall probability, because people should predict based on their probability estimate AND the number of people they'll share the prize with if that side wins. So if I think there's a 95% chance of "yes", but 99 of the 100 other predictors are saying "yes", I should predict "no": a 5% chance at $50 (expected $2.50) beats a 95% chance at $1 (expected $0.95). This will move the aggregate closer to our average beliefs.
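If I'm reading the example right, the arithmetic works out as below (the assumption being that the $100 is split evenly among whoever called the outcome):

```python
# Arithmetic behind the example above, under an assumed even split
# of the prize among correct predictors.
PRIZE = 100.0
P_YES = 0.95        # my belief that "yes" happens

yes_side = 99 + 1   # the 99 others plus me, if I join "yes"
no_side = 1 + 1     # the 1 other plus me, if I join "no"

ev_yes = P_YES * PRIZE / yes_side          # 0.95 * $1.00  = $0.95
ev_no = (1 - P_YES) * PRIZE / no_side      # 0.05 * $50.00 = $2.50

print(f"predict yes: ${ev_yes:.2f} expected")
print(f"predict no:  ${ev_no:.2f} expected")
# "no" pays better in expectation despite my 95% belief in "yes".
```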

If you care about honesty, it's trickier. You might be able to get there with some amount of hedging or quantity-wagering. Make the payout split among everyone, weighted by their stated credence in the outcome that occurred. So if the result is "yes", someone who put 75% on "yes" gets 50% more reward than someone who put 50% on it (but if the result is "no", the 75% predictor gets only half as much as the 50% predictor).

 

Weighted by credence means you're scored on probability × prediction, which isn't a proper rule.
If I sincerely believe it's 60:40 between two options and I write that down, I expect 0.6×60 + 0.4×40 = 36 + 16 = 52 payout; but if I write 100:0, I expect 0.6×100 + 0.4×0 = 60. Putting more credence on the higher-probability outcome than I actually have gets me more money.
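A quick check of this point: under the credence-weighted split, the expected payout is linear in the reported number, so it's maximized at an extreme rather than at the true belief (using a scale of 100, as in the example above):

```python
# Expected payout when the reward is proportional to the credence
# stated in the realized outcome; linearity makes extremes optimal.
TRUE_P = 0.60   # my sincere belief in "yes"

def expected_payout(report: float) -> float:
    """Expected payout for stating `report` on "yes" when I believe TRUE_P."""
    return TRUE_P * report + (1 - TRUE_P) * (100 - report)

for r in (50, 60, 75, 100):
    print(f"report {r:>3} -> expected payout {expected_payout(r):.0f}")
# report 60 -> 52, report 100 -> 60: honest reporting is strictly dominated.
```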