Previously: "Test Your Forecasting Ability, Contribute to the Science of Human Judgment" (May 2012), "Get Paid to Train Your Rationality" (August 2011)


Think you have what it takes to make good predictions?  Since 2011, the Good Judgment Project (GJP) has been making predictions on issues of international relations and foreign affairs, recently winning the IARPA (Intelligence Advanced Research Projects Activity) prediction contest.  Predictions from the GJP have been startlingly accurate, outperforming prediction markets and exceeding even optimistic expectations.  It's run by Philip Tetlock, of "foxes and hedgehogs" fame.

From the Monkey Cage article:

How does the Good Judgment Project achieve such strikingly accurate results? The Project uses modern social-science methods ranging from harnessing the wisdom of crowds to prediction markets to putting together teams of forecasters. The GJP research team attributes its success to a blend of getting the right people (i.e., the “right” individual forecasters) on the bus, offering basic tutorials on inferential traps to avoid and best practices to embrace, concentrating the most talented forecasters into super teams, and constantly fine-tuning the aggregation algorithms it uses to combine individual forecasts into a collective prediction on each forecasting question. The Project’s best forecasters are typically talented and highly motivated amateurs, rather than subject matter experts.


But the good news is that you now have a chance to get involved with GJP Season 3 if you think you're a great predictor:

If you enjoy world politics and appreciate a good challenge, consider joining the Good Judgment Project, which has openings right now for Season 3 forecasters. The Project will give you the opportunity to receive training, to get regular feedback on your forecasting accuracy, and to test your forecasting skills against those of some of the most accurate forecasters around. Interested? To find out more and to register, go to


Also cross-posted on my blog.


Comments

I started participating, but got turned off by the ridiculously detailed questions outside my area of expertise. Do I think a sack of rice will fall over when the Ethiopian delegation visits Ecuador in March? How sure am I about my prediction? It doesn't seem to help me achieve better calibration. I'm curious whether people who are participating are getting value out of it, and what kind of value.

I also stopped participating (partway through season 1) because the questions weren't the sort of thing I was interested in.

Same story for me this season. Check out SciCast; I have much higher hopes for it.
