Testing Multiple Forecasters


Economic Theory Workshop (2005-2010)
University of Pennsylvania

3718 Locust Walk
395 McNeil
Philadelphia, PA
United States

Joint with: Yossi Feinberg

We consider a cross-calibration test of predictions by multiple potential experts in a stochastic environment. This test checks whether each expert is calibrated conditional on the predictions made by other experts. We show that this test is good in the sense that a true expert (one informed of the true distribution of the process) is guaranteed to pass the test no matter what the other potential experts do, and false experts will fail the test on all but a small (category one) set of true distributions. Furthermore, even when there is no true expert present, a test similar to cross-calibration cannot be simultaneously manipulated by multiple false experts, but at the cost of failing some true experts.
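The Python sketch below illustrates the idea of conditioning each expert's calibration check on the joint forecast profile of all experts. The binary-outcome setting, the coarse gap statistic, and the example data are illustrative assumptions for exposition, not the paper's formal (asymptotic) test.

```python
# Illustrative sketch of an empirical cross-calibration check for binary
# outcomes: periods are bucketed by the joint forecast profile of all
# experts, and each expert's forecast is compared to the empirical
# frequency within each bucket.
import random
from collections import defaultdict

def cross_calibration_gaps(forecasts, outcomes):
    """forecasts[t] is a tuple giving each expert's predicted probability of
    outcome 1 at period t; outcomes[t] is the realized 0/1 outcome.
    Returns each expert's worst gap between forecast and empirical
    frequency across cells defined by the joint forecast profile."""
    cells = defaultdict(list)
    for profile, y in zip(forecasts, outcomes):
        cells[profile].append(y)
    n_experts = len(forecasts[0])
    gaps = [0.0] * n_experts
    for profile, ys in cells.items():
        freq = sum(ys) / len(ys)          # empirical frequency in this cell
        for i in range(n_experts):
            gaps[i] = max(gaps[i], abs(profile[i] - freq))
    return gaps

# Expert 0 knows the true process (p = 0.8 in odd periods, 0.2 in even ones);
# expert 1 always forecasts 0.5, which is calibrated on its own forecasts
# (overall frequency is about 0.5) but fails once we condition on expert 0's
# predictions, where the frequencies are about 0.8 and 0.2.
random.seed(0)
T = 10000
truth = [0.8 if t % 2 else 0.2 for t in range(T)]
outcomes = [1 if random.random() < p else 0 for p in truth]
forecasts = [(truth[t], 0.5) for t in range(T)]
print(cross_calibration_gaps(forecasts, outcomes))   # expert 0's gap near 0, expert 1's near 0.3
```

In the example, the always-0.5 forecaster would pass an ordinary calibration test, which shows why conditioning on the other expert's predictions is the stronger requirement.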

For more information, contact David Cass.

Colin Stewart

University of Toronto
