Tracking forecasts with Brier scoring
Forecasts awaiting resolution
Past forecasts with outcomes
Long-term performance metrics
Lower is better. 0 = perfect, 0.25 = random guessing, 1 = always wrong.
Do my probability estimates match reality?
This is my public prediction tracker. I make forecasts about future events and assign each one a probability. When a prediction resolves, it's scored using the Brier Score, a proper scoring rule that rewards well-calibrated probability estimates.
The Brier Score for a single prediction is (probability − outcome)², where outcome is 1 if the event happened and 0 if it didn't. The headline score is the mean across all resolved predictions.
A Brier Score below 0.25 means I'm doing better than always guessing 50%. The lower, the better.
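The scoring described above can be sketched in a few lines. This is a minimal illustration, not the tracker's actual code; the sample forecasts are hypothetical.

```python
def brier_score(forecasts):
    """Mean Brier score over resolved predictions.

    Each forecast is a (probability, outcome) pair, where
    outcome is 1 if the event happened and 0 if it didn't.
    Lower is better: 0 is perfect, 0.25 matches always
    guessing 50%, 1 is always confidently wrong.
    """
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Hypothetical resolved forecasts: (assigned probability, outcome)
resolved = [(0.8, 1), (0.2, 0), (0.7, 1), (0.6, 0)]
print(brier_score(resolved))  # 0.1325 — better than chance
```

Note that confident misses are punished hardest: the (0.6, 0) forecast alone contributes 0.36, more than the other three combined.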
When I update a prediction's probability, the change is logged. Click any prediction to see its full revision history.