After years of research on forecasting accuracy, Philip Tetlock concluded that “the average expert was roughly as accurate as a dart-throwing chimpanzee.”
More worrisome is the inverse correlation between fame and accuracy: the more famous a forecasting expert was, the less accurate he was.
This book describes what was learned as Tetlock set out to improve forecasting accuracy with the Good Judgment Project.
Largely in response to colossal US intelligence errors, the Intelligence Advanced Research Projects Activity (IARPA) was created in 2006.
The goal was to fund cutting-edge research with the potential to make the intelligence community smarter and more effective.
Acting on the recommendations of a research report, IARPA sponsored a massive tournament to see who could invent the best methods of making the sorts of forecasts that intelligence analysts make every day.
This tournament provided the experimental basis for rigorously testing the effectiveness of many diverse approaches to forecasting.
And learn they did! Thousands of ordinary citizen volunteers applied, and thousands of them were invited to participate and eventually joined the project.
“Over four years, nearly five hundred questions about international affairs were asked of thousands of Good Judgment Project forecasters, generating well over one million judgments about the future.”
Because fuzzy thinking can never be proven wrong, questions and forecasts were specific enough that the correctness of each forecast could be clearly judged.
These results were used to compute a Brier score (a quantitative assessment of the accuracy of each forecast) for each forecaster.
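For readers unfamiliar with the metric, here is a minimal sketch of how such a score can be computed (my own illustration, not code from the book; the function name and example judgments are hypothetical). It uses the common single-probability form for yes/no questions, where 0 is a perfect score and an unvarying 50/50 guess earns 0.25; the book, as I recall, uses Brier's original two-category form, which simply doubles these values for binary questions.

```python
# Illustrative sketch, not from the book.
# Brier score: mean squared difference between the stated probability
# and what actually happened (1 if the event occurred, 0 if it did not).
# Lower is better; 0.0 is a perfect score.

def brier_score(judgments):
    """judgments: list of (forecast_probability, outcome) pairs, outcome in {0, 1}."""
    return sum((p - o) ** 2 for p, o in judgments) / len(judgments)

# Hypothetical forecaster: three probability judgments and their outcomes.
judgments = [(0.9, 1), (0.2, 0), (0.7, 0)]
print(round(brier_score(judgments), 2))  # 0.18
```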
In the first year, the best forecasters scored extraordinarily well, outperforming the regular forecasters in the tournament by a wide margin.
Remarkably, these amateur superforecasters “performed about 30 percent better than the average for intelligence community analysts who could read intercepts and other secret data.”
This is not just luck: the superforecasters as a whole increased their lead over all other forecasters in subsequent years.
Superforecasters share several traits that set them apart, but more importantly they use many techniques that we can all learn.
Superforecasters have above-average intelligence, are numerically literate, pay attention to emerging world events, and continually learn from their successes and failures.
But perhaps more importantly, they approach forecasting problems using a particular philosophic outlook, thinking style, and methods, combined with a growth mindset and grit.
The specific skills they apply can be taught and learned by anyone who wants to improve their forecasting accuracy.
This is an important book. Forecasting accuracy matters, and the track record has been miserable. Public policy, diplomacy, military action, and financial decisions often depend on forecast accuracy. Getting it wrong, as so often happens, is very costly. The detailed results presented in this book can improve intelligence forecasts, economic forecasts, and other consequential forecasts if we are willing to learn from them.
This is as close to a page-turner as a nonfiction book can get. The book is well written and clearly presented. The many rigorous arguments presented throughout the book are remarkably accessible. Sophisticated quantitative reasoning is well presented using examples, diagrams, and only a bare minimum of elementary mathematical formulas.
Representative evidence from the tournament results supports the clearly argued conclusions. Personal accounts of individual superforecasters add interest and help create an entertaining narrative. An appendix summarizes “Ten Commandments for Aspiring Superforecasters”. Extensive notes allow further investigation; however, the advanced reader edition lacks an index.
Applying the insights presented in this book can help anyone evaluate and improve forecast accuracy.
“Evidence-based policy is a movement modeled on evidence-based medicine.” The book ends with simple advice and a call to action: “All we have to do is get serious about keeping score.”
I first heard of this book on CNN's GPS podcast, but the name "Superforecasting" reminded me of "Superfreakonomics", which in turn reminded me of dubious smart-ass hindsights and caused me to ignore the recommendation.
Tetlock was cited again by Steven Pinker in his book "Enlightenment Now" and that finally got me to pick it up.
Can you really forecast geopolitical events? Surprisingly, yes.
Do you need a special ability to be a "superforecaster"? Not really.
What, then, do you need?
The book describes the methods used by superforecasters and in doing so describes a number of systemic biases in our thinking.
Also, there are many relevant examples, and except for a couple of complex equations (which can be ignored), the author makes his points really well.
This was a fun, fast read that was also satisfying.
To the author's credit, he has finally made me pick up Thinking, Fast and Slow, which I already think will be life-changing as far as books and ideas can be.
This book was solid, though perhaps not quite as good as I hoped/expected. It was interesting reading, full of interesting stories and examples. The author doesn't prescribe a particular method; superforecasting, it appears, is more about a toolbox or set of guidelines that must be used and adapted based on the particular circumstances.
As a result, at times I felt the author's thread was being lost or scattered; however, upon reflection I realized this was part of the nature of making predictions.
On reflection, his guidelines are clear and should be helpful, even if they cannot provide a method for predictions that are always correct.
One critique I had was that the author didn't provide any statistical evidence of why the people he identified as superforecasters were good as opposed to lucky.
I continued to think some of the examples he gave were based on luck, not necessarily skill; the author distilled a lesson that contributed to the success, but I would have had more confidence that his conclusion represented the reason for the superforecasters' success if he had provided more statistical evidence to support it.
Nonetheless, his conclusions/guidelines appear sound, and I plan on using them.
PT's Superforecasting correctly remarks upon the notable failure to track the performance of people who engage in predicting the outcome of political events.
This lack of accountability has led to a situation where punditry amounts to little more than entertainment: extreme positions offered with superficial, one-sided reasoning aimed mainly at flattering the listeners' visceral prejudices.
One problem is that expressed positions are deliberately vague. This makes it easy for the pundit to later requalify his position to conform with the eventual outcome.
For example, a pundit claims quantitative easing will lead to inflation. When consumer inflation doesn't appear, he can claim that it will, given enough time, or that, in fact, there is inflation in stock prices. He never said how much inflation.
Thus, the first task in assessing performance is to require statements of clearly defined, easily measurable criteria.
Once this was done, PT began a series of experiments, testing which personality characteristics and process variables led to good prediction outcomes, both for individuals and for groups.
Key attributes include independence from ideology, an openness to consider a variety of sources and points of view, and a willingness to change one's mind.
Native intelligence, numeracy, and practical experience with logical thinking all correlate positively with prediction accuracy, at least up to a point.
But moderately intelligent and diligent individuals can often surpass the super bright, who sometimes show a tendency to be blinded by their own rhetoric.
And some "superforecasters" consistently outperform professionals with access to privileged information. The chapter on how to get a group to function well together is especially applicable to business management.
PT wrote his book in a middlebrow style, and anyone already familiar with basic psychology writing, e.g. from D. Kahneman, will often feel annoyed by his long and overly folksy explanations. Indeed, while it has good things to say about applied epistemology, it isn't necessary to read all of its pages.
A good alternative starting point would be to consult Evan Gaensbauer's review at the Less Wrong website: lesswrong.com/posts/dvYeS
Great look at the biases we fall prey to when attempting to predict and understand what the hell is going to happen.
Most people suck at prediction, but the absence of accountability means there is no incentive to improve.
And in many situations, incentives push against accuracy in favor of defensible wrongness.
Feels a bit dated already, as the convergence between tech and predictions is absent.
There's a great book waiting to be written that takes all the strengths of Superforecasting and combines them with an in-depth knowledge of machine learning and predictive modeling.
Overall, this book delivers solid research, connects that to strong conclusions, and presents actionable insights to improve your thinking.
All while being delightfully self-aware. Highly recommend to anyone who likes to metacognate.
I usually rank my favorite books on a line between “extremely readable” and “very useful”. This one is probably among my Top 3 most useful books ever. The other two are Kahneman's “Thinking, Fast and Slow” and Taleb's “Black Swan”.
You don't have to agree with the author on everything, but you will still get dozens of truly important facts that can fundamentally affect your life.
Don't be misled by the title: you really have to read this book even if you don't have the ambition to predict stock prices or revolutions in the Arab world.
Whether we like it or not, we are all forecasters, making important life decisions such as changing career paths or choosing a partner based on dubious, personal forecasts.
This book will show you how to dramatically improve those forecasts based on the data and experience of the most successful forecasters.
You'll be surprised that those experts usually aren't CIA analysts or skilled journalists, but ordinary intelligent people who know how to avoid the most common biases, suppress their ego, and systematically assess things from different angles.
We will never be able to make perfect predictions, but at least we can learn from the very best.