- Hardcover: 352 pages
- Publisher: Signal (Sept. 29 2015)
- Language: English
- ISBN-10: 0771070527
- ISBN-13: 978-0771070525
- Product Dimensions: 16.5 x 3 x 24.5 cm
- Shipping Weight: 717 g
- Average Customer Review: 26 customer reviews
Amazon Bestsellers Rank:
#7,853 in Books (See Top 100 in Books)
- #3 in Books > Business & Investing > Management & Leadership > Planning & Forecasting
- #3 in Books > Professional & Technical > Business Management > Management & Leadership > Planning & Forecasting
- #34 in Books > Professional & Technical > Professional Science > Behavioural Sciences > Cognitive Psychology
Superforecasting: The Art and Science of Prediction Hardcover – Sep 29 2015
• "Tetlock's work is fascinating and important, and he and Gardner have written it up here with verve." --The Financial Times
• "Superforecasting is the most important scientific study I've ever read on prediction." --The Bloomberg View
About the Author
PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, and the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics.
DAN GARDNER is a journalist and the author of Risk: The Science and Politics of Fear and Future Babble: Why Expert Predictions Fail -- and Why We Believe Them Anyway. He resides in Ottawa.
Top customer reviews
What I really like about this book is that it contains lessons on how to think analytically. In my view, these lessons can be transposed to a variety of contexts. For example, I think the book could be useful to a lawyer who wants to better predict the outcomes of cases, or a manager who wants to better understand the effect of certain policies or strategies.
I suppose if you don't have an interest in the subject, the book might be difficult to get through. I personally have an interest in the subject and found the subject matter to be quite interesting, and consequently I found the book to be extremely readable. I don't know if the book will improve my forecasting skills, but I do think it helped me to refine my thinking - and in any event, it was an entertaining read. Highly recommended.
Note: This insight goes back at least as far as Herbert Simon's 1960 essay, "The Corporation: Will It Be Managed by Machines?," in Management and Corporations, 1985, M. Anshen and G. L. Bach, eds. (New York: McGraw-Hill, 1960), pp. 17-55.
As I began to read Superforecasting, I was again reminded that so-called "experts" working with a computer tend to make better predictions than can either a computer or another human being (or group) working without one. As was the case with Philip Tetlock's previously published book, Expert Political Judgment: How Good Is It? How Can We Know?, he and Dan Gardner have collaborated on a book that is evidence-driven rather than theory-driven. That's a key point. (Please see pages 291-328.) I agree with another reviewer, Dr. Frank Stechon, who suggests that Tetlock shows two key points conclusively. First, the best experts in making political estimates and forecasts are no more accurate than fairly simple mathematical models of their estimative processes. This is yet another confirmation of what Robyn Dawes termed "the robust beauty of simple linear models." The inability of human experts to outperform models based on their own expertise has been demonstrated in over one hundred fields of expertise across fifty years of research, making it one of the most robust findings in social science.
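To make Dawes' point concrete, here is a minimal sketch (not from the book) of an "improper linear model": rather than fitting regression weights, each predictive cue is standardized and weighted equally, and the sum often rivals expert judgment. The applicant names and cue names below are hypothetical.

```python
# A sketch of Robyn Dawes' "improper linear model": equal-weight sum
# of standardized (z-scored) cues, with no fitted coefficients.
import statistics

def z_scores(values):
    """Standardize a column of raw cue values across all candidates."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def improper_linear_rank(candidates):
    """Score each candidate as the equal-weight sum of z-scored cues."""
    cue_names = list(next(iter(candidates.values())).keys())
    # z-score each cue column across candidates, then sum per candidate
    columns = {c: z_scores([candidates[name][c] for name in candidates])
               for c in cue_names}
    scores = {}
    for i, name in enumerate(candidates):
        scores[name] = sum(columns[c][i] for c in cue_names)
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Hypothetical applicants, each scored on two cues
applicants = {
    "A": {"test_score": 80, "experience_yrs": 2},
    "B": {"test_score": 60, "experience_yrs": 10},
    "C": {"test_score": 70, "experience_yrs": 5},
}
ranking = improper_linear_rank(applicants)
```

The point of the "improper" weighting is robustness: because every cue counts equally, the model cannot overfit idiosyncrasies the way a human judge (or an overfit regression) can.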
Tetlock and Gardner are convinced -- and I agree -- that "we will need to blend computer-based forecasting and subjective judgment in the future. So it's time to get serious about both." Obviously, superior judgment by an individual or group, blended with superior technology, is the ideal combination. In one of Tom Davenport's recent books, Judgment Calls, he and co-author Brooke Manville offer "an antidote for the Great Man theory of decision making and organizational performance": *organizational judgment*. That is, "the collective capacity to make good calls and wise moves when the need for them exceeds the scope of any single leader's direct control."
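One simple way this blending shows up in the research around Tetlock's Good Judgment Project is in how many subjective probability forecasts get pooled by an algorithm: average them, then "extremize" the average to offset the dampening effect of pooling. The sketch below is illustrative, not the project's actual pipeline; the exponent `a` is an assumed tuning parameter.

```python
# Hedged illustration of extremized averaging of probability forecasts:
# take the mean of several forecasters' probabilities, then push the
# pooled estimate away from 0.5. The exponent a is a tuning parameter
# assumed here for illustration (it would normally be fit to data).
def extremized_mean(probs, a=2.5):
    p = sum(probs) / len(probs)           # plain average forecast
    num = p ** a
    return num / (num + (1 - p) ** a)     # extremized pooled estimate

forecasts = [0.6, 0.7, 0.65, 0.8]   # hypothetical forecaster estimates
pooled = extremized_mean(forecasts)  # larger than the plain mean 0.6875
```

The intuition: individual forecasters each hold only part of the available evidence, so their average is systematically too close to 0.5; extremizing restores some of the confidence the group's combined information would justify.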
These are among the several dozen passages of greatest interest and value to me in Chapters 1-7, also listed to suggest the scope of Tetlock and Gardner’s coverage:
o The Skeptic (Pages 6-10)
o The Optimist (10-20)
o Blind Men Arguing (25-30)
o Thinking About Thinking (33-39)
o Blinking and Thinking (41-45)
o Judging Judgments (52-65)
o Expert Political Judgment, and, And the Results... (66-72)
o Resisting Gravity -- But for How Long? (96-104)
o Fermi-Ize (110-114)
o Outside First (117-120)
o Thesis, Antithesis, Synthesis (121-124)
o Where's Osama? (130-134)
o Probability for the Stone Age (137-140)
o Probability for the Information Age (140-143)
o But What Does It All Mean? (147-152)
o The Over-Under (156-158)
o Under, and, Over (159-166)
As indicated, the information, insights, and counsel that Philip Tetlock and Dan Gardner provide in this volume are based on rigorous and extensive research into the art and science of forecasting. While re-reading the book prior to setting to work on this brief commentary, I first re-read the Appendix, "Ten Commandments for Aspiring Superforecasters." I presume to suggest that those about to read the book for the first time do the same (I wish I had), because this material provides a superb framework -- a context and frame of reference -- for the lively and eloquent narrative developed within twelve substantial chapters. Here are the concluding remarks: "Guidelines [not predictions] are the best we can do in a world where nothing is certain or exactly repeatable. Superforecasting requires constant mindfulness, even when -- perhaps especially when -- you are dutifully trying to follow these commandments."
In this context, I am again reminded of these words of caution expressed by Nassim Nicholas Taleb in The Black Swan: The Impact of the Highly Improbable: “It has been more profitable for us to bind together in the wrong direction than to be alone in the right one. Those who have followed the assertive idiot rather than the introspective wise person have passed us some of their genes. This is apparent from a social pathology: psychopaths rally followers.”
* * *
Those who share my high regard for this book are urged to check out the aforementioned MIT Urban Planning Report, Dancing with Robots, as well as these additional sources: Daniel Kahneman's Thinking, Fast and Slow; Taleb's aforementioned work, The Black Swan: Second Edition: The Impact of the Highly Improbable: With a new section: "On Robustness and Fragility" (Incerto); and Nate Silver's The Signal and the Noise: Why So Many Predictions Fail - But Some Don't.
Look for similar items by category
- Books > Business & Investing > Management & Leadership > Planning & Forecasting
- Books > Professional & Technical > Business Management > Management & Leadership > Planning & Forecasting
- Books > Professional & Technical > Professional Science > Behavioural Sciences > Cognitive Psychology
- Books > Professional & Technical > Professional Science > Behavioural Sciences > Cognitive Science