Enlightenment Now: The Case for Reason, Science, Humanism, and Progress

Twenty years and twenty-eight thousand predictions later, how well did the experts do? On average, about as well as a chimpanzee (which Tetlock described as throwing darts rather than picking bananas). Tetlock and the psychologist Barbara Mellers held a rematch between 2011 and 2015 in which they recruited several thousand contestants to take part in a forecasting tournament held by the Intelligence Advanced Research Projects Activity (the research organization of the federation of American intelligence agencies). Once again there was plenty of dart-throwing, but in both tournaments the couple could pick out “superforecasters” who performed not just better than chimps and pundits, but better than professional intelligence officers with access to classified information, better than prediction markets, and not too far from the theoretical maximum. How can we explain this apparent clairvoyance? (For a year out, that is: accuracy declines with distance into the future, and it falls to the level of chance around five years out.) The answers are clear and profound.

The forecasters who did the worst were the ones with Big Ideas—left-wing or right-wing, optimistic or pessimistic—which they held with an inspiring (but misguided) confidence:

As ideologically diverse as they were, they were united by the fact that their thinking was so ideological. They sought to squeeze complex problems into the preferred cause-effect templates and treated what did not fit as irrelevant distractions. Allergic to wishy-washy answers, they kept pushing their analyses to the limit (and then some), using terms like “furthermore” and “moreover” while piling up reasons why they were right and others wrong. As a result, they were unusually confident and likelier to declare things “impossible” or “certain.” Committed to their conclusions, they were reluctant to change their minds even when their predictions clearly failed. They would tell us, “Just wait.”46

Indeed, the very traits that put these experts in the public eye made them the worst at prediction. The more famous they were, and the closer the event was to their area of expertise, the less accurate their predictions turned out to be. But the chimplike success of brand-name ideologues does not mean that “experts” are worthless and we should distrust elites. It’s that we need to revise our concept of an expert. Tetlock’s superforecasters were:

pragmatic experts who drew on many analytical tools, with the choice of tool hinging on the particular problem they faced. These experts gathered as much information from as many sources as they could. When thinking, they often shifted mental gears, sprinkling their speech with transition markers such as “however,” “but,” “although,” and “on the other hand.” They talked about possibilities and probabilities, not certainties. And while no one likes to say “I was wrong,” these experts more readily admitted it and changed their minds.47

Successful prediction is the revenge of the nerds. Superforecasters are intelligent but not necessarily brilliant, falling just in the top fifth of the population. They are highly numerate, not in the sense of being math whizzes but in the sense of comfortably thinking in guesstimates. They have personality traits that psychologists call “openness to experience” (intellectual curiosity and a taste for variety), “need for cognition” (pleasure taken in intellectual activity), and “integrative complexity” (appreciating uncertainty and seeing multiple sides). They are anti-impulsive, distrusting their first gut feeling. They are neither left-wing nor right-wing. They aren’t necessarily humble about their abilities, but they are humble about particular beliefs, treating them as “hypotheses to be tested, not treasures to be guarded.” They constantly ask themselves, “Are there holes in this reasoning? Should I be looking for something else to fill this in? Would I be convinced by this if I were somebody else?” They are aware of cognitive blind spots like the Availability and confirmation biases, and they discipline themselves to avoid them. They display what the psychologist Jonathan Baron calls “active open-mindedness,” with opinions such as these:48

People should take into consideration evidence that goes against their beliefs. [Agree]

It is more useful to pay attention to those who disagree with you than to pay attention to those who agree. [Agree]

Changing your mind is a sign of weakness. [Disagree]

Intuition is the best guide in making decisions. [Disagree]

It is important to persevere in your beliefs even when evidence is brought to bear against them. [Disagree]
