Maybe not so much. Jonah Lehrer, contributing editor at Wired and blogger at The Frontal Cortex, writes:
In the early 1980s, Philip Tetlock at UC Berkeley picked two hundred and eighty-four people who made their living “commenting or offering advice on political and economic trends” and began asking them to make predictions about future events. He had a long list of pertinent questions. Would George Bush be re-elected? Would there be a peaceful end to apartheid in South Africa? Would Quebec secede from Canada? Would the dot-com bubble burst? In each case, the pundits were asked to rate the probability of several possible outcomes. Tetlock then interrogated the pundits about their thought process, so that he could better understand how they made up their minds. By the end of the study, Tetlock had quantified 82,361 different predictions.
After Tetlock tallied up the data, the predictive failures of the pundits became obvious. Although they were paid for their keen insights into world affairs, they tended to perform worse than random chance. Most of Tetlock’s questions had three possible answers; the pundits, on average, selected the right answer less than 33 percent of the time. In other words, a dart-throwing chimp would have beaten the vast majority of professionals. Tetlock also found that the most famous pundits in his study tended to be the least accurate, consistently churning out overblown and overconfident forecasts. Eminence was a handicap.
Lehrer worries that bad expert advice can “reliably tamp down activity in brain regions” that monitor errors and mistakes.
For more on Tetlock’s study, go here.
I am skeptical of the mechanistic, brain-based explanation Lehrer offers. People often believe things because the social rewards of belief are greater than the social rewards of disbelief.
For example, if I said that I didn’t believe that polar bear numbers are drastically decreasing (see also here), some people out there would assume that I enjoy torturing kittens on my break, and would not accept my view as a considered judgement. And if they can find a pundit to back them up, that is all they need. The problem is that they then vote for public policy that might not work out the way they hope.
Here is an example:
The United States now bans the import of polar bear skins, to protect the bears. I am sure many New York socialites toasted the decision. Maybe prematurely. The former practice of the Inuit (Canadian Eskimos) was to sell their bear tag (the ancestral right to kill a bear) to a wealthy American hunter, an important source of income for communities much poorer than the socialites’.
Much of the time, the American never shoots a bear and just goes home. The tag is then forfeit. So the number of tags in circulation is not equal to the number of dead bears, and is a poor predictor of population size. However, with no American market, the local hunter keeps his tag – and keeps on trying until he shoots a bear. For more on this and related issues around counting bears, see this Maclean’s article. (Maclean’s is our national news magazine.)
I suppose it is some consolation that the number of tags in circulation will now correlate more closely with known bear deaths. Of course, more bears will die, but that is a small price to pay for the opportunity to be guided by the pundit.
Hat tip: Stephanie West Allen at Brains on Purpose