According to Harvard biologist George Wald, civilisation will end within 15-30 years unless immediate action is taken. Ecologist Paul Ehrlich says population growth means up to 200 million people will starve to death over the next ten years, including 65 million Americans. Furthermore, all the world's oil will be gone within 30 years, if we are to believe ecologist Kenneth Watt. All sounds pretty plausible, right?
Except these predictions were made in 1970.
And you don’t have to look far to find other experts who should know better being spectacularly wrong.
Lord Kelvin declared in 1896 that X-rays would prove to be a hoax.
In 1903 the president of the Michigan Savings Bank advised Horace Rackham, Henry Ford's lawyer, that the horse was here to stay but the automobile was only a fad.
Nuclear energy will never be obtainable, said Einstein in 1932.
In 2007, Microsoft CEO Steve Ballmer claimed the iPhone had no chance of success.
It’s easy to find lots more examples like these. And perhaps I’m just cherry-picking failed predictions to support my point. Well, yes… But.
There is a remarkable experiment conducted by psychologist Philip Tetlock that tested experts' ability to forecast future events, and the results are depressing – though not without a glimmer of hope.
Tetlock gathered 284 experts from all kinds of academic, political, business and intelligence fields and had them make predictions – around 28,000 forecasts on a wide range of subjects. And then he waited to see how many of these predictions came true. He waited for twenty years.
The results were shocking. The average expert performed no better than chance and was, in Tetlock's memorable phrase, roughly on a par with a chimp throwing darts at a board. There was no correlation between the accuracy of an expert's forecasts and their political opinions, gender, access to classified intelligence or level of education. There was no doubt: the average expert's forecasts are simply rubbish. Which raises the question of why so-called experts get things so spectacularly wrong.
Why are those experts full of shit?
For one thing, predicting changes in society, climate and the economy is hard because they’re complex systems. There are impossibly many variables to take into account, and the world is so complicated, chaotic and interrelated that small changes ripple out in unpredictable ways.
Computer models don’t help. They reflect the biases of the programmers and whoever’s funding the research and, like statistics, can be tweaked and adjusted until they cough up the desired result. Computer models are the modern equivalent of the Romans predicting the course of a battle by slaughtering an animal and examining its entrails. Haruspication is what that’s called, and the soothsayers who performed these grisly forecasts were the sage experts of their day. They were THE SCIENCE.
Add to that the fact that humans are subject to psychological biases like confirmation bias (we seek out evidence that backs up what we already believe and discount evidence that doesn’t) and groupthink (there’s enormous social pressure not to rock the boat and to keep your dirty dissident doubts to yourself). Experts are just as susceptible to these kinks in our thinking as the rest of us.
And of course, we like certainty and dislike nuance and ambiguity. To take a controversial example, any discussion of the war in Ukraine is restricted to simplistic black-and-white analysis – it’s all the fault of everyone’s favourite pantomime villain Big Bad Vlad. It’s Putler’s war, and no further discussion of NATO or the Ukrainian government’s roles in contributing to or provoking the conflict is needed. Sanctions plus weapons – that will solve the problem. And if it doesn’t, well, it just means we need more sanctions and more weapons. If it still doesn’t work, repeat until it does. Journalists who discuss the complexities of this terrible war are likely to find themselves demonetised.[i] Stick to the script and keep it simple. Nuance is confusing and distracts from THE MESSAGING.
Or to take another controversial example, the covid vaccination. Vaccines are a good thing, therefore every vaccine must be a good thing, so anyone who questions the need for, or the wisdom of, the current campaign to inject everyone with leaky vaccines for an illness that is mild for most people is an anti-vaxxer, and that makes them anti-science, a bad person… Nothing to see here, no need to investigate. The experts are certain it’s all fine. Nuance and skepticism are confusing and distract from THE MESSAGING.
The Fox and the Hedgehog
However, there is some room for optimism in the results of Tetlock’s study of expert predictions. Although the average expert was very poor at making predictions, the complete picture had a little more nuance. One cohort of experts showed modest insight and did slightly better than chance. The other group, though, did even worse than chance – those are the proverbial dart-throwing chimps. Tetlock characterised these two groups as foxes and hedgehogs, after a fragment by the ancient Greek poet Archilochus: the fox knows many things, but the hedgehog knows one big thing. In other words, people’s style of thinking has a profound effect on their accuracy when it comes to assessing the present and forecasting the future. Hedgehogs are hopeless at predicting the future, whereas foxes are much better (though still far from perfect). The difference comes down to their thinking styles.
The hedgehog knows one big thing, and he applies that one idea to every situation, which can blind him to contradictory evidence. Hedgehogs prefer simplicity and clarity. They like black-and-white answers and are uncomfortable with uncertain, probabilistic ones. Because they have one big idea that they are convinced is correct, they tend to be certain in their opinions as well as reluctant to change their minds. Big Bad Vlad. Safe and effective. Trust the science.
Foxes, on the other hand, know many little things. In other words, they have different analytical tools that they can apply to different situations. They are comfortable with complexity and accept uncertainty as unavoidable. They tend to be humble and self-critical and have an ability to question their own opinions, which makes them readier to change their minds if they have to.
Tetlock’s research demonstrates that experts who think like hedgehogs are more likely to be wrong and yet feel more certain that they are right. Experts who think like foxes are less certain that they are right, but they are more likely to actually be right. However, the experts who are more likely to be famous are hedgehogs. This is because they offer certainty and simplicity – just what we like.
This paradox is a recipe for disaster. The experts who are most certain they are right are the most likely to be wrong – and they are also the ones who are most famous and whose opinions we hear the most.[ii]
So the next time you see famous media pundits predicting this or that dire consequence and offering simplistic analyses of complex, multi-layered problems, it’s worth remembering this important insight.
They’re full of shit.
Winston Churchill’s mischievous definition of expertise sums it up best:
“It’s the ability to foretell what will happen tomorrow, next month and next year – and to explain afterwards why it did not happen.”
[i] See for example consortiumnews.com
[ii] See Dan Gardner Future Babble (London: Virgin 2010) for more on this experiment.
Cartoon by Bob Moran