Monthly Archives: July 2013

Talking About Risk

Drew Rae gave a talk to Edinburgh Skeptics (sic) on “Dealing Reasonably with Irrational Fear”. At least, that was meant to be the topic, but his talk was more about how bad people (and that’s all of us…) are at judging and assessing risk, rather than dealing with it.

It was very interesting, though: given how often we deal with risk, it never fails to amaze me how bad we all are at estimating it – and as Drew pointed out, even those whose job it is to measure risk can be wildly wrong: he mentioned an experiment in which professional risk assessors were asked to assess the same risky scenario, and their answers differed by several orders of magnitude.

Much of this covered similar ground to Daniel Kahneman’s “Thinking, Fast and Slow” (though in my case, reading slow, too!) in examining the biases we all have in assessing data. We are overconfident in our own abilities and misconstrue the evidence. We even make up evidence (unwittingly – I hope) to fit our beliefs.

Much of this is down to the difficulty of working with low probability events: Drew showed how our uncertainty about the probability of a rare event is often much, much greater than the probability itself.
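That point can be made concrete with the statistical “rule of three” (my own illustration, not something from the talk): if n independent trials produce zero failures, a 95% upper confidence bound on the true failure probability is still roughly 3/n – so the naive estimate (zero) and the plausible range differ enormously.

```python
# Sketch (mine, not Drew's): the "rule of three" for rare events.
# After n trials with zero failures, the true failure probability
# could still plausibly be as high as about 3/n (95% confidence).

def rule_of_three_upper_bound(n_trials: int) -> float:
    """Approximate 95% upper bound on p after n trials with 0 events."""
    return 3.0 / n_trials

for n in (100, 10_000, 1_000_000):
    bound = rule_of_three_upper_bound(n)
    print(f"{n:>9,} trials, 0 events seen: p could still be up to {bound:.6f}")
```

So even a long accident-free run pins down the underlying probability only loosely – which is exactly why estimates of rare-event risk vary so widely.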

The way we consider risk also depends a lot on the way we frame the question, and our emotional response to it. People use quick rules of thumb – heuristics – to gauge probability, and whilst these might work well in everyday situations, they let us down badly when considering rare events.

As well as “judgement errors” arising from our heuristic mental models, Drew described how even professional risk managers make large systematic errors in assessing risk. For instance, they can forget or ignore whole categories of hazard, based on their own biases, and greatly overestimate their ability to predict the categories of hazard they do include. Their overconfidence stems from a certainty that their data are correct, an overestimation of the efficacy of safeguards, and – somewhat shockingly – a reliance on incorrect and untested assumptions, particularly regarding the independence of unlikely events.

For instance, everyone knows that multi-engined aircraft are designed to fly on fewer engines than they have: spare capacity is built into the system, so that a plane can land even if one or more of its engines fails. Which is fine, until you consider that the likelihood of engine failure may depend on the experience and skill of the person maintaining the engines; and, typically, the same engineer will maintain all of an aircraft’s engines. If they screw up on one engine, they are likely to screw up on more than one: the failure of one engine is not independent of the failure of the others.

Indeed, Rae reckoned that aircraft rely on so many incompletely tested systems that our confidence in aircraft remaining in the air is largely “faith-based”! He reckoned that for any one aircraft, the likelihood of an accident is one in 10,000 years – and that to get adequate data to test this, we’d need to test aircraft for 20,000 years. Instead, we are happy to fly.

Aircraft are much safer than cars, though. If we applied the same standards of safety and maintenance to our cars as we do to aircraft engines, Rae reckoned we’d never drive: the processes would be far too cumbersome. In the UK, there are approximately 300,000 casualties and 3,500 deaths on our roads each year. I can’t find comparable data for deaths due to aircraft in the UK – perhaps because of their rarity – but Wikipedia lists data from ACRO averaging 1,186 pa for the whole world.

Much depends on how one states risk. Figures for the USA show that an individual has a 1/7,700 chance of dying in a road accident in a year, a 1/306,000 chance of dying in a train accident, and a 1/2,067,000 chance of dying in a plane crash. That is, you are forty times more likely to be killed in a car crash than a train crash, and over 250 times more likely to be killed in a car crash than in a plane crash.
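Just to check the arithmetic, those ratios follow directly from the annual odds quoted (a quick sketch using the numbers from the text):

```python
# Re-deriving the comparisons from the quoted US annual odds:
# a "1 in N" chance of dying per year, for each mode of transport.
odds = {"car": 7_700, "train": 306_000, "plane": 2_067_000}

car_vs_train = odds["train"] / odds["car"]   # ~40x
car_vs_plane = odds["plane"] / odds["car"]   # ~268x

print(f"car vs train: {car_vs_train:.0f}x more likely to die")
print(f"car vs plane: {car_vs_plane:.0f}x more likely to die")
```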

On the other hand, in terms of miles travelled, cars and trains have the same risk (1.3 deaths per 100 million vehicle miles) and planes slightly higher (1.9 deaths per 100 million aircraft miles). So cars are either safer than planes – or more dangerous, depending on your point of view. But a caveat: these figures are for the period 1999-2004, and therefore include the deaths of those on the planes involved in the terrorist attacks on the World Trade Center (a footnote says that the other deaths in 9/11 have been excluded from the figures). Wikipedia has rather different figures for the same measure, worldwide: deaths per billion kilometres travelled for air is 0.05, for rail 0.6, and for car 3.1; but neither sources nor the period covered are given.
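Part of the gap between the two sources is just units. Converting the US figures (deaths per 100 million miles) into Wikipedia’s unit (deaths per billion kilometres) – a back-of-the-envelope sketch – shows they still disagree substantially:

```python
# Back-of-the-envelope unit conversion: US figures are deaths per
# 100 million vehicle/aircraft miles; Wikipedia's are per billion km.
MILES_TO_KM = 1.609344

def per_billion_km(deaths_per_100m_miles: float) -> float:
    deaths_per_km = deaths_per_100m_miles / (100e6 * MILES_TO_KM)
    return deaths_per_km * 1e9

print(f"car:   {per_billion_km(1.3):.1f} deaths per billion km (Wikipedia: 3.1)")
print(f"plane: {per_billion_km(1.9):.1f} deaths per billion km (Wikipedia: 0.05)")
```

The numbers remain far apart even in matching units; some of the remaining gap may be down to vehicle-miles versus passenger-kilometres, but neither source says.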

[It has been said that more people may have died as a result of an increased aversion to flying in the USA following the terrorist attacks of 9/11 than in the attacks themselves: in the USA, road deaths increased by 1,500 in the year after 9/11, and it is easy to envisage that much of this rise came from people driving instead of flying. (This paper attributes 1,200 more deaths on American roads to 9/11 in the year after the attacks, whilst this one estimates that, over time, a total of 2,170 deaths could be attributable to changes of behaviour following 9/11 [PDF].) We dramatically overestimate the risk of dying in a terrorist attack. Terrorist attacks are very rare events, at least in western Europe and the USA.]

We let our fears influence our perception of risk, too. Take nuclear power. There have been approximately five thousand deaths as a result of accidents at nuclear power plants (with Chernobyl responsible for four thousand of those). Coal-fired power stations result in 22,000 premature deaths each year in Europe alone. For the USA, it’s 13,000 deaths pa [PDF]; it would be fair to assume significantly more in China and Russia. Let’s be conservative and say 50,000 premature deaths worldwide from coal. But the perception is that nuclear power is dangerous.
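The comparison is starker when you notice the units differ: the nuclear figure is a cumulative total, the coal figures are per year (trivial arithmetic of mine, using the numbers as quoted above):

```python
# Using the figures quoted in the text (my arithmetic, their numbers).
nuclear_deaths_total = 5_000            # all nuclear accidents, ever
coal_deaths_per_year = 22_000 + 13_000  # Europe + USA premature deaths, annually

weeks_to_match = 52 * nuclear_deaths_total / coal_deaths_per_year

print(f"coal (Europe + USA alone): ~{coal_deaths_per_year:,} deaths per year")
print(f"nuclear accidents:         ~{nuclear_deaths_total:,} deaths in total")
print(f"coal matches nuclear's all-time toll roughly every {weeks_to_match:.0f} weeks")
```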

Indeed, analysis of worldwide deaths from a variety of energy sources shows that per unit of energy (terawatt-hours), nuclear energy is the safest source of energy. (Wind doesn’t feature.)

There have to be caveats, of course: if the dangers from nuclear waste – which remains hazardous for hundreds of thousands of years (according to Greenpeace) – are included, the picture could well be different.

Rae summarised the influences – biases – over our judgement of risks:

  • framing – how we ask the questions changes the results
  • familiarity – the more (we think) we understand something, the less risky it seems – the less scary, even – but also the more common we might think it, too
  • the extent to which the risk is voluntary – and hence how we feel we can influence it
  • perceived ability to control the risk (and 93% of drivers think they are above average…)
  • a preference to eliminate rather than just reduce risk
  • and a “dread factor” – an irrational fear: of nuclear power, carcinogenic compounds, genetically modified organisms, and many, many other things.

I reckon the media has a lot to answer for: news headlines make unlikely events appear more common than they are, headlines stick in our memories, and media campaigns can last months and have a disproportionate effect.