Fears of further terrorist attacks have forced people to weigh up life's dangers. CHARLES ARTHUR talks to a risk expert.
When he was a child growing up in Toronto, John Adams recalls that he forswore the company of his friends because they used to sneak through a hole in a fence to play past a sign that read, "Trespassers will be Prosecuted".
"I thought it meant electrocuted," explains Professor Adams, as he now is. "I suppose because the electric chair figured much more prominently in our consciousness then."
But couldn't he see that his friends weren't getting electrocuted? "I thought it meant if you were caught," he says. "It was, I suppose, a misreading of a virtual risk."
The phrase "virtual risk" may be one that's unfamiliar, but you are living with its consequences now. It's how Adams, of University College London's geography department, explains the way people should think about the dangers of daily life.
A virtual risk, he explains, is one whose probability you simply cannot evaluate. What's the chance that you'll get an anthrax-contaminated letter in the post? You can't estimate it, because you don't know who might be sending such letters or how widespread their attack is. Your perception of the risk is entirely your own.
It's the same for many acts of terrorism now: there's very little hard data, but no shortage of worry - witness recent reports that even Hollywood tough guys Bruce Willis, Sylvester Stallone and Arnold Schwarzenegger are refusing to fly.
By contrast, a non-virtual risk is, for example, the chance of a meteor hitting the Earth. Calculating that is a breeze: there are good estimates of how many bits of rock there are floating around in space, what percentage are what size, and what proportion of those might cross the Earth's orbit as we're passing. It's what Adams calls "a risk perceived through science" - that is, provable by the scientific method. It's different from a "directly perceived" risk, such as trying to walk across a busy motorway: you can see for yourself that it's dangerous to try.
So how has the guru of risk reacted to the news that there are terrorists about? Is he doing anything differently? "Not at all," he says brightly.
Others are, though. The awareness of the possibility of terrorism - you might call it the New Risk - has changed people's habits in subtle ways. Airlines noticed immediately after the World Trade Center attacks that people gravitated towards the cheaper flights. Some sportspeople have decided that the risks of long-haul travel now outweigh the risk of damaging their careers by not being available for international sport.
More generally, travel plans are being held back longer and then decisions made abruptly, based perhaps on "feel" about the relative level of danger; prognostication mixed with carpe diem. And the FBI - which receives up to 160 warnings of impending terrorist attacks daily - has developed a complex, but secret, formula for deciding which threats to respond to and which to ignore.
In the US, the perceived dangers are greater - which has led people to discount the longer-term danger that overuse of antibiotics will, not just might, lead to antibiotic-resistant infections in the future, in favour of the short-term protection against anthrax offered by Cipro. But is there a rational response to the problem of quantifying risks in everyday life?
Take the example of travelling by plane. If you are scared that your commercial transatlantic flight will be hijacked, you can reduce that risk to zero by chartering one instead. Even better, become a pilot and buy a plane.
Of course, it costs more. Of course, your chartered plane might crash or your piloting skills might not be up to the task, but the risk is quantifiable, based on a wealth of flight statistics. The risk of terrorism, subject to the vagaries of human nature, is not.
What is beyond doubt is that we live in a world full of dangers, totally unexpected, hidden like reefs beneath our everyday lives. Would you have guessed, for example, that last year, 402 people in Britain managed to injure themselves with rubber bands? Or that 238 were maimed by paper clips so badly that the accident had to be reported? According to the British Department of Trade and Industry, which collects figures on workplace injuries, 2707 people were hurt by office files, 1317 by staples and staplers and 713 by drawing pins.
Statistics of this sort - allied to those for home injuries, which show that apparently harmless items such as socks, pantyhose, trousers, bean bags, place mats, bread bins, dustpans and vegetables can and do cause hundreds of incidents requiring the attention of an A&E department - offer the beginnings of a statistical basis from which to calculate the relative size of different risks. They also serve, however, as a reminder of the relentless rise in litigation that has led to a culture in which we are warned to beware the most obvious pitfall.
The trend towards excessively cautious labelling has been tracked over the past couple of years by New Scientist, whose "Feedback" column has been overwhelmed by readers' offerings. You may have heard the sort of thing: "Do not attempt to stop chain with hands", on a chainsaw; "Do not iron clothes on body", on an iron; "Use like regular soap", on a bar of soap.
It's hard not to see such examples as evidence of an obsession with eliminating every conceivable danger - an obsession that has impaired our ability to distinguish significant risks from insignificant ones.
"Look at this," says Adams, and turns to his computer. He calls up one of UCL's own web pages, which has advice for geography students who will have to venture out beyond the safety of the university's walls to the unregulated world, where nature is in charge and cannot be sued. The students are told that they must evaluate the risks of any field trip they undertake; this entails reading a 69-page manual with subdivisions of all sorts, such as the terrain they might find. There, in two columns, it offers "Risks" and "Associated Control Measures".
On agricultural land, it notes, there is a "risk of personal injury caused by boundary fences - electric fences, barbed wire, hedges, etc". The advice? "If working close to fences, etc, avoid working with your back to the fence, in case you back into it."
"I said to the safety people that this was totally over the top, and that they only needed to look at the significant risks." Remarkably, he was overruled. "There's a terrific fear of litigation," he says.
Adams feels that the extreme caution being visited on schools following several high-profile accidents on school trips means that "these apparent risks are driving out of existence all sorts of worthwhile activities."
"It has got nothing to do with staff safety, and everything to do with arse-covering."
Yet a minute concentration on the tiniest dangers proved conspicuously unsuccessful in preventing the catastrophes of September 11. Indeed, it may even have contributed to them. Adams calls it "the Titanic effect".
Some time before the World Trade Center attacks, an aeroplane manufacturer came to talk to him about the relationship between investment in safety and the actual level of risk protection that results. It turns out that if you plot this on a graph (with investment up one side and actual protection along the bottom), you get a curve rather like a valley rising to a plateau: steep at first but then levelling off. After a certain point, you don't see much more benefit from having more people looking at bags at the check-in or staring at X-ray machine screens. That's what you would expect.
But here's the surprise: as you spend even more on safety measures, the curve actually falls. Things become less safe. You build an unsinkable liner, and your overconfidence leads to disaster. "Too much safety induces complacency," he explains.
"There's so much safety built in to modern aeroplanes that it's difficult keeping the pilots awake on long trips because the autopilot can do everything."
At airports, the fact that the security chain from check-in to seat is so well-tested means that it's easy to overlook the iceberg that, in hindsight, was always there. A sufficiently determined group of hijackers could have taken control of an airliner at any time in the past 20 years using box-cutters. The fact that none had didn't make it any less possible.
Now, Adams thinks, the society that produced 69-page manuals advising people not to back into fences, which told people that things you heat are hot, has struck its own risk iceberg. The dislocation that you see around you, and hear on the news, is the result of people adjusting to a different level of risk. Insurance premiums, meanwhile, have risen by between 35 and 44 per cent, year on year, in America. Somewhere between the extremes of insouciance and paranoia lies the correct response.
The essential lesson is that we need to re-evaluate the assumptions on which our old perceptions of risk were based. "The Titanic was a microcosm of the society that built it," says Adams. "They trusted the helmsmen and the builders. And they couldn't believe it when it failed."
When it did, world views had to be adjusted. By law, ships henceforth had to be built with enough lifeboats for all passengers. The risk of disaster had to be accepted.
Something similar may be happening today. Perhaps we are already sailing away from the wreckage of the assured, risk-averse society in which Western countries existed just a couple of months ago.
If people have drawn any reliable lessons about risk from the past couple of months, it is that it's safer to be in a bungalow than a skyscraper; it's safer to work from home than in a big political centre; that a decentralised terrorist organisation such as al Qaeda can be more effective than a bureaucratic system like the CIA.
The future may lie in us acting more as individuals and making our own estimates of risks. On whichever side of the fence they lie.
INDEPENDENT