Even with safer vehicles, safer roads and safety campaigns, we are not safe. There has been no technological fix. Car accidents happen, as the pages of the Herald attest every day. The only thing we can predict with complete accuracy is that technologies can and will go wrong. The mining and maritime industries have histories long enough to know this.
Historian Edward Tenner calls the unforeseen negative aspects of technology their "revenge effects". The French theorist Paul Virilio suggests that every technology has them: when we invent a new technology we also invent the possibility of unintended and unfortunate outcomes. Thus the invention of the ship creates the shipwreck, and the invention of the aeroplane the plane crash. In this sense the maritime oil spill was invented as soon as we began drilling for oil at sea and fuelling ships with it.
Just who suffers from these technological accidents is an interesting sociological question. Research shows that accidents follow patterns in which the isolated, the weak and the less wealthy consistently fare worse. Consider car accidents. Ian Roberts, a professor of public health, notes that it is the young rather than the old, the poor rather than the rich, the populations of the global south rather than the north, and the pedestrian rather than the driver who overwhelmingly pay the price: 3000 people are killed every day and 10 times that number are seriously injured. According to the World Health Organisation's Violence and Injury Prevention and Disability programme, 90 per cent of vehicle-related deaths occur in the developing world. The financial costs of these accidents exceed what these nations receive in aid payments. One thing we might want to consider, then, is that technologies - whether individually owned like cars or corporately owned like mines and cargo ships - are capable of creating profound social problems.
An oil spill like that from the Rena is a good example of a social problem. It creates a problem for whoever is deemed to have caused it and for their insurers, but it also poses problems for those who live and make their livelihoods on the affected coastline and those who wish to visit it, not to mention the wildlife above and below the water.
In 1970 sociologist Harvey Molotch studied an accidental oil spill in the Santa Barbara Channel. He noted that the oil industry provided the data that allowed federal agencies to regulate it, and the grants that allowed academics to study it. As last year's Deepwater Horizon oil spill showed, depressingly little has changed. Drilling technologies may have advanced markedly, but clean-up technologies remain rudimentary, and the industry largely regulates itself. Two weeks before what is arguably the United States' worst environmental disaster, President Barack Obama reassured the American public "that oil rigs today generally don't cause spills. They are technologically very advanced". That faith was disastrously misplaced.
There seem to be at least two lessons here: the worst must be planned for, and the powerful will put private gain over social good unless held to account. Industries need the oversight of the state, and for the state to exercise proper stewardship it needs the oversight of the people. Industrial accidents constitute a social issue of the utmost importance: research by disaster scholars like Charles Perrow and figures from the Swiss Reinsurance Company (2011) show they are increasing in scale, frequency and severity. And social problems, by their very nature, affect broad constituencies.
More oil will spill. The safest approach is to treat all technologies as accidents waiting to happen and to prepare accordingly.
Steve Matthewman is a senior lecturer in sociology at the University of Auckland. His latest book is Technology and Social Theory (Palgrave Macmillan UK).