It’s all so familiar. A technological disaster, then a commission examining what went wrong. And ultimately, a discovery that while technology marches on, concern for safety lags. Technology isn’t as foolproof as it seemed.
Space shuttles shatter. Bridges buckle. Hotel walkways collapse. Levees fail. An offshore rig explodes, creating the biggest oil spill in US history.
The common thread is often technological hubris: the belief by those in charge that they’re the experts, that they know what they’re doing is safe. Add to that the human weaknesses of avoidance, greed and sloppiness.
Cutting-edge technology often works flawlessly. People are amazed. At first, everyone worries about risk. Then people get lulled into complacency by success and they forget that they are operating on the edge, say experts who study disasters. Corners get cut, problems are ignored. Then boom.
Technological disasters, like the BP oil spill, follow a well-worn “trail of tears”, says Bob Bea, a University of California Berkeley engineering professor who has studied 630 disasters of all types. He is an expert on offshore drilling and is consulting with the presidential commission into the Gulf of Mexico spill.
Bea categorises disasters into four groups, one of which is when an organisation simply ignores warning signs through overconfidence and incompetence. He thinks the oil spill falls into that category and points to US congressional testimony that BP ignored problems with a dead battery, leaky cement job and loose hydraulic fittings.
Co-chairman of the oil spill commission William Reilly says it’s that type of root cause – not the equipment failure alone – that the commission will focus on, including looking at the corporate and regulatory “culture” that led to bad decisions.
Bea says disasters don’t happen because of “an evil empire. It’s hubris, arrogance and indolence.”
And disasters will keep on happening. Technological improvements have gradually led to more daring offshore drilling attempts.
“It kind of creeps up on you,” US Energy Secretary Steven Chu says. Then suddenly you realise that now only robots can do what people used to do because the drilling is so deep, he says.
“We’ve been doing this every day, every year, week in, week out, so next week when we go to 1500m, it will be like last week when we went to 90m,” he says.
Rutgers University professor and author of the book Worst Cases Lee Clarke says: “It’s just the arrogant presumption that you have the thing under control, whatever the thing is. In this case, it’s drilling beyond your depth.”
The Y2K computer bug is noteworthy as a prevention success, Clarke says. Many people scoffed and criticised the government for making such a big deal of something that turned out to be a fizzle. But it fizzled precisely because of all the effort to prevent the disaster. It worked, he says.
Unfortunately, safety costs money so it’s usually not a priority, Clarke says. Most of the time “you can’t get anybody to listen”.
“We’re very reactive about disasters in the US.” People don’t think about them until afterward, he says, and then they say: “You should have seen that coming.”