BP’s Gulf disaster was no surprise to those who understood the corporate culture.
Jun 28, 2010, Vol. 15, No. 39 • By ANDREW B. WILSON
To put it even more bluntly, BP was taking a don’t-sweat-the-big-stuff attitude toward safety. Others noticed the same thing. Robert Bea, a professor of engineering at the University of California, Berkeley, and a well-known expert on catastrophes involving complex systems, reached the same conclusion based on his own association with BP in 2002 and 2003. At the company’s request, Bea studied BP’s approach to catastrophic risk management at its U.S. facilities in Texas City, Prudhoe Bay, and Cherry Point, Washington, and made recommendations directly to John Browne, then CEO of BP, and other members of top management.
Hearing of his work and knowing that he had launched an independent study (separate from ongoing government studies and investigations) of the disaster in the Gulf, I sent an email to Bea in early June, showing him Houston’s critique of the prevailing attitude toward safety inside BP and asking if he agreed. He immediately replied:
In subsequent interviews, Bea told me that BP had paid promptly and well for his report, but he saw no sign that the company was prepared to act any differently than before. About two years later, on March 23, 2005, BP suffered a major explosion at its Texas City refinery that left 15 people dead and more than 170 injured. Again, BP admitted breaking rules. This time it did not get off so lightly: It was hit with $137 million in fines, the heaviest workplace safety fines in U.S. history.
Following the Texas City accident, an independent report on safety at the five BP refineries in the United States, known as the Baker Panel Report, came to pretty much the same conclusion that Bea had reached before the accident. As Browne, who retired as CEO in 2007, states in his memoir, published early this year, the Baker Panel found that “we had not done enough to make process safety a core value. We had emphasized that individuals had to be safe when they went about their daily work—‘personal safety.’ . . . But we had not emphasized that processes and equipment had to be safe under all circumstances and operated in a safe way at all times—‘process safety.’ ”
It may seem surprising that BP would fail to emphasize that highly sophisticated equipment involved in the extraction and processing of highly combustible material had to be “operated in a safe way at all times.” But those are its former CEO’s own words.
A+B=D, where D is disaster
Bea, 73, has been investigating disasters for almost 50 years. After obtaining a master’s degree in engineering, he joined Shell Oil and worked in jobs that ranged from roustabout to manager of the company’s offshore technology development group. In 1961, an offshore military radar platform near New York City collapsed into the sea, killing 28 people. Shell asked Bea to look into the causes of this accident and report back with any lessons that might be useful to the company in setting up its first deepwater oil platforms.
Since then, Bea has investigated more than 20 offshore rig disasters, and, as an independent researcher, he has also investigated the Columbia space shuttle disaster for NASA and the collapse of New Orleans levees during Hurricane Katrina for the National Science Foundation. He became a risk assessment specialist for Bechtel after leaving Shell and later moved to the engineering department at Berkeley, where he cofounded the Center for Catastrophic Risk Management in 2005, after Hurricane Katrina.
The A in Bea’s theorem, as he calls it, stands for all of the immense technical challenges that a complex engineered system must overcome in order to be successful over time—everything from earthquakes and hurricanes to “flying cows” (a favorite Bea expression). A includes all the difficulties and surprises that the external environment throws in your way, such as having to deal with the immense cold and pressure deep under the sea.
The B stands for people and all of the faults to which individuals and human organizations are prone: everything from laziness to arrogance, hubris, and greed. B includes the panicky decision a manager makes to cut corners on safety, by ignoring or falsifying a test result, for example, or by waiving one or more of a series of redundant precautions aimed at preventing a catastrophe, in order to stay on schedule and meet short-term financial objectives.