Get To The Root Of Accidents

Systems thinking can provide insights into underlying issues, not just their symptoms

By Nancy Leveson, Massachusetts Institute of Technology, and Sidney Dekker, Griffith University



In the same report, the operators are blamed for not taking prompt enough action when the toxic chemical alarm detected the chemical in the air and finally sounded. The report concluded that "interviews with personnel did not produce a clear reason why the response to the … alarm took 31 minutes. The only explanation was that there was not a sense of urgency since, in their experience, previous … alarms were attributed to minor releases that did not require a unit evacuation." The surprise here is that the first sentence claims there was no clear reason while the very next sentence provides a very good one. Apparently, the investigators did not like that reason and discarded it. In fact, the alarm went off about once a month and, in the past, had never indicated a real emergency. Instead of issuing an immediate evacuation order (which, if done every month, probably would have resulted in at least a reprimand), the operators went to inspect the area to determine if this was yet another false alarm. Such behavior is normal and, if it had not been a real emergency that time, would have been praised by management.

Hindsight bias is difficult to overcome. However, it is possible to avoid it (and therefore learn more from events) with some conscious effort. The first step is to start the investigation of an incident with the assumption that nobody comes to work with the intention of doing a bad job and causing an accident. The person explaining what happened and why it happened needs to assume that the people involved were doing reasonable things (or at least what they thought was reasonable) given the complexities, dilemmas, tradeoffs and uncertainty surrounding the events. Simply highlighting their mistakes provides no useful information for preventing future accidents.

Hindsight bias can be detected easily in accident reports (and avoided) by looking for judgmental statements such as "they should have …," "if they would only have …", "they could have …" or similar. Note all the instances of these phrases in the examples above from the refinery accident report. Such statements do not explain why the people involved did what they did and, therefore, provide no useful information about causation. They only serve to judge people for what, in hindsight, appear to be mistakes but at the time may have been reasonable.
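As a rough illustration of that review step, the short Python sketch below scans report text for judgmental phrases of this kind. It is only a minimal sketch: the phrase list starts from the examples quoted above, extra entries such as "failed to" and the function name are our own illustrative assumptions, and a flagged line is merely a prompt to ask whether the report explains why people acted as they did, not evidence of bias by itself.

# Minimal sketch (Python): flag judgmental wording in accident-report text.
# The phrase list starts from the examples in the article; additional entries
# such as "failed to" are illustrative assumptions, not a standard checklist.

JUDGMENTAL_PHRASES = (
    "should have",
    "could have",
    "would only have",
    "if only",
    "failed to",
)

def flag_hindsight_language(report_text):
    """Return (line_number, phrase, line) for each line containing a flagged phrase."""
    hits = []
    for lineno, line in enumerate(report_text.splitlines(), start=1):
        lowered = line.lower()
        for phrase in JUDGMENTAL_PHRASES:
            if phrase in lowered:
                hits.append((lineno, phrase, line.strip()))
    return hits

if __name__ == "__main__":
    sample = (
        "The operators should have evacuated the unit immediately.\n"
        "The alarm had sounded about once a month without a real release.\n"
    )
    for lineno, phrase, line in flag_hindsight_language(sample):
        print("line %d: flagged '%s': %s" % (lineno, phrase, line))

A human reviewer still has to judge each flagged sentence; the point of the scan is simply to surface the places where a report is judging behavior rather than explaining it.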

Only when we understand why people behaved the way they did will we start on the road to greatly improving process safety.

ESCAPING THE WHACK-A-MOLE TRAP
Systems are becoming more complex, and this complexity is changing the nature of the accidents and losses we are experiencing. Made possible by the introduction of new technology such as computers, this complexity is pushing beyond the limits of what human minds and current engineering tools can handle. We are building systems whose behavior cannot be completely anticipated and guarded against by the designers or easily understood by the operators.

Systems thinking is a way to stretch our intellectual limits and make significant improvement in process safety. If we simply blame operators for accidents and do not examine the role the encompassing system played in why those mistakes occurred, we cannot make significant progress in process safety and will continue playing a never-ending game of whack-a-mole.




NANCY LEVESON is professor of aeronautics and astronautics and professor of engineering systems at the Massachusetts Institute of Technology, Cambridge, Mass. SIDNEY DEKKER is professor of social science and director of the Safety Science Innovation Lab at Griffith University, Brisbane, Australia. E-mail them at leveson@mit.edu and s.dekker@griffith.edu.au.


Comments

  • The "Mental Model" figure reminds me of the famous cartoon "How Projects Really Work" or "What the customer really wanted" - http://www.edugeek.net/attachments/forums/general-chat/15350d1348823969-why-do-projects-fail-6a00d83451f25369e20120a513810c970b-800wi.jpg

    The cartoon and the article show that communication between the very different worlds of management, engineers and operator/maintenance personnel can be almost impossible. This almost always leads to blaming the lowest-ranking individual involved in an accident - a time-honored but fault-laden custom.

    In addition to improving communication, I would put more emphasis on "near-miss" reporting. By finding and eliminating the root causes (almost never a single root cause) of near misses, actual hits - accidents - can be prevented. Been there, done that.


  • Thanks for your comment, Paul. You might want to read the article "Learn More from Near-Misses" >> http://www.chemicalprocessing.com/articles/2014/risk-management-learn-more-from-near-misses/

    And we love the cartoon.
    Regards,
    Traci Purdum
    Senior Digital Editor
