An operator caught a spike in hydrostatic pressure in a chlorosilanes unit just in time. The operators at that site were trained; they knew the risk. They were what training psychologists call “well calibrated” — that is, exactly as good as they thought they were. On the flip side, if you aren’t as good as you think you are, you’re poorly calibrated. This operator had the right amount of confidence: he had the sense to be afraid. Most of us are poorly calibrated and over-confident.
Examples of well-calibrated people are all around us. Believe it or not, weather forecasters generally get it right. In roughly 150,000 three-day forecasts, their stated probabilities of precipitation matched the observed frequencies almost exactly — because they get immediate feedback. Executives at Warren Buffett’s Berkshire Hathaway posted a 400,863% gain in the value of their investments from 1964–2007, compared to the paltry 6,840% increase for the Standard & Poor’s 500 over the same period. That’s because Buffett’s managers stick around long enough to see their mistakes and learn from them. Executives in general have among the worst track records for calibration.
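To make “well calibrated” concrete: a forecaster is calibrated if the events she calls 70% likely happen about 70% of the time. Here’s a minimal sketch of that check in Python — the function name and the choice to bucket forecasts by tenths are my own illustration, not taken from the forecasting study mentioned above:

```python
from collections import defaultdict

def calibration_table(forecast_probs, outcomes):
    """Group forecasts by stated probability (rounded to tenths) and
    report the observed frequency of the event within each group.
    A well-calibrated forecaster's table has keys ~equal to values."""
    buckets = defaultdict(list)
    for prob, happened in zip(forecast_probs, outcomes):
        buckets[round(prob, 1)].append(happened)
    return {p: sum(hits) / len(hits) for p, hits in sorted(buckets.items())}
```

For example, a forecaster who said “70%” ten times and was right seven of those times would show a 0.7 bucket with an observed frequency of 0.7 — perfectly calibrated for that bucket.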
In our industry, people make life-and-death decisions every day, often without knowing it. Unfortunately, too many suffer from over-confidence.
The Baker panel report on the March 2005 explosion and fire at BP’s Texas City refinery that killed 15 people cited management over-confidence. High turnover among refinery managers kept them from grasping the danger until it was too late. Engineers and contractors working every day in the units knew the hazards; so did outsiders like consultants and the U.S. Occupational Safety and Health Administration. The report also faulted unchallenging operator training simulators, which no doubt contributed to the inadequate response.
You can fight over-confidence in a variety of ways.
Periodically shift personnel around a facility to broaden their skills and use them more effectively. People in a new environment spot things that no longer stand out to those familiar with it; psychology bears this out. Such rotation might seem to conflict with the high-turnover problem that plagued BP’s managers, but it doesn’t: people simply need enough time in each assignment to understand the process.
Another idea is to pair risk-averse people with risk-takers to offset over-confidence.
Tweaking procedures also can help. For instance, change the route during a process walkdown so it doesn’t run on autopilot. When I review a set of piping and instrumentation diagrams (P&IDs), I follow the feed lines out, then back in reverse, checking obvious things — such as that every tank has both an inlet and an outlet.
In addition, recognize the limits of concentration. Psychologists have found that people can’t handle more than three to four details at once. So, whether in the field or the office, don’t try to catch every error in one pass. With P&IDs, focus on no more than three specific errors at a time — say, whether a valve is open or closed, whether it’s the right type, and whether it’s flanged. Go through the drawings looking only for those errors, ignoring the others until it’s their turn.
Psychologists also have found that people are easily distracted (not a great surprise). This can impact many activities. For instance, during a critical item walkdown, a manager may want to inspect a just-repaired pump but may get distracted and fall into the routine of an ordinary walkdown, forgetting the pump until it’s too late — during startup. That’s why personal checklists are so crucial.
Now, back to over-confidence. Simulators are key to calibrating your control-board operators. You want the right balance: poised but not cocky, alert but not bored, and focused on the right things. Experienced operators excel at filtering out background noise and concentrating on the key controls — but that same filtering can make them miss the obvious. When I began a project at Andrew Jergens Co., I could tell right away that a reactor pump was failing; everyone else had learned to tune out the noise. Striking the right level of operator tension depends on how frequent and how difficult the simulator training is. You want to test the abnormal, but you also want operators to see normal operation in many different guises. Imagine how a process can fail, build knowledge-based performance in your team, and encourage experimentation. Consider the 1989 Sioux City DC-10 crash: nobody had imagined a plane could be maneuvered by engine thrust alone after a total hydraulic failure, yet the crew worked it out. The point of simulation is to pose problems difficult enough that even experienced operators can fail.
Another way to address over-confidence is by not treating an initial decision as final. It pays to take a second look. Consider the Monty Hall problem (named after the original host of the game show “Let’s Make a Deal”): after the host reveals a losing door, switching your choice doubles your chance of winning, from one-in-three to two-in-three. Yet high school students are still mistakenly advised to stick with their first answer on the SAT. We must teach operators to bravely second-guess themselves and each other.
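The Monty Hall claim is easy to doubt and easy to check. Here’s a quick simulation sketch in Python (the function names are mine; the setup is the standard three-door game):

```python
import random

def monty_hall_trial(switch, rng):
    """Play one round: prize behind a random door, contestant picks a
    random door, host opens a door that is neither the pick nor the
    prize, contestant optionally switches to the remaining door."""
    doors = [0, 1, 2]
    prize = rng.choice(doors)
    pick = rng.choice(doors)
    opened = rng.choice([d for d in doors if d != pick and d != prize])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

def win_rate(switch, trials=100_000, seed=42):
    rng = random.Random(seed)
    wins = sum(monty_hall_trial(switch, rng) for _ in range(trials))
    return wins / trials
```

Running `win_rate(switch=True)` lands near 2/3, while `win_rate(switch=False)` lands near 1/3 — the second look really does double the odds.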
For additional insight into over-confidence and other mental traps, read “Why We Make Mistakes” by Joseph Hallinan.
DIRK WILLARD is a Chemical Processing Contributing Editor. You can e-mail him at firstname.lastname@example.org