Avoid Errors

You can reduce human error in chemical processing plant instrumentation and operation by understanding the types and sources of error.

This two-part article examines human error, how it occurs, and how to measure and reduce its occurrence in engineering design, construction, operations and maintenance.

To err is human, and mistakes play a part in any human activity. However, in chemical processing applications, human error, whether of omission or commission, can have disastrous consequences. Part 1 of this article examines types of errors and how they typically occur in instrumentation and equipment design, engineering and maintenance.

There are many ways to classify human error. In active errors, the mistake and its consequences are apparent immediately. In contrast, latent errors require time, conditions or another action before becoming evident.

Errors also can be random or involve human factors stemming from procedures, management practices, equipment design or some other trigger. Most accidents are attributable to human factors.

Battelle Memorial Institute, for example, studied 136 refinery incidents and determined that human error was involved in 47 percent of those accidents, or roughly 64 incidents. Of those, 19 percent were random, but 81 percent involved human factors [1].

Generally, there are reasons why people make errors. Understanding these reasons, categorized below, might help prevent them:

People-oriented errors

Slips, lapses or errors in execution: Slips are unintended actions that occur despite your best intentions. They can be caused internally by short-term inattention or externally by distractions. Examples include reversing numbers or letters, or misspelling words you know how to spell. Slips tend to recur on routine tasks.

Capture error: These errors occur when you go on "auto-pilot," substituting a frequently performed activity for the desired one. For example, you miss a highway exit on your way somewhere after work, taking the route home instead.

Identification error: These occur when something is identified incorrectly. In Battelle's refinery study, 75 percent of all human errors concerning equipment involved mislabeling. The current downsizing trend has increased the potential for this type of error because of increased reliance on equipment, piping and instrument/electrical tagging and identification.

Impossible tasks: Some assigned tasks are unreasonably complex, impractical or even impossible to do. As a result, workers might take short cuts or develop alternative methods that can lead to error. In general, the more complex an action, the more likely an error will be made while carrying it out. Consider a control system designed so that an abnormal condition triggers hundreds or even thousands of alarms, overwhelming operators. Drawings and distributed control system (DCS) screens that are too busy increase the potential for information overload and error.
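Alarm floods of this kind are measurable, so they can be designed against. As a rough illustration, the sketch below flags a flood using a rolling count of alarm activations. The 10-alarms-in-10-minutes threshold is a figure often cited in alarm-management guidance (e.g., EEMUA 191), and the class and variable names here are hypothetical, not from any particular DCS:

    from collections import deque

    # Minimal sketch (hypothetical names): flag an alarm flood so that
    # low-priority alarms can be shelved before they overwhelm the operator.
    FLOOD_WINDOW_S = 600    # 10-minute rolling window
    FLOOD_THRESHOLD = 10    # activations per window often cited as a flood

    class AlarmMonitor:
        def __init__(self):
            self._times = deque()  # timestamps of recent alarm activations

        def record(self, t: float) -> bool:
            """Record an alarm at time t (seconds); return True during a flood."""
            self._times.append(t)
            # Drop activations that have aged out of the rolling window.
            while self._times and t - self._times[0] > FLOOD_WINDOW_S:
                self._times.popleft()
            return len(self._times) > FLOOD_THRESHOLD

During a flagged flood, a rationalized system would present only the highest-priority alarms rather than all of them, keeping the operator's information load manageable.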

Input or misperception errors: With errors of this type, the information needed to make a decision or perform a task might be misunderstood, perhaps because it has been presented in an overly complex or misleading way. Overly complex instrumentation systems are a source of this type of error. In other cases, data might be missing, forcing users to make the wrong assumptions.

Lack of knowledge: This type of error occurs when someone fails to get appropriate information from other people or departments. In operations, it could be a lack of situational awareness during abnormal conditions. Lack of knowledge leads to assumptions that, as the old saying goes, are "the mother of all screw ups."

Mindset: Mindset generally is a function of expectations and habits. All too often, people see what they expect to see, despite evidence to the contrary, particularly under high stress or time constraints.

Equipment should be designed intuitively, with user habit in mind. Examples of error-prone designs include a valve sequence that is not in a normal order (e.g., open valve 1, close valve 3, open valve 2, open valve 6, close valve 4...), or, in countries in which the written word moves from left to right, a sequence that does not progress in that direction (e.g., pumps labeled C, B, A from left to right rather than A,B,C). Other problematic cases involve color coding in one section of the plant that does not match the scheme used in the rest of the plant.
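Where the physical layout or labeling cannot be changed, software can at least refuse out-of-order steps. Below is a minimal sketch, with hypothetical step names, of a sequence permissive that only allows the next action in the written procedure:

    # Minimal sketch (hypothetical step names): permit an action only if it
    # is the next step in the written procedure; block and alert otherwise.
    EXPECTED_SEQUENCE = ["open_valve_1", "open_valve_2", "close_valve_3"]

    class SequencePermissive:
        def __init__(self, steps):
            self._steps = steps
            self._next = 0  # index of the next permitted step

        def attempt(self, action: str) -> bool:
            """Return True and advance if the action is in order; else refuse it."""
            if self._next < len(self._steps) and action == self._steps[self._next]:
                self._next += 1
                return True
            return False

    guard = SequencePermissive(EXPECTED_SEQUENCE)
    assert guard.attempt("open_valve_1")
    assert not guard.attempt("close_valve_3")  # out of order: blocked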

Mindset can result from folklore, habit, faulty experience or rationalization of experience. It also can occur by applying habitual actions in the wrong situations. A number of accidents have occurred because operators refused to believe their instruments, reasoning that "it just couldn't be true."

Over-motivation or under-motivation: Over-motivation can come from being too zealous (e.g., completing a job too quickly just to please a supervisor). Working too fast can lead to shortcuts and risk-taking. High-pressure production environments with incentives, for example, can lead to these problems. Ironically, under-motivation stemming from boredom or low morale also can lead to shortcuts and risk-taking. However, it is more likely to cause latent errors, discovered later by someone else.

Reasoning error: These errors occur when a person has the correct information to make a decision or take an action, but comes to the wrong conclusion. Lack of training and/or experience facilitates this type of error.

Task assignment mismatches: These occur when staffers simply are mismatched to the task at hand.

Situation-oriented errors

Environmental: Some errors can be facilitated by environment. Examples include poor location, insufficient work space, temperatures that are too high or too low, insufficient light, too much noise and too many distractions.

Stress-related: Decision-making under stress can lead to errors. Some sources indicate that when a person is under high stress, the probability of error can be as high as 50 percent. The stress might come from the decision or act itself, from the work environment, from other people or from outside work. High-pressure production environments, for example, can create operator uncertainty and stress when one of the options is shutting down production.
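The compounding effect of stress across a multi-step task can be made concrete with a simple human-reliability calculation in the spirit of THERP-style performance-shaping factors. The per-step probabilities below are illustrative only, not plant data:

    from math import prod

    # Illustrative sketch (made-up probabilities): the chance of at least one
    # error across a task is 1 minus the product of per-step success chances.
    def task_error_probability(step_heps, stress_factor=1.0):
        return 1.0 - prod(1.0 - min(1.0, p * stress_factor) for p in step_heps)

    nominal = [0.001, 0.003, 0.01, 0.003]        # hypothetical per-step HEPs
    print(task_error_probability(nominal))       # ~0.017 under normal conditions
    print(task_error_probability(nominal, 5.0))  # ~0.083 when stress multiplies each step

Even modest per-step error probabilities compound across a procedure, which is why stress reduction and task simplification pay off together.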
