Avoid Errors - Part 2

Part 2: The Human Factor and System Errors

By Bill Mostia Jr., Exida


Take a systems approach, analyzing human factors, ergonomics, culture, facility design and management systems

In September's issue, Part 1 discussed how people make errors and some ways to minimize these errors in process plant design, instrumentation and operations. This final installment examines human factors in greater depth, and outlines different ways to use them to reduce the likelihood of errors.

When you read in the newspaper that an airplane crash occurred because of "pilot error," or when you hear that a process plant made off-spec product because of a miscue by an operator, ask yourself: "Were these accidents really caused by human error, or were system errors at fault?" In system errors, human error is only part of a much larger problem. Typically, flawed systems will fail to take "human factors" into account. Yet, few companies today take a systematic approach to minimizing human error (bar chart).

To analyze human errors, one must apply relevant knowledge about human behavior and characteristics, and must also define the human characteristics that come into play when people interact with systems and devices. Both of these concepts are included in "human factors."

Some human factors are based on a person's physical and mental limitations or inherent behavior. Others can be based on psychological or sociological factors. Human-to-human interaction is based on psychology, while sociological factors such as group dynamics can be culturally or ethnically based.

Culture-based human factors can stem from the local culture of a plant or area, for example, or may be ethnic- or society-based.

 

Systemic error-reduction is needed

A recent survey of companies in the U.S. and Japan found that few are taking a systemic approach to reducing human error, and even fewer are using novel techniques such as Poka Yoke, a technique developed in Japan that is widely used by the world's discrete manufacturing industries.
Source: Chao, Survey of 50 U.S. and Japanese Companies, cited in "Traditional Methods to Control Human Risk," NASA, 9/18/2002, p. 19.

 

Each facility has its own way of doing things or responding to change or other stimuli. One cultural problem is the "not invented here" syndrome, in which people take no interest in "best practices" outside of their organization or have no curiosity about how other people do things. Another is the "it can't happen here" syndrome, in which people are extremely resistant to change.

An ethnic- or society-based culture reflects the norms for the ethnic or society group. A cultural human factor, for example, might be the way people read. English is read from left to right while some other languages, such as Arabic, are read from right to left. Another such cultural factor is that some societies prefer group consensus over individual action.

Human factors also can be situational, depending on the way people interact with a particular situation or set of conditions. For example, one plant might arrange process equipment for a particular operation in one way, while another might arrange it differently. Similarly, one plant might assign a large group for a specific operation, and another, a small number of people. These differences would affect the way that each plant reacted to a particular situation.

Not all human factors are bad

Some human factors are good, such as those that may minimize errors or improve performance.

Others, however, can cause errors. Consider:

  • management systems (communication, training, scheduling, culture, style, work load, etc.),
  • procedures (response to upset, operational procedures, plant practices, etc.),
  • physical factors (ergonomics),
  • organization (presentation, order, structure, etc.), and
  • facility design (equipment, controls, environment, etc.) [1].

Also bear in mind how people process information. Ask yourself the following:

  • How much information can a person process at a time?
  • How quickly can a human being process information?
  • What role do short-term and long-term memory play?
  • How do people handle complex situations?
  • What role does individual mindset play?
  • How do human interactions or "group think" affect behavior?

Human factors exist everywhere in the lifecycle of a process plant or instrument system. Anything that can cause difficulties during implementation, operation, and maintenance can lead to human factor-facilitated errors.

If you ignore how people really work and think when you're developing design, operation, and maintenance procedures, practices, and systems, you're only facilitating errors and poor performance. Examples include failing to do upfront engineering and design work properly for a new process control system, having a poor change-management system, supervising operators inadequately, using overly complex instrument operation or work procedures, or placing an instrument in a location where it is difficult to work on.

Facilities or organizations are often guilty of "scapegoat syndrome," blaming one individual's errors for accidents while ignoring underlying human factors. This is taking the easy way out, and won't improve safety or quality.

 

Mechanical failure doesn't tell the whole story

 

Extracted from four separate EPA databases, these data on chemical plant accidents show the respective roles played by mechanical failure and human error as an initiating event. Note that mechanical failure is almost twice as likely as human error to initiate an accident. However, these data were compiled before detailed analyses had been completed; consider that system errors or human-factors deficiencies may have led to the mechanical failures.
Source: Mary Kay O'Connor Institute for Process Safety



Consideration of human factors and their effects should be part of the lifecycle of any process plant or control or instrumentation system. In fact, human factors should be considered in all designs, procedures, and practices; doing so adds value and, in some cases, is a matter of law. OSHA regulation 29 CFR 1910.119, Process Safety Management (PSM), for example, requires that the process hazard analysis (PHA) address human factors.

Errors may also be classified by whether or not they affect safety. Safety errors may result in an accident, a near miss, or an accident waiting to happen. Safety errors caused by humans in safety instrumented systems (SIS) are called "systematic" errors. In an often-cited study of control and safety system failures by the U.K.'s OSHA equivalent, the Health and Safety Executive, 85% of the failures were attributed to incorrect specification, design and implementation, installation and commissioning, or changes after commissioning, while only 15% were associated with operation and maintenance errors [2]. Attributing errors to equipment failure (see bar graph) may hide their real source.

Reducing human error

In dealing with human error, one can take one of the following approaches:
  • Prevention: With this approach, errors can be viewed in two contexts: either they are kept from being made at all, or they are caught before they create consequences.

In the first context, errors are prevented only if they are not made at all. An obviously efficient, if difficult, way to keep errors out of a system, it demands using a "Do it right the first time, every time" approach. This is a front-end process. Highly motivated, competent people are required to implement this type of approach, and to reduce human-factors-facilitated errors.

In the second context, it is assumed that errors will enter the system, but that each one should be caught before it can have a negative effect. This is a back-end process and is less efficient than the first option. Review and supervision processes are the key to reducing this type of error. Unfortunately, these processes are often somewhat informal, follow no organized error-reduction methodology, and seldom consider human factors.

  • Anticipation: A potential error is identified and the opportunity for the error to arise is minimized or eliminated.

Some examples of anticipation in action are: revising an overly complex procedure to a simpler one; creating a procedure to control safety system bypasses to assure that a bypass is not inadvertently left engaged; and placing an interlock to prevent an operator from taking an action unless some condition is satisfied.

"Some errors can be prevented by better training, or increased supervision, but the most effective action we can take is to design our plants and methods so as to reduce the opportunities for error or minimize their effects," notes safety authority Trevor Kletz [3].

  • Tolerance: This involves a situation in which errors are expected, but the system is designed to tolerate them. An example could be an operator prompt to verify any number entered -- the system assumes that the operator may enter a wrong number into a human-machine interface. (A minimal sketch of such checks appears after this list.)
  • Mitigation: When this approach is taken, systems are put in place to mitigate the effects of an error. An example would be building a dike around a process vessel to contain liquid from an overfill.
  • Lifecycle approach: Taking a systems approach to design, examining engineering and administrative controls, training and human factors.
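
To make the anticipation and tolerance ideas concrete, here is a minimal sketch in Python. The tag names, limits, and confirmation prompt are hypothetical and stand in for whatever mechanisms a real HMI or control system provides; the point is simply an entry check that rejects or questions a suspect setpoint, plus a permissive that blocks an action until its conditions are met.

    # Minimal sketch of an error-tolerant setpoint entry and a permissive
    # check. Limits, tag names, and the confirmation prompt are hypothetical.
    LOW_LIMIT, HIGH_LIMIT = 50.0, 150.0   # assumed engineering limits, degC

    def accept_setpoint(value, confirm):
        """Tolerate entry errors: reject out-of-range values and require
        the operator to confirm anything that passes the range check."""
        if not (LOW_LIMIT <= value <= HIGH_LIMIT):
            return False                  # clearly wrong, reject outright
        return confirm("Apply setpoint %.1f degC?" % value)

    def start_permitted(suction_valve_open, tank_level_pct):
        """Anticipate the error: block the action unless the permissive
        conditions are satisfied."""
        return suction_valve_open and tank_level_pct > 10.0

    # Example use, with the console standing in for an HMI dialog
    ok = accept_setpoint(120.0, confirm=lambda msg: input(msg + " [y/n] ") == "y")
    if ok and start_permitted(suction_valve_open=True, tank_level_pct=42.0):
        print("Action allowed")
    else:
        print("Entry rejected or permissive not satisfied")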

The lifecycle approach is particularly useful for reducing human errors in instrumentation systems. With this approach, there is a formal lifecycle for design, installation, operation, and maintenance. This type of approach can use all the methods above to reduce or minimize human error, but formalizes their use. An example of this approach is given in ISA 84.01, "Application of Safety Instrumented Systems for the Process Industries."

Cost and probability of errors

To quantify the probability of human error, we somehow must quantify the propensity of humans to make errors under the conditions of interest. Since we are dealing with the complexity of human actions, this is somewhat difficult. However, methods have been developed, including:
  • Human Error Assessment and Reduction Technique (HEART)
  • Technique for Human Error Rate Prediction (THERP), and
  • Empirical Technique to Estimate Operator Errors (TESEO).

A discussion of these methods can be found in Reference 4.

HEART, developed by J.C. Williams in the early 1980s, quantifies human error in terms of a nominal probability of the error, error-producing-condition multipliers, and a proportioning effect. The first two are provided in tables, while the proportioning effect is determined by the person doing the analysis.
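
As a rough illustration only -- the nominal probability and multipliers below are assumed placeholder values, not figures from the published HEART tables -- the calculation scales the nominal error probability by each error-producing condition, weighted by the analyst's proportioning effect:

    # Illustrative HEART-style calculation. The nominal probability and the
    # error-producing-condition (EPC) multipliers used here are placeholders,
    # not values taken from the published HEART tables.
    def heart_hep(nominal_probability, conditions):
        """Return the human error probability (HEP).

        Each condition is a pair (epc_multiplier, assessed_proportion),
        where the proportion (0 to 1) is the analyst's proportioning effect.
        """
        hep = nominal_probability
        for epc, proportion in conditions:
            hep *= (epc - 1.0) * proportion + 1.0
        return min(hep, 1.0)  # a probability cannot exceed 1

    # Example with assumed values: a nominal HEP of 0.003 and two
    # aggravating conditions give roughly 0.021
    print(heart_hep(0.003, [(11.0, 0.4), (3.0, 0.2)]))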

The role that automatic protection, or error-tolerant systems, should play in reducing human error remains a question. Some reports [5] indicate that introducing automatic protections has actually raised the amount of human error. One conclusion is that, with known automatic protections in place, operators may be prone to more risk-taking, either individually or in their operating philosophy. If true, this merits close evaluation of the human factors involved.

In conclusion, human error occurs all the time. People are yelled at, chastised, criticized, but many companies today have no systematic method for reducing error. Don't assume that normal management or supervisory systems will solve the problem. Indeed, they may create human factors that facilitate error.

William L. (Bill) Mostia Jr., PE, of safety consultants Exida has more than 25 years experience applying safety, instrumentation, and control systems in process facilities.
