Avoid Errors - Part 2

Part 2: The Human Factor and System Errors

By Bill Mostia Jr., Exida


Take a systems approach, analyzing human factors, ergonomics, culture, facility design and management systems

In September's issue, Part 1 discussed how people make errors and some ways to minimize these errors in process plant design, instrumentation and operations. This final installment examines human factors in greater depth, and outlines different ways to use them to reduce the likelihood of errors.

When you read in the newspaper that an airplane crash occurred because of "pilot error," or when you hear that a process plant made off-spec product because of a miscue by an operator, ask yourself: "Were these accidents really caused by human error, or were system errors at fault?" In system errors, human error is only part of a much larger problem. Typically, flawed systems fail to take "human factors" into account. Yet few companies today take a systematic approach to minimizing human error (see sidebar).

To analyze human errors, one must apply relevant knowledge about human behavior and characteristics, and one must define the human characteristics that come into play when people interact with systems and devices. Both of these concepts are included in "human factors."

Some human factors are based on a person's physical and mental limitations or inherent behavior. Others can be based on psychological or sociological factors. Human-to-human interaction is based on psychology, while sociological factors such as group dynamics can be culturally or ethnically based.

Culture-based human factors can stem from the local culture of a plant or area, for example, or may be ethnic- or society-based.

 

Systemic error-reduction is needed

A recent survey of companies in the U.S. and Japan found that few are taking a systemic approach to reducing human error, and even fewer are using techniques such as Poka Yoke, an error-proofing method developed in Japan that is widely used in the world's discrete manufacturing industries.
Source: Chao, Survey of 50 U.S. and Japanese Companies, cited in "Traditional Methods to Control Human Risk," NASA, 9/18/2002, p. 19.
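
Poka Yoke, or mistake-proofing, aims to make an error either impossible to commit or immediately obvious at the point where the work is done. As a minimal sketch of how the same principle can be applied in control-system software, the hypothetical guard below (in Python) checks an operator's setpoint entry against safe limits and asks for confirmation of unusually large changes before the value reaches the control loop. The tag names, ranges and thresholds are illustrative assumptions, not taken from any real system or from the survey above.

    # Illustrative poka-yoke-style guard for operator setpoint entry.
    # Tag names, limits, and thresholds are hypothetical examples.
    SAFE_LIMITS = {
        "TIC-101.SP": (50.0, 150.0),   # allowable temperature setpoint range, deg C
        "FIC-205.SP": (0.0, 80.0),     # allowable flow setpoint range, m3/h
    }
    MAX_STEP = 0.20  # changes larger than 20% of the current value need confirmation

    def validate_setpoint(tag, current, requested, confirmed=False):
        """Reject out-of-range entries; require confirmation for large jumps."""
        low, high = SAFE_LIMITS[tag]
        if not (low <= requested <= high):
            return False, f"{tag}: {requested} is outside the safe range {low} to {high}"
        if current and abs(requested - current) / abs(current) > MAX_STEP and not confirmed:
            return False, f"{tag}: change exceeds {MAX_STEP:.0%} of the current value; confirm to proceed"
        return True, f"{tag}: setpoint {requested} accepted"

    # A slip such as typing 180 instead of 108 is caught before it reaches the loop.
    print(validate_setpoint("TIC-101.SP", current=100.0, requested=180.0))                  # rejected: out of range
    print(validate_setpoint("TIC-101.SP", current=100.0, requested=130.0))                  # rejected: needs confirmation
    print(validate_setpoint("TIC-101.SP", current=100.0, requested=130.0, confirmed=True))  # accepted

The point is not the code itself but the principle: the system, rather than the operator's vigilance alone, keeps a slip from becoming an incident.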

 

Each facility has its own way of doing things or responding to change or other stimuli. One cultural problem is the "not invented here" syndrome, in which people take no interest in "best practices" outside of their organization or have no curiosity about how other people do things. Another is the "it can't happen here" syndrome, in which people are extremely resistant to change.

An ethnic- or society-based culture reflects the norms of the ethnic group or society. A cultural human factor, for example, might be the way people read: English is read from left to right, while some other languages, such as Arabic, are read from right to left. Another such cultural factor is that some societies prefer group consensus over individual action.

Human factors also can be situational, depending on the way people interact with a particular situation or set of conditions. For example, one plant might arrange process equipment for a particular operation in one way, while another might arrange it differently. Similarly, one plant might assign a large group for a specific operation, and another, a small number of people. These differences would affect the way that each plant reacted to a particular situation.

Not all human factors are bad

Some human factors are good, such as those that may minimize errors or improve performance.

Others, however, can cause errors. Consider:

  • management systems (communication, training, scheduling, culture, style, work load, etc.),
  • procedures (response to upset, operational procedures, plant practices, etc.),
  • physical factors (ergonomics),
  • organization (presentation, order, structure, etc.), and
  • facility design (equipment, controls, environment, etc.) [1].

Also bear in mind how people process information. Ask yourself the following:

  • How much information can a person process at a time?
  • How quickly can a human being process information?
  • What role do short-term and long-term memory play?
  • How do people handle complex situations?
  • What role does individual mindset play?
  • How do human interactions or "group think" affect behavior?

Human factors exist everywhere in the lifecycle of a process plant or instrument system. Anything that can cause difficulties during implementation, operation, and maintenance can lead to human factor-facilitated errors.

If you ignore how people really work and think when you're developing design, operation, and maintenance procedures, practices, and systems, you're only facilitating errors and poor performance. Examples include failing to do upfront engineering and design work properly for a new process control system, having a poor change-management system, supervising operators inadequately, using overly complex instrument operation or work procedures, or placing an instrument in a location where it is difficult to work on.

Facilities or organizations are often guilty of "scapegoat syndrome," blaming one individual's errors for accidents while ignoring underlying human factors. This is taking the easy way out, and won't improve safety or quality.

 

Mechanical failure doesn't tell the whole story

 

Extracted from four separate EPA databases, these data on chemical plant accidents show the respective roles of mechanical failure and human error as initiating events. Note that mechanical failure is almost twice as likely as human error to initiate an accident. However, these data were compiled before detailed analyses were completed, so consider whether system errors or deficiencies in human factors may have led to the mechanical failures.
