Source: Mary Kay O'Connor Institute for Process Safety
Consideration of human factors and their effects should be part of the lifecycle of any process plant, control system, or instrumentation system. In fact, human factors should be considered in all designs, procedures, and practices, both as a value-added practice and, in some cases, as a matter of law. OSHA regulation 29 CFR 1910.119, Process Safety Management (PSM), for example, requires that the process hazards analysis (PHA) address human factors.
Errors may be classified by whether or not they affect safety. Safety-related errors may result in an accident, a near miss, or an accident waiting to happen. Safety errors caused by humans in safety instrumented systems (SIS) are called "systematic" errors. In an often-cited study of control and safety system failures by the Health and Safety Executive, the U.K.'s OSHA equivalent, 85% of failures were attributed to improper specification, changes after commissioning, installation and commissioning, and design and implementation, while only 15% were associated with operation and maintenance errors. Attributing such errors to equipment failure (see bar graph, p. 47) may hide their real source.
Reducing human error
- Prevention: With this approach, errors can be addressed in different contexts: prevented from the outset, caught when they occur, or dealt with only after they create consequences.
In the first context, errors are prevented by not being made at all. This is an obviously efficient, if difficult, way to keep errors out of a system; it demands a "Do it right the first time, every time" approach. This is a front-end process. Highly motivated, competent people are required to implement it and to reduce human-factors-facilitated errors.
In the second context, it is assumed that errors will enter the system, but that each one should be caught before it can have a negative effect. This is a back-end process and is less efficient than the first option. Review and supervision processes are the key to reducing this type of error. Unfortunately, these processes are often somewhat informal, have no organized methodology for reducing errors, and seldom consider human factors.
- Anticipation: A potential error is identified and the opportunity for the error to arise is minimized or eliminated.
Some examples of anticipation in action are: revising an overly complex procedure to a simpler one; creating a procedure to control safety system bypasses to assure that a bypass is not inadvertently left engaged; and placing an interlock to prevent an operator from taking an action unless some condition is satisfied.
"Some errors can be prevented by better training, or increased supervision, but the most effective action we can take is to design our plants and methods so as to reduce the opportunities for error or minimize their effects," notes safety authority Trevor Kletz.
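The interlock example above can be sketched in software. This is a minimal illustrative sketch, not any real control-system API; the valve, pressure threshold, and class names are all hypothetical.

```python
# Hypothetical software interlock: the operator's "open drain valve"
# command is blocked unless the vessel pressure is below a safe limit,
# removing the opportunity for the error rather than punishing it.

class DrainValveInterlock:
    SAFE_PRESSURE_PSIG = 5.0  # assumed threshold, for illustration only

    def __init__(self, read_pressure):
        self._read_pressure = read_pressure  # callable returning psig

    def request_open(self):
        pressure = self._read_pressure()
        if pressure >= self.SAFE_PRESSURE_PSIG:
            # Condition not satisfied: the action simply cannot be taken.
            return False, f"Interlock: pressure {pressure} psig too high"
        return True, "Drain valve open permitted"
```

The design choice here is anticipation: rather than relying on the operator to remember the pressure check, the system refuses the command until the condition is satisfied.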
- Tolerance: This involves a situation in which errors are expected, but the system is designed to tolerate them. An example could be an operator prompt to verify any number entered -- the system assumes that the operator may enter a wrong number into a human-machine interface.
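The verify-on-entry idea can be sketched as follows. This is a simplified illustration of the principle, with hypothetical names; a real human-machine interface would present the re-check as a dialog rather than a callback.

```python
def tolerant_setpoint_entry(value, low, high, confirm):
    """Error-tolerant number entry: the system assumes a typo is always
    possible, so it rejects out-of-range values and asks the operator
    to confirm before accepting. Returns the value, or None if refused."""
    if not (low <= value <= high):
        return None  # outside the plausible range: reject outright
    if not confirm(value):
        return None  # operator declined on the re-check prompt
    return value
```

Both guards tolerate the expected error (a mistyped number) without letting it propagate into the process.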
- Mitigation: When this approach is taken, systems are put in place to mitigate the effects of an error. An example would be building a dike around a process vessel to contain liquid from an overfill.
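A mitigation measure like the dike can be checked with simple arithmetic. The sketch below uses an assumed 110% rule-of-thumb margin for secondary containment; the actual required margin depends on the applicable codes for the facility, so treat the number as illustrative.

```python
def dike_capacity_ok(dike_volume_m3, largest_vessel_m3, margin=1.10):
    """Mitigation check: the dike should hold at least the full contents
    of the largest vessel it protects, plus a margin (1.10 here is an
    assumed rule-of-thumb value, not a code requirement)."""
    return dike_volume_m3 >= margin * largest_vessel_m3
```

Note the mitigation does nothing to prevent the overfill error itself; it only bounds the consequences.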
- Lifecycle approach: Taking a systems approach to design, examining engineering and administrative controls, training, and human factors.
The lifecycle approach is particularly useful for reducing human errors in instrumentation systems. With this approach, there is a formal lifecycle for design, installation, operation, and maintenance. It can use all the methods above to reduce or minimize human error, but formalizes their use. An example of this approach is given in ISA 84.01, "Application of Safety Instrumented Systems for the Process Industries."
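The key feature of a formal lifecycle, enforced phase ordering with sign-off, can be sketched in a few lines. The phase names below loosely follow a design/install/operate/maintain lifecycle and are illustrative; this is not the ISA 84.01 model itself.

```python
# Hypothetical sketch: each lifecycle phase must be signed off, in
# order, before the next one can begin. Skipping a phase (a common
# human error under schedule pressure) is rejected, not just discouraged.

PHASES = ["specification", "design", "installation", "operation", "maintenance"]

class SafetyLifecycle:
    def __init__(self):
        self._done = []  # list of (phase, reviewer) sign-offs

    def sign_off(self, phase, reviewer):
        expected = PHASES[len(self._done)]
        if phase != expected:
            raise ValueError(f"cannot sign off {phase!r}; {expected!r} is next")
        self._done.append((phase, reviewer))
```

Formalizing the sequence turns an informal review practice into a checked process, which is the point of the lifecycle approach.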
Cost and probability of errors
Several formal techniques have been developed to estimate the probability of human error, among them:
- Human Error Assessment and Reduction Technique (HEART)
- Technique for Human Error Rate Prediction (THERP), and
- Empirical Technique to Estimate Operator Errors (TESEO).
A discussion of these methods can be found in Reference 4.
HEART, developed by J.C. Williams in the early 1980s, quantifies human error as a nominal probability of the error, error-producing condition (EPC) multipliers, and a proportioning effect. The first two are provided in tables, while the proportioning effect is determined by the person doing the analysis.
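The HEART calculation itself is straightforward arithmetic: the nominal error probability is scaled by each error-producing condition, weighted by the analyst's assessed proportion of its full effect. The sketch below shows the standard form of that calculation; the example numbers are illustrative, not values from the HEART tables.

```python
def heart_hep(nominal_hep, conditions):
    """Assessed human-error probability in the HEART scheme.

    nominal_hep: generic nominal error probability (from HEART tables).
    conditions:  list of (epc_multiplier, assessed_proportion) pairs,
                 where assessed_proportion (0..1) is the analyst's
                 judgment of how much of the EPC's full effect applies.
    """
    hep = nominal_hep
    for epc, proportion in conditions:
        # Each EPC scales the probability by (EPC - 1) * proportion + 1,
        # so proportion 0 leaves it unchanged and 1 applies the full EPC.
        hep *= (epc - 1.0) * proportion + 1.0
    return min(hep, 1.0)  # a probability cannot exceed 1

# Illustrative: nominal 0.003, one EPC of 10 judged half-applicable
# gives 0.003 * (9 * 0.5 + 1) = 0.0165.
```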
The role that automatic protection, or error-tolerant systems, should play in reducing human error remains a question. Some reports indicate that introducing automatic protections has actually raised the amount of human error. One conclusion is that, with known automatic protections in place, operators may be prone to more risk taking, either individually or in their operating philosophy. If true, this merits close evaluation of the human factors involved.
In conclusion, human error occurs all the time. People are yelled at, chastised, criticized, but many companies today have no systematic method for reducing error. Don't assume that normal management or supervisory systems will solve the problem. Indeed, they may create human factors that facilitate error.
William L. (Bill) Mostia Jr., PE, of safety consultants Exida has more than 25 years' experience applying safety, instrumentation, and control systems in process facilities.