Human Factors Engineering: Designing Systems Around Our Limitations
Chernobyl, Bhopal, Three Mile Island, Deepwater Horizon, Texas City — what do they have in common? In each case, human error or human factors were identified as contributing to the incident. But what are these factors?
The discipline of human factors engineering integrates human characteristics into the design of human-machine systems. It seeks to account for how people actually perceive information, make decisions and interact with technology, rather than how we assume they do.
What makes us human? DNA is the most obvious answer, but humans share common characteristics beyond genetics. We process information in the same way, with definite limitations. We make decisions under stress in a similar way. We store and access information from our experiences and training in the same fashion. Our physical characteristics — reach, height, strength — generally fall within a predictable range.
Because of these shared traits, individuals shouldn’t assume they represent everyone. Judgments like “I can do that” or “It looks good to me” don’t cut it. Ironically, we’re terrible at analyzing our own performance. We struggle to understand how we accomplish certain tasks or make difficult decisions. For example, people tend to estimate time inaccurately, often misjudging how long a task took. We also overestimate the probability of low-frequency events and underestimate the probability of high-frequency events.
This inability to know ourselves is why the field of human factors engineering developed.
Don’t Let Your Eyes Fool You
Before we move on, a quick aside: you probably think you’re seeing this article exactly as it is — unaltered. Not true. Close one eye. The text may shift, and you’ll likely see less. Yet you don’t see a gap or blank spot in the middle of the page, even though your retina has a point with no photoreceptors (where the optic nerve passes to the brain). Your brain quietly fills in the missing information, so what you “see” appears complete. Aha! Gotcha. Illusions like this happen all the time — and they’re exactly why human factors engineering matters.
One of the easiest human factors to address is anthropometry, which refers to human dimensions such as reach, height and body size. These measurements have been cataloged for decades, most notably by the Department of Defense in Military Standard MIL-STD-1472G, Human Engineering. What are the ideal dimensions of a workstation? They’re in the standard. How wide must a hallway be to allow two people to pass? That’s in there, too. How high can a valve be installed so that 90% of the female population can reach it? The standard covers that as well.
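An accommodation target like “reachable by 90% of the female population” boils down to a percentile calculation on a measured body-dimension distribution. The sketch below is a minimal illustration, assuming a normal distribution and using made-up numbers — real design work would take the measured values from MIL-STD-1472G or a comparable anthropometric survey.

```python
from statistics import NormalDist

def max_install_height(mean_reach: float, sd_reach: float,
                       accommodated: float = 0.90) -> float:
    """Highest installation height (same units as the inputs) that the
    given fraction of the population can still reach.

    Assumes overhead reach is normally distributed: to accommodate 90%
    of users, the height must not exceed the 10th-percentile reach.
    """
    reach = NormalDist(mu=mean_reach, sigma=sd_reach)
    return reach.inv_cdf(1.0 - accommodated)

# Illustrative values only (inches), not taken from the standard:
# suppose female overhead functional reach ~ N(78, 3).
print(round(max_install_height(78.0, 3.0), 1))  # prints 74.2
```

The same one-liner answers the inverse question — given an existing valve height, what fraction of the population is excluded — by evaluating the distribution’s CDF at that height instead.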
The harder human factors to address are those related to cognition — making decisions, taking action or choosing not to. We perceive and process information in a particular way, with well-documented limitations. We excel at associating meaning with patterns and colors. Our conscious processing has a bottleneck: it can handle only about seven chunks of information, though a single chunk can contain a huge amount of data. For example, the Dow Jones Industrial Average is a type of chunking for stock market performance. We don’t know the upper limit of what we can store in memory, but accessing that information is easier in context.
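Chunking can be mimicked mechanically: the same twelve digits that overwhelm short-term memory as individual items fit comfortably as three meaningful groups. A small sketch (the function name and grouping size are my own choices, not from the article):

```python
def chunk(digits: str, size: int) -> list[str]:
    """Group a digit string into fixed-size chunks.

    Twelve separate digits is well past the roughly-seven-item limit of
    conscious processing, but three 4-digit chunks (memorable years,
    in this example) fit easily.
    """
    return [digits[i:i + size] for i in range(0, len(digits), size)]

raw = "149217762001"   # 12 separate items to hold in memory
print(chunk(raw, 4))   # prints ['1492', '1776', '2001'] — 3 chunks
```

This is the same principle behind formatting phone numbers, grouping process data on a display, or summarizing dozens of stock prices as a single index.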
Where and when do these cognitive factors come into play? Anytime information needs to be processed, which is essentially anytime an action must be taken. How the information is presented, whether on a screen, in a procedure or in a training guide, is the domain of human factors engineering. Human factors engineers consider what information needs to be processed and what constitutes visual noise. They examine how that information interacts with memory to support decision making, what options exist for taking action, and whether those options are consistent with the information provided.
One of the most common examples of poor human factors engineering is the range or stove in your kitchen. How often have you put a pan on a burner only to come back later to find a different burner red-hot? The question is why we make that “error.” It’s because the burners are arranged in a 2x2 array, while the controls are laid out linearly. Is the mistake yours, or the designer’s for pairing linear controls with a square array? This is an example of what human factors engineers call design-induced error.
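The stove mismatch can be put as a counting problem — a sketch, with illustrative variable names of my own:

```python
import math

# The four burners of a 2x2 stovetop.
burners = ["front-left", "front-right", "back-left", "back-right"]

# A linear row of controls bears no spatial resemblance to the array,
# so any assignment of controls to burners is as plausible as any
# other to a new user — an arbitrary mapping that must be memorized.
candidate_mappings = math.factorial(len(burners))
print(candidate_mappings)  # prints 24

# Controls staggered to mirror the burner layout leave exactly one
# spatially compatible mapping, and nothing to memorize or mislearn.
congruent = {position: position for position in burners}
print(len(congruent))  # prints 4 — one control per burner, no lookup
```

With 24 equally plausible mappings and only one of them correct, the occasional red-hot wrong burner is a predictable outcome of the design, not a lapse by the user.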
Minimizing the potential for “human error” requires an understanding of what it means to be human — and designing systems accordingly. Manual valves are not installed eight feet off the ground. Color is used judiciously. Process data is presented so it can be easily “chunked.” Procedures are simple and clear. Training is focused on tasks that must be performed, not just a compendium of facts.
OSHA’s process safety standard, 29 C.F.R. 1910.119(e)(3)(vi), specifies that process hazard analyses shall address human factors. The regulation doesn’t specify what human factors are or how to address them — hopefully, you now have some idea.
About the Author
David Strobhar
David Strobhar founded Beville Operator Performance Specialists in 1984. The company conducts human factors engineering analyses of plant modernization, operator workload, and alarm/display systems for BP, Phillips, Chevron, Shell and others. Strobhar was one of the founders of the Center for Operator Performance, a collaboration of operating companies, DCS suppliers and academia that researches human factors issues in process control. He is the author of "Human Factors in Process Plant Operations" (Momentum Press) and was the rationalization clause co-editor for ISA SP18.2, "Alarm Management for the Process Industries." Strobhar has a degree in human factors engineering, is a registered professional engineer in the state of Ohio and a fellow in the International Society of Automation.

