Podcast: 6 Training Myths that Sabotage Operator Performance
Key Highlights
1. Focus on component skills, not total tasks. Training on isolated critical skills with targeted feedback produces better results than running full simulations repeatedly.
2. Train for "good enough," not perfection. In high-workload situations, operators need to achieve acceptable, stable performance quickly rather than perfect optimization.
3. Conceptual knowledge doesn't equal operational proficiency. Understanding how a system works theoretically doesn't mean operators can perform the task without hands-on practice and feedback.
In this episode, Traci Purdum and Dave Strohbar explore why traditional training approaches fail operators in chemical processing plants. They examine misconceptions about practice, simulator fidelity, motivation, accuracy versus acceptable performance, early assessment reliability, and the gap between theory and practical skills.
Transcript
Edited for clarity
Welcome to the operator training edition of Chemical Processing's Distilled podcast. This podcast and its transcript can be found at ChemicalProcessing.com. You can also download this podcast on your favorite player. I'm Traci Purdum, editor-in-chief of CP, and joining me is Dave Strohbar, founder and principal human factors engineer for Beville Engineering. Dave is also the founder of the Center for Operator Performance. Hey Dave, it's been a few months since we last spoke. What have you been working on in that time?
Dave: A lot of interesting things, Traci. There's a lot of interest in the industry regarding improving operator skills — getting skills better, faster. The emphasis never seems to go away.
Traci: Better and faster. It reminds me of the Six Million Dollar Man.
Dave: Yes, exactly.
Traci: Speaking of improving operations, you're my go-to subject matter expert on the human factors side of this, and you recently suggested we cover how training programs are based on implicitly assumed fallacies. You sent me a paper, "Training High-Performance Skills: Fallacies and Guidelines," which was published 40 years ago, but I guess the concepts and intel still stand today. Can you tell me a little bit about that paper and what some of these common fallacies are? Then we'll dig a little bit deeper into each one.
Dave: Sure, Traci. This paper is one of those classics. Like you said, it's 40 years old, and yet the lessons from it are as valid today as they were then. Unfortunately, they're still present today — they haven't gone away. In many ways, I think of these as myths that people have. One of my favorite quotes is from John F. Kennedy, and he said the greatest danger to the truth is not the lie, but the myth. People will hold onto these myths even though they're not correct. In some cases, they're counterproductive. Walt Schneider from the University of Pittsburgh did this paper, and he identified six fallacies — or what I would say six myths — that inhibit getting the best performance improvement from our training efforts.
Fallacy No. 1: Practice Makes Perfect
Traci: Let's talk about the first fallacy: Practice makes perfect.
Dave: Yeah, that's something everybody thinks about. There is a definite lack of practice in training programs across the industry. But practice makes perfect only works for simple tasks. When you get into complex tasks like controlling a process unit, you can practice a lot and may see little or no improvement, particularly if you're not practicing the right thing. Simply doing the same task over and over again may not show any improvement at all because with complex tasks, there's often something you need to be doing correctly in order to get that gain in performance. The simple act of practice isn't going to help you. You need to focus on what it is about this task that the person needs to be doing, not just say, "Hey, let's go over this time and time again." It's like a sports team running a play without a coach — they just keep running the play wrong because nobody's telling them, "No, you need to do it this way in order to get the performance improvement."
Fallacy No. 2: Training of the Total Skill
Traci: That's a great visual there. The second fallacy that we're going to talk about is training of the total skill.
Dave: Yes, this one — the author says that this fallacy seduces one to maximize fidelity even when it yields little training benefit. This is a myth that many companies have bought into, and it's reinforced by simulator vendors who say, "What you need is a high-fidelity, full replica simulator, and you just put them in there and have them control the process, and you will get a gain out of that." But what you find is that training the total skill often doesn't get you much benefit. What you want to do is identify the critical component skills involved, focus on those and provide feedback on those. If I'm training an operator to do a startup and I put them in the simulator and just have them go through the whole startup — let's say it takes an hour to work through that total skill — but it turns out that the critical component is introducing feed into the reactor, and the rest of it just goes along with it, they would be better off spending four 15-minute sessions on putting feed into the reactor and getting feedback on that particular component skill rather than training the total skill. This notion that all I have to do is buy the simulator and put them in it — you can get some benefit, but you're not going to get as much as you would if you isolated that skill and focused the training and the feedback on that component skill.
Traci: That's interesting, and obviously the important skill is what they need to learn. But there's so much extraneous stuff that maybe the important part of that skill is getting lost in that full hour rather than just being concentrated in that 15 minutes.
Dave: Oh, absolutely. What you find is that the other things will obscure that critical skill. The individual may be successful at the end of the hour, but they didn't gain as much as they could have gained. It's hard to see that this is the critical component and the rest of this is almost background noise.
Fallacy No. 3: Skill Learning is Intrinsically Enjoyable
Traci: That's a good way of putting it. The next fallacy kind of makes me laugh: Skill learning is intrinsically enjoyable.
Dave: Yeah. This sort of goes along with the previous one in terms of what many plants do. There was this notion that all you had to do was take that high-fidelity simulator and just plop it down in the control room, and operators would go over on their free time and practice and do all sorts of things. What they found is, of course, they didn't do that. There was nothing particularly enjoyable about going in and running some simulations. Maybe some go-getter people would do that. But the idea was, "We'll put it in the control room and they will train." Of course, they didn't train because there wasn't really any motivation. Yes, it might improve their job performance, and they want to do a good job, but going over and practicing upsets — that's not a real fun time for the operator. You've got to build in some motivation, like, "Let's see which crew can do the best" or something like that. But the "build it and they will come" idea — that's a total myth.
Traci: It reminds me of having a video game, which would be enjoyable to play. And the competition you mentioned, "Let's see which team can do better," sounds like an interesting approach. Is gamification a no-no?
Dave: I could see some people not wanting to do the gamification, but it's interesting. I have been to places where there will be an exercise bike in the control room, and some of the places will have competitions between the crews as to who can put the most miles on the bike. It can be something very simple that provides that little extra, "Why am I doing this?" And I would hope that management wouldn't see it as, "Oh, they're playing a video game," but rather, "Hey, they're getting better at what I want them to get better at."
Fallacy No. 4: Train for Accurate Performance
Traci: The next fallacy is train for accurate performance.
Dave: Correct. There's a thought that we want these operators to be as good as possible, and so they train for this sort of maximum performance. What they need to be doing is training for acceptable performance. There's a term in decision-making and in crisis management called "satisficing." That is, you want the behavior that gets you to a workable solution the fastest. It may not be the ultimate, optimal solution, but it works and you can do it. If I'm training operators to balance reflux and reboil on a tower and get that just perfect, that may work if all I have to do is deal with one tower. But if I have a major upset and all my towers are bouncing around, you don't care anymore about the reflux and the reboil being properly balanced. You just want to get those towers back into a reasonable, stable state. Don't worry about perfection on some of these tasks — worry about getting good enough. That's going to come in really handy in these high-workload situations where I don't have time to spend 10-15 minutes tweaking these two parameters to get them into that perfect ratio. I want to stay on spec. If I'm using a little bit too much heat or not enough heat, or my energy balance isn't quite right, that can be corrected later. But get it into that stable, acceptable state and then worry about fine-tuning it.
Traci: I think that's a good life lesson — "good enough." I don't know if it's because I'm getting older and I realize that you don't have to reach perfection to be acceptable. So I think that is a good life lesson for everything.
Dave: Very much so.
Fallacy No. 5: Initial Performance is a Good Predictor of Success
Traci: The next fallacy is: Initial performance is a good predictor of trainee and training program success.
Dave: Yes. Some people will put somebody into training, and wow, they're doing really well at the beginning, and you're thinking this is going great. But what research has found is that initial performance is very unstable. Don't let them out because, "Oh, you scored very well at the beginning, so you're done," or "Wow, this is a great training program. We don't even need to take it through to its conclusion. We can just do this little first part of it." What research has shown is that a lot of times that initial performance drops off very dramatically, and you really need the whole exposure to the training to get that maximum performance. People get misled by some initial performance, thinking, "We don't even need this training. Look how well they're doing." Yet if you continued on, you'd find out that there are some issues that the trainee needs to be able to deal with.
Traci: Does this contradict the previous fallacy?
Dave: This is more of an addition to it. Initially, they may be very accurate rather than heading for that acceptable performance. Here, what you're saying is you need to really work out the full range of skills so that you find out: Can they give you that acceptable performance throughout the activity? Compare it to an athlete. You may find that, "Oh wow, this guy runs a really fast 40 time," and you think, "We've got it. This is great." There was one football player who was extremely fast, and the word on him was, "Yeah, but he can only run in a straight line." He couldn't do the routes that they were requiring. If you look at just that initial performance, you may be fooled into thinking, "Oh, we've really got something on our hands." But over time, you find out there are aspects of it that we need to address or deal with. Not that they become perfect, but that they at least become acceptable.
Fallacy No. 6: Proficiency Will Develop
Traci: I always love your sports analogies, and I knew you were going to be able to combat my question there — it just popped into my head. So thank you for that. We're moving on to the sixth and final fallacy: Once the learner has a conceptual understanding of the system, proficiency will develop in the operational setting.
Dave: Yes. This is a problem in the industry, and it's actually almost the opposite of the first fallacy, "practice makes perfect." Here, they're saying you don't need to practice at all — that once you understand what's occurring, then, "Hey, I can perform the task." In the industry, there is a definite problem with inadequate practice of the task and inadequate feedback. There was an episode of "The Big Bang Theory" where they were in a car that broke down, and somebody raised the question: "Does anybody here understand how an internal combustion engine works?" Of course, all these Ph.D.s said, "Oh yes, yes, yes, yes." And then the next question was, "Does anybody here know how to repair an internal combustion engine?" And of course, none of them — "No, no, no, no, no." So this notion that, "OK, well, now that you understand conceptually what is occurring, you can just do that" — you can conceptually understand the function of reflux and reboil in a distillation tower, but that doesn't enable you to do the task of controlling or balancing that tower. OK, I know one's cooling and one's heating, but what is that interaction, and how does it play out? Because it may not be linear, either. You may have to learn that little trick that, "OK, at this point, adding more heat does me no good, or increasing cooling does me no good." So you need that practice even though you know conceptually what's going on. You need to have a way of taking that conceptual understanding and actually turning it into a motor response, a trained task.
Traci: Dave, is there anything you want to add with all of these fallacies that maybe can bring it on home for us?
Dave: I think a lot of this gets back to what we talked about before: the systems approach to training. There are ways to combat these fallacies. Keeping an awareness — always — of what your assumptions are and what you're taking for granted, and constantly challenging those assumptions, will produce improvements not just in training, but pretty much in anything. If you say, "Yeah, you're right about these things. These are myths that we're living by. Let's change" — hey, you're going to have to spend a little more time and effort, but your results are going to be so much better than if you just say, "I'm going to believe this myth; ignorance is bliss, and I'm happy that my training program is doing well."
Traci: Well, ignorance may be bliss — we know what happens with that. Dave, I appreciate you being the operator training mythbuster, always bringing us good information and great analogies. Want to stay on top of operator training and performance? Subscribe to this free podcast via your favorite podcast platform to learn best practices and keen insights. You can also visit us at ChemicalProcessing.com for more tools and resources aimed at helping you achieve success. On behalf of Dave, I'm Traci, and this is Chemical Processing's Distilled Podcast: Operator Training Edition. Thanks for listening. Thanks again, Dave.
Dave: Thanks, Traci.
About the Author
Traci Purdum
Editor-in-Chief
Traci Purdum, an award-winning business journalist with extensive experience covering manufacturing and management issues, is a graduate of the Kent State University School of Journalism and Mass Communication, Kent, Ohio, and an alumna of the Wharton Seminar for Business Journalists, Wharton School of Business, University of Pennsylvania, Philadelphia.