
Quell Quality Control Quagmires

June 5, 2018
Understand the limits of possible improvements and when enough is enough

The work of the firm’s eight quality engineers was splashed on the walls of the room: charts, graphs, tables, flow charts, diagrams on napkins, and whatever else someone thought would make sense of the process. Unfortunately, there wasn’t much to show for a month of work. The client was getting antsy; it had approved the hours and paid the invoices but didn’t see any meaningful results.

As an independent contractor, I had completed several debottlenecking studies prior to the arrival of the engineers and finished my work a couple of weeks after they started.

I wish I knew how things turned out. However, like most experienced engineers would, I reckon the effort didn’t go well. Applying quality control to batch processes is a Sisyphean (i.e., endless and ineffective) task. It’s nearly impossible for a process that’s in development or in a state of flux like this one; the company was changing process procedures almost daily to tweak quality.

What you must understand about quality control, especially with batch processes, is that you should focus first on your ingredients, not the product. Techniques like statistical process control (SPC) deal with random error, not systematic error, and only for a stable process.

Many engineers make the mistake of charging ahead under the delusion that more data points eventually will “define” the process. The data certainly won’t if they’re from vastly different samples!

Assuming a process is stable, that is, “in control,” you can monitor the average, i.e., X-bar in SPC lingo. However, if there’s a pattern, or more than four points in sequence fall on one side of the average, then a systematic error exists in the process and SPC won’t do you any good at monitoring quality.
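To put that in concrete terms, here is a minimal sketch in Python of an X-bar check with the run rule described above. The batch averages are made-up numbers, not plant data, and the three-sigma limits are computed directly from those averages as a simplification of the usual range-based SPC constants.

```python
# Minimal sketch of an X-bar check with a run rule (illustrative data only).
from statistics import mean, stdev

# Hypothetical per-batch averages of some quality measurement.
batch_averages = [7.2, 7.4, 7.1, 7.3, 7.5, 7.6, 7.7, 7.8, 7.6, 7.9]

x_bar = mean(batch_averages)
sigma = stdev(batch_averages)
# Three-sigma limits computed straight from the averages; real SPC charts
# normally derive limits from subgroup ranges and tabulated constants.
ucl, lcl = x_bar + 3 * sigma, x_bar - 3 * sigma

outside = [x for x in batch_averages if not lcl <= x <= ucl]

# Run rule: more than four consecutive points on one side of X-bar
# points to a systematic shift rather than random error.
longest_run, run, last_side = 0, 0, 0
for x in batch_averages:
    side = 1 if x > x_bar else (-1 if x < x_bar else 0)
    run = run + 1 if side != 0 and side == last_side else 1
    last_side = side
    longest_run = max(longest_run, run)

print(f"X-bar = {x_bar:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")
print(f"Points outside limits: {outside}")
if longest_run > 4:
    print("Run rule tripped: suspect a systematic cause, not random variation.")
```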

A manufacturing engineer once asked me if we could set up quality controls for an acid regeneration plant. I charted several weeks of data and concluded the process was “out of control,” which means measures like averages and variances are useless. The regenerated acid came from many mills, each one run differently.

In the end, all we could do was set up trip points to warn when spikes in contaminants needed attention. Then, the site would have to remove the contaminants itself or pay for their elimination.
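A trip point of that sort is little more than a threshold comparison. Here is a minimal sketch, with hypothetical contaminants and limits rather than anything from that plant, of how such a check might look:

```python
# Illustrative trip-point check on an incoming shipment analysis.
# Contaminant names and limits are assumptions for the example only.
TRIP_POINTS_PPM = {"iron": 500, "chloride": 200}

def check_shipment(analysis_ppm):
    """Return the contaminants in a shipment analysis that exceed their trip points."""
    return {name: value
            for name, value in analysis_ppm.items()
            if value > TRIP_POINTS_PPM.get(name, float("inf"))}

alarms = check_shipment({"iron": 620, "chloride": 150})
for contaminant, value in alarms.items():
    print(f"Trip point exceeded: {contaminant} at {value} ppm "
          f"(limit {TRIP_POINTS_PPM[contaminant]} ppm)")
```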

The quality of the samples is a key factor, of course. Good sampling technique is important whether the process is in control or not. Whole books have been written on sampling, but the gist of them is that you must learn how to sample through some screening experiments.

It’s also worth pointing out that instruments seldom are accurate to better than ±½% of full scale. Some measurements, such as those for pH and conductivity, usually have larger margins of error. These unavoidable ranges add to process variance and operator error. I doubt if many continuous processes, let alone batch ones, could meet Four Sigma quality (i.e., a 0.6% error rate), much less Six Sigma (0.0003%).
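For reference, those defect rates follow the usual Six Sigma convention of a normal distribution whose mean has drifted 1.5 sigma; a quick calculation under that assumption reproduces the 0.6% and 0.0003% figures:

```python
# Illustrative check of the sigma-level percentages quoted above, using the
# common Six Sigma convention of a 1.5-sigma shift in the process mean.
from statistics import NormalDist

def defect_rate(sigma_level, shift=1.5):
    """One-sided defect fraction for a given sigma level after a mean shift."""
    return 1.0 - NormalDist().cdf(sigma_level - shift)

for level in (4, 6):
    print(f"{level} sigma -> about {defect_rate(level):.4%} defects")
# Prints roughly 0.62% for Four Sigma and 0.00034% for Six Sigma.
```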

According to the “Six Sigma Handbook,” most companies operate at Three Sigma (i.e., 2.5% errors) while startups typically incur a 15–20% error rate. I think those values are overly optimistic for firms running batch processes and for startup operations.

Let’s not debate percentages. Instead, let’s turn to a couple of more important questions: Will applying all this brain power to solve a few key problems actually improve a company’s bottom line? Where is the breakeven point between resources expended and better financial results? (This is akin to the analysis that should take place for reliability work.) These questions often are difficult to answer until you go further down the rabbit hole. Mary Walton’s “The Deming Management Method” discusses the topic, but unsatisfactorily. Unfortunately, very few managers have the willpower to pull the plug on winning programs even after they’ve reached their full potential.

It’s almost an inverse version of the gambler’s (or Monte Carlo) fallacy, where a gambler thinks that repeated losses actually improve the chance of eventually winning. In the inverse, a company assumes it can keep picking low-hanging fruit even though the tree is bare.

DIRK WILLARD is a Chemical Processing contributing editor. He recently won recognition for his Field Notes column from the ASBPE. Chemical Processing is proud to have him on board. You can e-mail him at [email protected]
