
Quell Quality Control Quagmires

June 5, 2018
Understand the limits of possible improvements and when enough is enough

The work of the firm’s eight quality engineers was splashed on the walls of the room: charts, graphs, tables, flow charts, diagrams on napkins, and whatever else someone thought would make sense of the process. Unfortunately, there wasn’t much to show for a month of work. The client was getting antsy; it had approved the hours and paid the invoices but didn’t see any meaningful results.

As an independent contractor, I had completed several debottlenecking studies prior to the arrival of the engineers and finished my work a couple of weeks after they started.

I wish I knew how things turned out. However, like most experienced engineers, I would reckon the efforts didn’t go well. Applying quality control to batch processes is a Sisyphean (i.e., endless and ineffective) task. It’s nearly impossible for a process that’s in development or in a state of flux, like this one; the company was changing process procedures almost daily to tweak quality.

What you must understand about quality control, especially with batch processes, is that you must focus first on your ingredients, not the product. Techniques like statistical process control (SPC) deal with random error in stable processes, not with systematic errors.

Many engineers make the mistake of charging ahead under the delusion that more data points eventually will “define” the process. The data certainly won’t if they’re from vastly different samples!

Assuming a process is stable, that is, “in control,” you can monitor the average, i.e., X-bar in SPC lingo. However, if there’s a pattern, or more than four points in sequence fall on one side of the average, then a systematic error exists in the process and SPC won’t do you any good at monitoring quality.
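To make that run rule concrete, here’s a minimal Python sketch (mine, not from any SPC package) that flags a one-sided run in subgroup averages. The four-point threshold follows the rule of thumb above; published run rules, such as Western Electric’s, use longer runs.

```python
# Minimal sketch: flag a possible systematic error when more than
# `run_limit` consecutive subgroup averages fall on one side of the
# grand average. The 4-point threshold follows the rule of thumb above.
from statistics import mean

def xbar_run_check(subgroup_means, run_limit=4):
    grand_avg = mean(subgroup_means)
    run, side = 0, 0  # current run length; +1 above average, -1 below
    for x in subgroup_means:
        s = 1 if x > grand_avg else (-1 if x < grand_avg else 0)
        run = run + 1 if (s == side and s != 0) else (1 if s != 0 else 0)
        side = s
        if run > run_limit:
            return True  # pattern suggests a systematic shift, not random error
    return False

# A drifting process trips the check; monitoring X-bar alone won't fix it.
print(xbar_run_check([9.8, 9.9, 10.0, 10.4, 10.5, 10.6, 10.7, 10.8]))  # True
```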

A manufacturing engineer once asked me if we could set up quality controls for an acid regeneration plant. I charted several weeks of data and concluded the process was “out of control,” which means measures like averages and variances are useless. The regenerated acid came from many mills, each one run differently.

In the end, all we could do was set up trip points to warn when spikes in contaminants needed attention. Then, the site would have to remove the contaminants itself or pay for their elimination.
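In code, trip points reduce to fixed thresholds rather than statistical limits. A minimal sketch follows; the contaminant names and ppm limits are invented for illustration, not actual plant values.

```python
# Hypothetical trip points for regenerated acid; the contaminant names
# and ppm limits are invented for illustration, not plant values.
TRIP_POINTS_PPM = {"chloride": 50.0, "iron": 120.0, "silica": 30.0}

def check_trip_points(sample_ppm, trip_points=TRIP_POINTS_PPM):
    """Return the contaminants whose measured level exceeds its trip point."""
    return {name: level for name, level in sample_ppm.items()
            if level > trip_points.get(name, float("inf"))}

for name, level in check_trip_points({"chloride": 62.0, "iron": 80.0}).items():
    print(f"ALARM: {name} at {level} ppm needs attention")
```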

The quality of the samples is a key factor, of course. Good sampling technique is important whether the process is in control or not. Whole books have been devoted to sampling, but their gist is that you must learn how to sample through screening experiments.

It’s also worth pointing out that instruments seldom are accurate to better than ½% of full scale. Some measurements, such as those for pH and conductivity, usually have larger margins of error. These unavoidable ranges add to process variance and operator error. I doubt if many continuous processes, let alone batch ones, could meet Four Sigma quality (i.e., a 0.6% error rate), much less Six Sigma (0.0003%).
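Those two figures match the conventional Six Sigma conversion, which assumes the long-term mean drifts 1.5 sigma from target; the 1.5-sigma shift is that convention’s assumption, not the column’s. A quick one-sided check using Python’s standard library:

```python
# Defect rate implied by a sigma level, assuming the conventional
# 1.5-sigma long-term shift used in Six Sigma tables (one-sided).
from statistics import NormalDist

def defect_rate(sigma_level, shift=1.5):
    return 1.0 - NormalDist().cdf(sigma_level - shift)

print(f"4 sigma: {defect_rate(4):.4%}")  # ~0.6210%, the 0.6% cited above
print(f"6 sigma: {defect_rate(6):.5%}")  # ~0.00034%, the 0.0003% cited above
```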

According to the “Six Sigma Handbook,” most companies operate at Three Sigma (i.e., 2.5% errors) while startups typically incur a 15–20% error rate. I think those values are overly optimistic for firms running batch processes and for startup operations.

Let’s not debate percentages. Instead, let’s turn to a couple of more important questions: Will applying all this brain power to solve a few key problems actually improve a company’s bottom line? Where is the breakeven point between resources expended and better financial results? (This is akin to the analysis that should take place for reliability work.) These questions often are difficult to answer until you go further down the rabbit hole. Mary Walton’s “The Deming Management Method” discusses this topic, but unsatisfactorily. Unfortunately, very few managers have the willpower to pull the plug on winning programs once they’ve reached their full potential.

It’s almost an inverse version of the gambler’s (or Monte Carlo) fallacy, where a gambler thinks repeated losses actually improve the chance of eventually winning. In the inverse, a company assumes it can keep picking low-hanging fruit even though the tree is bare.

DIRK WILLARD is a Chemical Processing contributing editor. He recently won recognition for his Field Notes column from the ASBPE. Chemical Processing is proud to have him on board. You can e-mail him at [email protected]
