The role of the analytical chemist in today’s chemical industry is pretty much what it has always been — checking the quality of the final and intermediate products, and providing feedback to the plant so its operators can ensure that quality. But, as has happened to so many other professions, that role has become increasingly automated as traditional laboratory-based analytical techniques are moved online and even replaced by the virtual analysis offered by soft sensors and inferential measurements.
In many sectors such as refining and petrochemicals, online process analysis is the norm. Their high-throughput, low-margin operations can't afford the delays inherent in waiting for laboratory results to reveal whether or not a process is producing to specification. But other sectors, particularly pharmaceuticals, are relatively recent converts to the concept of moving the lab out into the plant. It was, after all, only three years ago that the U.S. Food and Drug Administration issued its guidance on Process Analytical Technology (PAT), "A Framework for Innovative Pharmaceutical Development, Manufacturing, and Quality Assurance."
"The aim of PAT is to focus on the process rather than the end product. Processes are actively managed to achieve a high degree of repeatability and efficiency, and quality assurance becomes a continuous and real-time activity" (Figure 1), says Thomas Buijs of ABB Analytical, Québec, Canada.
Putting it all together
Buijs is line manager for development of a systems approach to PAT at ABB Analytical. Such an approach is crucial to PAT success, believes industry analyst Paula Hollywood of the ARC Advisory Group, Dedham, Mass. "Selecting and purchasing an analytical instrument for an in-process application the same way you would for a laboratory-based instrument is a recipe for failure," she says. "For pharmaceutical manufacturers struggling with what to do about the FDA's PAT initiative, early engagement with a strong engineered solutions supplier can bring much more to the process than standalone analytical devices."
In collaboration with one of its leading pharmaceutical industry customers, ABB this year unveiled its IndustrialIT for PAT solution, which Buijs describes as "a product, a solution, and a service." In essence, the product takes in data from online (and offline where necessary) analytical instruments, consolidates the data and provides information to the process control system for feedback control.
"The main PAT problem," Buijs says, "is a lack of interoperability between third-party analyzers." ABB is addressing this problem by working with other vendors to enable their analytical data to be seamlessly linked, via OPC, to the FTSW800 software suite at the heart of the PAT product. This software supervises real-time spectra acquisition and property determination and supports the data-processing algorithms. All the acquired data are stored in a single distributed database and the system can handle "huge" flows of both scalar and vector data coming from the analyzers.
Instrument vendors whose PAT analyzers can already link include Agilent (with its HPLC system), Mettler-Toledo (through its FBRM control interface software), Axsun (near-infrared, or NIR, analyzer), Ametek (mass spectrometer), Bruker (FT-NIR analyzer) and Zeiss (UV-Vis and NIR spectrometers).
Based on ABB's 800xA automation technology, IIT for PAT provides a local interface for data trending and operator interaction, as well as full connectivity to the plant's DCS or PLC control system. The service aspect, says Buijs, stems from the company's ability to look at the whole process, not just individual unit operations, and optimize the overall control strategy.
Siemens, Alpharetta, Ga., is another automation company leveraging its process control knowledge to encourage the uptake of PAT. Working in collaboration with analytical company Applikon Biotechnology, Schiedam, Netherlands, and the state-run Netherlands Vaccine Institute (NVI), Bilthoven, Netherlands, Siemens has developed its Sipat software to help with the implementation of PAT principles. The software brings together all the information flows during processing (in the development case, the cultivation of the Bordetella pertussis bacterium, a critical step in the manufacture of whooping cough vaccine) and enables online comparison of process and historical data. "The outcome is what we believe to be the very first automation system capable of developing and executing full PAT cultivation processes," says NVI project leader Mathieu Streefland.
Although software may be key to the success of a PAT implementation, the FDA initiative has given almost as big a boost to analytical hardware developments in recent years — and not just for pharmaceutical applications. For instance, Foss NIRSystems, Laurel, Md., says that, as a PAT tool, its XDS NIR analyzer is "the next generation of dedicated NIR technology for analyzing solid and liquid chemical and pharmaceutical formulations." In the form of the Foss Process Analytics MicroBundle single-point system, the technology also is said to offer an economical way of performing online remote analyses in hazardous environments.
The IntegraSpec XL NIR spectroscopy platform from Axsun Technologies, Billerica, Mass., is equally at home in a rugged in-line process environment or, in its XLP format, in tightly regulated PAT applications. Both versions are board-level platforms — complete with tunable-laser-based light source, integrated wavelength and amplitude references, and detector — that can be packaged and supplied through third-party analytical instrument manufacturers, systems integrators and process control companies. The XLP also includes comprehensive documentation to support Good Manufacturing Practice (GMP) regulatory compliance for pharmaceutical end-users.
The burgeoning biofuels sector (see Biofeedstocks see real growth) also stands to gain immediate benefits from being able to monitor product quality online. Manufacturers and buyers alike require the quality of the biofuels to meet recommended American Society for Testing and Materials (ASTM) specifications. With ASTM-certified analysis costing up to $1,200 per sample, plants want to know their output satisfies specifications before sending product for certification. To meet this new demand, Aspectrics, Pleasanton, Calif., has extended the capabilities of its MultiComponent 2750 EP-NIR analyzer to provide pass/fail information on multiple biofuel contaminants before samples are sent out for ASTM certification.
Based on Aspectrics' patented encoded photometric NIR spectroscopy technology, which mimics Fourier transform infrared (FTIR) spectroscopy but in a more robust online form, the 2750 online biofuels analyzer can monitor methanol, water and total glycerin in biodiesel, water content in bioethanol, biodiesel blends and ethanol/gasoline blends. A single analyzer can process multiple samples at 100 scans/second, generating real-time results in a few seconds.
Besides the obvious necessity for rugged designs, taking analytical procedures out of the lab and into the plant often requires getting the equipment certified for use in the relevant process environment. A recent example was the September launch of the X-STREAM flameproof process gas analyzer by the Solon, Ohio-based Rosemount Analytical unit of Emerson Process Management (Figure 2).
Certified for use in Class I, Zone 1, Group IIB + H2 hazardous areas, this unit is the latest addition to the X-STREAM series introduced in 2006, which offers single- and dual-channel analysis based on NDIR/UV/VIS (non-dispersive infrared, ultraviolet and visible) photometry, paramagnetic and electrochemical oxygen, and thermal conductivity sensor technologies.
Not a pipe dream
There's progress not only in analysis of streams but also in online monitoring of the condition of the piping through which they flow. In September, Emerson announced that it has teamed up with Rohrback Cosasco Systems, Santa Fe Springs, Calif., to introduce the MCS Microcor wireless transmitter for high-speed communication of corrosion rate data from the plant to its automation system. Earlier, in June, Honeywell Process Solutions, Phoenix, Ariz., launched its OneWireless mesh network solution, a range of products that includes a wireless corrosion monitor (Figure 3).
"This has turned our previous loop-powered digital HART corrosion transmitter [the SmartCET instrument] (see Innovative corrosion monitoring solutions enhance process optimization) into a battery-powered unit that now has a measurement cycle time of 30 seconds at its fastest speed," notes Dawn Eden, Houston-based marketing manager for corrosion.
Such response speed puts corrosion monitoring into the same league as the traditional process variables of temperature, pressure and flow, but raises the question: is a near-real-time response really necessary for such a relatively long-term phenomenon as corrosion?
The answer is an unequivocal "yes," says Eden, citing a couple of practical examples: "acid runaway situations where getting the concentration wrong can cause the diluted acid to tear through quite substantial thicknesses of piping in a matter of hours.... Similarly, if you're dealing with a process where product quality has to be absolutely right and you can't afford any contamination [from corrosion], then you want to have a very early warning of when something is going wrong."
Beyond the black box
However, many other types of online analysis remain either too expensive or simply impractical to implement. To handle such chores, the arrival around 15 years ago of neural-network-based soft sensors and inferential measurements seemed too good to be true. And, some might argue, so it has turned out. Others, though, see a widening role for inferential sensors as people gain more confidence in their capabilities.
"The key issue," says Arthur Kordon, data mining and modeling leader with Dow Chemical in Freeport, Texas, "is that data-driven solutions don't have the in-built credibility that you have with hardware solutions or models based on first principles. Another issue has proved to be the growing maintenance costs of inferential models based on a limited set of data, which means that you almost always have to extrapolate at some point — and unfortunately a neural network is not a technology that shines in extrapolation."
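Kordon's extrapolation caveat applies to any purely data-driven fit, and can be seen with even the simplest one. Below is a minimal Python sketch with hypothetical data, in which an ordinary least-squares line stands in for a trained empirical model such as a neural network:

```python
# Illustration of the extrapolation caveat: a data-driven model fitted
# on a narrow operating window can be accurate there yet badly wrong
# outside it. All numbers are hypothetical.

def true_property(x):
    """Hypothetical 'true' process relationship (quadratic)."""
    return x * x

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

xs = [0.0, 0.25, 0.5, 0.75, 1.0]        # narrow training window
ys = [true_property(x) for x in xs]
a, b = fit_line(xs, ys)

inside = abs((a * 0.6 + b) - true_property(0.6))   # interpolation error
outside = abs((a * 3.0 + b) - true_property(3.0))  # extrapolation error

print(f"error inside window:  {inside:.3f}")   # small
print(f"error outside window: {outside:.3f}")  # large
```

Inside the training window the fit looks excellent; a few units outside it, the error is dozens of times larger, which is exactly why inferential models built on limited data sets incur growing maintenance costs.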
Those early "black box" types didn't just infer measurements; too often their sales pitches also implied that all a plant had to do was supply sufficient empirical and historical process data for a model to be built that could predict product properties and provide input for closed-loop control of the process.
Mike Brown, vice president of solutions for Matrikon, Edmonton, Alberta, sums up those early days: "The problem was that people were pushing the technology so hard — exploiting the power of the CPU and all that availability of data — but doing it in a way that was ignoring basic engineering fundamentals. The whole concept of data mining was very powerful but the promise of the technology wasn't really delivering what clients were expecting. If you changed your feedstock, often your predictions were off but it took a while to realize this and retune the model. It was low cost to buy the technology — compared with buying the analytical hardware — but the support cost didn't go down significantly."
One of the early pioneers of soft sensors was Pavilion Technologies, Austin, Texas. Over the last two to three years the company has put a lot of effort into developing a hybrid approach to inferential sensors, says senior product manager Ric Snyder. "There was no perfect first principles model, and no perfect empirical model. Both have their pluses and minuses, but we have a lot of runtime now on the hybrid model and use it built into our existing MPC [model predictive control] solution. All of the chemical plants where we have our control software tend to have at least one or two inferential sensors running. It may be just for a purity measurement on a distillation column, estimating impurities online, or something like melt index in a polymer reactor or extruder — typically for a quality measure that you're actually selling against."
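The hybrid idea can be sketched simply: let a first-principles model carry the physics, and fit only its residual against historical lab data. A minimal Python illustration with hypothetical numbers follows; real hybrid models are far richer, and this is not Pavilion's implementation:

```python
# Hybrid inferential sensor sketch: physics-based prediction plus an
# empirical correction learned from lab history. All data hypothetical.

def first_principles(temp_c):
    """Simplified physical model of a quality property vs. temperature.
    (Hypothetical; real models come from mass/energy balances.)"""
    return 0.9 * temp_c + 5.0

# Historical (temperature, lab-measured property) pairs - hypothetical.
history = [(80.0, 78.0), (90.0, 87.5), (100.0, 96.0), (110.0, 105.5)]

# Empirical part: average residual between lab values and the model.
residuals = [lab - first_principles(t) for t, lab in history]
bias = sum(residuals) / len(residuals)

def hybrid(temp_c):
    """Hybrid prediction = physics + data-driven correction."""
    return first_principles(temp_c) + bias

print(f"predicted property at 95 C: {hybrid(95.0):.2f}")
```

The appeal of the split is that the physical part constrains behavior outside the training data, while the empirical part mops up the systematic error the physics misses.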
The hybrid model also has opened up other markets for Pavilion, most notably in the pharmaceuticals sector. For instance, by providing online measurements from fermentation processes, notes Snyder, Pavilion's model helps operators optimize their feed into the fermenter (Figure 4). "We've been able to reduce batch times by anywhere from 10% to 20%," he says.
Figure 4. Use of both empirical and first-principles models has speeded up fermentation batches. Source: Pavilion Technologies
While Pavilion's solutions are still essentially neural-network-based, Matrikon's ProcessMonitor product uses a partial least squares (PLS) algorithm to process its data. "The beauty of doing it this way," contends Brown, "is that the PLS structure we use is very good at providing a tremendous amount of insight into how the model you're developing ties back into the fundamental variables of the process that you understand."
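The interpretability Brown describes can be shown in miniature: in PLS, the weight vector that builds each latent factor indicates how strongly each process variable contributes to the quality prediction. Below is a one-factor PLS step in plain Python with hypothetical mean-centered data; it is a sketch of the first NIPALS iteration, not Matrikon's implementation:

```python
# Minimal one-component PLS in pure Python. The weight vector w reveals
# which process variables drive the predicted quality. Hypothetical,
# mean-centered data.

def pls_one_component(X, y):
    """Return (weights w, inner regression coefficient b) for one factor."""
    nrows, ncols = len(X), len(X[0])
    # w ~ X'y: covariance of each input column with the quality value
    w = [sum(X[r][c] * y[r] for r in range(nrows)) for c in range(ncols)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # scores t = Xw, then inner regression y ~ b*t
    t = [sum(X[r][c] * w[c] for c in range(ncols)) for r in range(nrows)]
    b = sum(ti * yi for ti, yi in zip(t, y)) / sum(ti * ti for ti in t)
    return w, b

# Columns: temperature, pressure, an irrelevant noise variable.
X = [[-1.0, -0.5,  0.3],
     [ 0.0,  0.0, -0.4],
     [ 1.0,  0.5,  0.1]]
y = [-2.0, 0.0, 2.0]   # quality deviation from target

w, b = pls_one_component(X, y)
print("weights:", w)   # temperature and pressure dominate; noise is small
```

Because the weights are defined directly from input-output covariance, an engineer can read them against known process fundamentals and spot when a model is leaning on a variable it shouldn't.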
Back at Dow, Kordon says two types of inferential sensors are in use around the company. Earlier ones are based on neural networks, such as those from Pavilion that have been used successfully on NOx emission control duties (see What’s in the air for continuous emissions monitoring? for more on predictive emissions monitoring systems). "It's not closed-loop control," he says, "but is generally accepted for the limits of the [emission] tests involved."
However, for other solutions for estimating properties and for closed-loop control, Kordon says Dow "is gradually migrating to a genetic programming technology, especially its capability to derive a symbolic regression model — which looks not only at the accuracy of the data but its complexity as well, so we can find solutions that are relatively robust compared to neural nets."
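The accuracy-versus-complexity trade-off Kordon describes can be sketched as a selection rule: keep every candidate model that no other candidate beats on both prediction error and expression complexity (the Pareto front), instead of keeping only the most accurate model. The candidate expressions and scores in this Python sketch are hypothetical:

```python
# Pareto selection of symbolic regression candidates: prefer models
# that are not dominated on BOTH error and complexity. All candidate
# expressions and scores are hypothetical illustrations.

# (expression label, prediction error, complexity as expression size)
candidates = [
    ("a*x + b",          0.30, 3),
    ("a*x^2 + b*x + c",  0.12, 7),
    ("large neural net", 0.10, 50),
    ("a*exp(b*x)",       0.12, 5),
]

def dominates(other, model):
    """True if `other` is at least as good on both criteria and
    strictly better on at least one."""
    return (other[1] <= model[1] and other[2] <= model[2]
            and (other[1] < model[1] or other[2] < model[2]))

front = [m for m in candidates
         if not any(dominates(o, m) for o in candidates)]

for expr, err, size in front:
    print(f"{expr}: error={err}, complexity={size}")
```

Here the quadratic drops out because the exponential matches its accuracy with a simpler expression, leaving the engineer a short list of robust, inspectable models rather than a single opaque best fit.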
This technology for the moment remains internal to Dow and the company filed patents for it earlier this year. However, Kordon believes more than technology might be needed to improve the acceptance of inferential sensors generally across industry.
"The key challenge is the psychology, the credibility, the risk-taking culture that's required," he says. "It's not really a technical issue — it's relatively mature — but it won't 'turn your data into gold' as was claimed at the beginning. But it is a viable solution, and we're using it."