Soft sensors — known also as software sensors or neural-network-based inferential calculators — are operators’ and engineers’ virtual eyes and ears. “Software sensors create windows to your process where physical equivalents are unrealistic or even impossible,” explains Jan Versteeg, process control consultant at SABIC-Europe’s Manufacturing Competence Center in Geleen, The Netherlands. The sensors also enable more frequent measurements than hardware allows, adds Paul Turner of Aspen Technology Inc., Cambridge, Mass., who is based in Warrington, U.K.
These inferential calculators have broad applicability, says Don Morrison, product marketing manager with Honeywell Process Solutions, Phoenix, Ariz. For example, refining applications range from simple tasks like finding Reid vapor pressure to complex ones such as determining distillation cutpoint or percent light key in heavy product — and everything in between, he notes.
The sensors also predict product-quality properties, singly or in combination, adds Gail Powley, director of strategic initiatives for Matrikon Inc., Edmonton, Alta. The number of inputs may range from a few to hundreds. Sometimes, the input may be just one key parameter, such as distillation tower temperature, she notes.
Outputs typically find one of two principal uses: as open-loop advisory information for operators, or as inputs to model predictive controllers (MPCs) or adaptive controllers for closed-loop control.
“Ten years ago, it was 80% prediction (open loop) and 20% closed loop,” says Ric Snyder, senior product manager for Pavilion Technologies Inc., Austin, Texas, in describing how its end-users deployed their sensors. “Now it’s 80% closed loop and 20% prediction.” He says that as more people became comfortable with the soft sensor, the logical question became: “Why can’t I use this measurement like any other?”
Today, there’s also increasing interest in using the sensors with model-free adaptive controllers and intrinsically safe, nonlinear models. The sensors also promise to play a role in providing multivariable outputs.
Before any soft sensor can be deployed, however, users must address the quality of data to build it. That requires data cleaning — and such preprocessing is a major need, suggests Dave Shook, Matrikon’s chief technology officer. “When building a soft sensor, it’s important that you get all the corrupt data out.”
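As a rough sketch of the screening Shook describes — the signal ranges, rate-of-change limit, and sample values below are illustrative assumptions, not plant data — corrupt-data removal might flag dropouts and spikes like this:

```python
# Hypothetical pre-processing sketch: thresholds and values are assumptions
# for illustration, not figures from the article.
import numpy as np

def flag_corrupt(x, lo, hi, max_jump):
    """Mark samples that are missing, out of range, or changing implausibly fast."""
    x = np.asarray(x, dtype=float)
    out_of_range = np.isnan(x) | (x < lo) | (x > hi)
    # A rate-of-change test also flags the samples adjacent to a spike.
    jumps = np.abs(np.diff(x, prepend=x[0])) > max_jump
    return out_of_range | jumps

raw_temps = [151.2, 151.4, np.nan, 900.0, 151.5, 151.6]  # dropout, then a spike
corrupt = flag_corrupt(raw_temps, lo=100.0, hi=200.0, max_jump=5.0)
print(corrupt)  # the NaN, the 900.0 spike, and the jump back down are flagged
```

Only the samples where `corrupt` is False would feed the model-building step.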
Most plants have automated data historians that collect and then usually compress data. “[But] compression can be a very dangerous obstacle,” explains Morrison. “You need to know where your data came from and what’s been done to it before you start working with it.”
The compressed data can help to identify the major variables associated with a process. “[However,] since this history is often used for near-real-time diagnostics and troubleshooting, it cannot be artificially filtered or compressed since that can hide dynamic behavior that might be important. We really want and advise customers to give us uncompressed data,” Snyder stresses, adding that no technology exists to unfilter a stream of data.
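The danger Snyder and Morrison describe can be seen with a toy deadband compressor of the kind historians use — the tolerance values and samples here are assumptions for the demo, not any vendor's algorithm:

```python
# Illustrative only: a simple deadband "compressor"; real historians use
# more sophisticated schemes (e.g., swinging door), but the failure mode
# is the same.
def deadband_compress(samples, tol):
    """Keep a sample only when it moves more than `tol` from the last kept value."""
    kept = [samples[0]]
    for s in samples[1:]:
        if abs(s - kept[-1]) > tol:
            kept.append(s)
    return kept

raw = [50.0, 50.2, 49.9, 53.0, 50.1, 50.0]    # brief process upset at 53.0
print(deadband_compress(raw, tol=0.5))         # tight deadband: upset survives
print(deadband_compress(raw, tol=4.0))         # coarse deadband: upset vanishes
```

Once the upset is compressed away, no amount of post-processing can recover it — which is why Snyder says no technology exists to unfilter a stream of data.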
Regardless, users must specify which data will be used to develop the soft sensor — and those data must be accurate, says Morrison.
“You train the model and then use your test data to see if it accurately replicates the process,” Snyder notes. He cautions that data used to test the model must never also have been used to train it.
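That train/test discipline can be sketched in a few lines — the linear model and the synthetic temperature-to-quality relationship below are assumptions for illustration, not how any vendor's software works:

```python
# Minimal sketch of holding out test data, using a synthetic dataset.
import numpy as np

rng = np.random.default_rng(0)
temp = rng.uniform(150.0, 200.0, size=200)                # tower temperature input
quality = 0.04 * temp + 1.5 + rng.normal(0, 0.05, 200)    # lab-measured quality

# Split first: the test set must never touch the fitting step.
train_x, test_x = temp[:150], temp[150:]
train_y, test_y = quality[:150], quality[150:]

slope, intercept = np.polyfit(train_x, train_y, 1)        # fit on training data only
pred = slope * test_x + intercept
rmse = float(np.sqrt(np.mean((pred - test_y) ** 2)))
print(f"held-out RMSE: {rmse:.3f}")                       # should sit near the noise level
```

If the held-out error is much worse than the training error, the model has memorized its training data rather than learned the process.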
In addition, end-users need offline verifications, such as analyzer data or laboratory results, to ensure that the virtual sensor accurately tracks the real process. What’s at stake is the sensor’s credibility. “If the operators ever get to the point where they believe the model is no better than their own best guess, then they’ve lost confidence in the model and probably won’t use it anymore,” Snyder says.
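One common way to fold lab results back into a running sensor is a simple bias update — the exponential-smoothing scheme and gain value below are an assumed example, not the method any vendor quoted here uses:

```python
# Hedged sketch: nudge the sensor's bias term toward each new lab error.
# The gain of 0.3 is an arbitrary assumption for the demo.
def update_bias(bias, predicted, lab_value, gain=0.3):
    """Move the bias a fraction of the way toward the latest lab-vs-model error."""
    return bias + gain * (lab_value - predicted)

bias = 0.0
for raw_prediction, lab in [(2.10, 2.00), (2.12, 2.01), (2.08, 1.99)]:
    corrected = raw_prediction + bias       # what operators would actually see
    bias = update_bias(bias, corrected, lab)
print(round(bias, 3))                       # → -0.065
```

Tracking this bias over time also gives an early warning that the model is drifting away from the real process — exactly the credibility problem Snyder warns about.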
Camaçari, Brazil-based polyethylene manufacturer Politeno (Figure 1) put in considerable effort to ensure the credibility of its Pavilion soft sensors. “We had to collect data during a long period (before building the sensors), because we produce 60 grades from the lines,” says Jean-Claude Cailleaux, the facility’s technical manager. The company has two sensors on each of two low-density polyethylene (LDPE) lines for open-loop control, and two on its linear-low/high-density line for closed-loop control. The LDPE sensors predict melt index; the closed-loop soft sensors predict both melt index and polyethylene density.
Figure 1. Polyethylene plant in Brazil gains over 4,000 metric tons of additional output because of easier grade transitions.
The company’s efforts paid off. “We have reduced the (product) variability by more than 20%,” Cailleaux says. That makes shifting to different grades during production much easier and cuts losses. “We’ve reduced by 40% off-spec production during transitions,” he adds, noting that this translates to an additional 4,000 to 5,000 metric tons of saleable product each year.
The chemical industry historically has relied on the sensors mainly to provide valid, actionable and timely data. “Where soft sensors are most useful … is where the companies have analyzers installed and the cycle time is very long, relative to the frequency at which they want to control the process,” states Morrison.
Today, advances in soft sensor technology are extending measurement capabilities. For example, Worsley Alumina Pty. Ltd. (Figure 2), near Collie, Australia, now uses empirically based Honeywell soft sensors to measure Loss On Ignition (LOI), which is the amount of moisture in the final product from its calciners, says Angelo D’Agostino, senior process control engineer. This is the first use of empirically based soft sensors at the site, he explains, noting all previous soft sensors at Worsley were based on first principles. To date, however, a first-principles model of LOI has not been possible, he says.