
Advanced Analytics Empowers Engineers

Nov. 30, 2016
Operations benefit by gaining faster and easier insight from production data

Many chemical makers nowadays are all too familiar with the acronym DRIP, which stands for data rich, information poor (Figure 1). The data collection capabilities provided by process instrumentation, coupled with improved methods of networking and storage, have created an environment where companies accumulate vast amounts of time-series data from labs, suppliers and other sources. These data are stored in process historians, distributed across data silos, or lie “stranded” in comma-separated values (CSV) files or the hardware systems that created them. Together, they contain potential insights into the operation of virtually every major item of equipment and every important process in a typical chemical plant.

The challenge for many chemical makers has been converting those data into information and insight that plant personnel can use to optimize operations.

Traditionally, gaining insight from production data has involved manual effort using a spreadsheet-based approach, where engineers and managers rely on their eyes, technical expertise and high-level spreadsheet programming skills to identify meaningful trends in data. However, the inability of spreadsheets to provide plant personnel with insight in a timely manner has prompted a growing number of facilities to turn to advanced analytics software that offers enhanced capabilities.

Data Rich Information Poor (DRIP)

Figure 1. Many chemical makers are drowning in data, with no clear way to identify useful information.

Unfortunately, fully utilizing many of the “big data” solutions available today requires extensive programming expertise and knowledge of data science, to say nothing of information technology (IT) and other department costs to implement and manage. Further, reliance on IT resources ultimately lengthens the time necessary to extract insight and hinders optimization efforts. Consequently, the demand for more-user-friendly applications that empower personnel without requiring advanced programming skills has grown immensely.

In this article, we’ll discuss specific examples where an advanced analytics offering has helped chemical processors accelerate their decision-making by quickly and simply converting operational data into insights that improve production outcomes.


Improved Cycle Time

Cycle times in batch processes define product throughput. As a result, being able to detect deviations in the cycle times of assets, identify changes in process variables from batch to batch, and quickly address root causes can play a key role in profitability by reducing cycle times and increasing throughput. However, doing so can be difficult due to low signal-to-noise ratios within production cycles.

Take, for example, steam consumption in a reactor. The time required to heat up a reactor depends upon the steam delivery system. In many cases, a single steam system handles multiple reactors and can become less efficient over time due to scale build-up and other factors. Operators don’t always know how much steam a reactor is using; therefore, a gradual increase in heat-up time can look normal and go unnoticed.

Analysis of batches over time can provide insight that leads to optimization. This requires isolating the same time interval across batches run on various dates so that underlying causes can be identified and addressed.

At one facility, heat-up of a particular phase of a reaction from 25°C to 45°C was taking longer than normal. Analyzing the data and identifying why this was occurring required isolating the steam addition phase. This phase began when a steam valve opened after a certain amount of batch time had elapsed, with the reactor at around 30% of capacity and a starting temperature of about 25°C.

The capsules feature in Seeq advanced analytics software (Figure 2) enabled quick and easy retrieval of reactor data during the heat-up from 25°C to 45°C. Control personnel could measure the time the reactor took to heat up while simultaneously looking at steam consumption.
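For readers who want to see the underlying idea outside of any particular tool, the sketch below reproduces this capsule-style interval logic in Python with pandas. It is an illustration only, not Seeq’s API; the file name and column names are assumptions.

```python
import pandas as pd

# One-minute historian samples for a single reactor: temperature (degC),
# steam valve state (0/1) and steam flow (kg/min). File name is hypothetical.
df = pd.read_csv("reactor_history.csv", parse_dates=["timestamp"]).set_index("timestamp")

# A "capsule" here is any contiguous stretch where the steam valve is open
# and the temperature sits in the 25-45 degC heat-up band.
in_heatup = (df["steam_valve_open"] == 1) & df["temp_degC"].between(25, 45)

# Label each contiguous stretch with its own id.
capsule_id = (in_heatup != in_heatup.shift()).cumsum()[in_heatup]

# Per-capsule metrics: when it started, how long it lasted, steam consumed.
summary = df[in_heatup].groupby(capsule_id).agg(
    start=("temp_degC", lambda s: s.index.min()),
    duration_min=("temp_degC", "size"),            # one sample per minute
    steam_total_kg=("steam_flow_kg_min", "sum"),
)
print(summary.sort_values("duration_min", ascending=False))
```

Sorting by duration puts the slowest heat-ups at the top, which is exactly the batch-to-batch comparison the capsule view enables visually.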

Analysis showed that from one batch to another, the heat-up time doubled from 15 to 30 minutes but steam consumption didn’t change. This ruled out scale build-up as the cause of the problem. Further investigation revealed a failing control valve as the root cause. After the valve was replaced, heat-up time returned to its normal duration of 15 minutes.

Because steam consumption didn’t change from batch to batch, replacing the valve didn’t result in energy savings. However, shortening the time required for reactor heat-up boosted throughput.

The plant operates 325 d/y and reactor heat-up occurs twice per day, so restoring the 15-minute heat-up recovered 162.5 hours of production time annually. This translates into approximately 15 extra batches per year, each with a value of $8,000, for a total financial gain of $120,000.
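The arithmetic behind those figures is easy to verify:

```python
# Quick verification of the throughput gain using the article's figures.
minutes_saved = 30 - 15            # heat-up restored from 30 min to 15 min
heatups_per_year = 325 * 2         # 325 operating days, two heat-ups per day
hours_gained = minutes_saved * heatups_per_year / 60
print(hours_gained)                # 162.5 hours

extra_batches = 15                 # the article's estimate
print(extra_batches * 8_000)       # $120,000
```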

Capsule Data View

Figure 2. The capsule view eases identification of underlying trends and patterns in the time-series process data stored in historians.

Pattern Recognition

Detecting abnormal vibrations often is key to identifying and addressing problems in equipment before they become critical. However, in most instances, an alarm will trigger only after vibration exceeds a certain threshold. Searching for patterns that indicate an imminent failure can become very difficult when operators must look at trends and reports for a variety of different processes and equipment. Advanced analytics software such as Seeq provides tools to identify such patterns and abnormalities.

For example, reactors often have mixers with agitator blades. Bolts attach the blades to the agitator shaft, which itself connects to a gearbox on top of the reactor. The agitator runs at different speeds depending upon the specific product being manufactured and the liquid level in the reactor. Excessive vibration during mixing can cause a number of problems leading to higher maintenance costs and significant losses in throughput.

One processor took advantage of Seeq’s capsules feature to look at vibration-related data over time so underlying patterns could be identified. A baseline was set for vibrations at given combinations of reactor level and agitation speed. Comparing this baseline pattern against each individual time capsule with matching conditions enabled detection of vibration deviations before the alarm threshold was crossed.
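The same baseline-and-compare logic can be sketched generically in Python with pandas. This illustrates the technique, not Seeq’s implementation; the file, column names, bin edges and alarm threshold are all assumptions:

```python
import pandas as pd

df = pd.read_csv("agitator_history.csv", parse_dates=["timestamp"]).set_index("timestamp")

# Compare like with like: bin operation by reactor level and agitator speed.
df["level_bin"] = pd.cut(df["reactor_level_pct"], bins=[0, 30, 60, 100])
df["speed_bin"] = pd.cut(df["agitator_rpm"], bins=[0, 60, 120, 200])

# Baseline: the median vibration historically seen in each operating bin.
# (In practice this would come from a known-good reference period.)
df["baseline_mm_s"] = df.groupby(["level_bin", "speed_bin"], observed=True)[
    "vibration_mm_s"
].transform("median")

# Early-warning zone: vibration well above its bin's baseline but still
# below the hard alarm threshold that would normally trigger a response.
ALARM_MM_S = 8.0
early_warning = (df["vibration_mm_s"] > 1.5 * df["baseline_mm_s"]) & (
    df["vibration_mm_s"] < ALARM_MM_S
)
print(df.loc[early_warning, ["vibration_mm_s", "baseline_mm_s"]])
```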

In addition to providing critical insight as to when to conduct maintenance, the software’s analytical capabilities helped identify specific problems such as loose blades, wobbling of the agitator shaft and gearbox issues. In contrast, the conventional alarm system simply alerted personnel to a problem but provided no specific insight as to its root cause.

Moreover, by being able to adopt a more-proactive approach to maintenance on the reactor, the plant reduced the number of breakdowns and improved the reliability of the agitator. This was critical because a breakdown during mixing could result in significant issues with batch quality.

Important Link

Determining how operational data correlate with end-product quality offers substantial advantages from both optimization and profitability standpoints. However, in most cases, quality data are stored in spreadsheets separate from process data; this can make linking the two data sets and discovering insights a very difficult task.

Many facilities annotate operational data — but this isn’t particularly effective for analyzing process upsets producing deviations in end-product quality. In these instances, the use of an advanced analytics application can provide significant benefits.

This was evident at a facility that creates a product via a multi-step process. Product leaves a reactor and goes to presses to squeeze out water and remove contaminants such as salts. The pressed material is bagged and, at a later time, dried into a powder.

Data associated with the quality of the dried product are stored in a spreadsheet. Because drying can take place as long as a week after the product has been pressed out, linking the lot number of the powder with the lot number of the particular chemical reaction is crucial.
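In generic terms, that link is simply a join on the lot number. A minimal pandas sketch, with file and column names assumed for illustration:

```python
import pandas as pd

# Quality calls recorded after drying, keyed by powder lot number.
quality = pd.read_csv("dried_product_quality.csv")  # lot_no, color, concentration, purity, opacity

# Per-batch process data, keyed by the same lot number carried through
# pressing and bagging.
process = pd.read_csv("batch_summary.csv")  # lot_no, pH, temp_degC, wash_time_min, ...

# One row per lot linking reaction conditions to final quality calls, even
# when drying happened as much as a week after pressing.
linked = process.merge(quality, on="lot_no", how="inner", validate="one_to_one")
print(linked.head())
```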

Once the link is made, operators can visually compare all critical-to-quality (CTQ) parameters, including color, concentration, purity and opacity. Each quality measurement is graded on a zero-to-five scale, with zero indicating no difference from the standard and five indicating the worst possible quality.

Four quality calls of zero indicated a perfectly produced batch, which served as a baseline for CTQ parameters. Any deviation from the baseline would stem from issues during the chemical reaction (pH, temperature, agitation, concentration or time), squeezing and washing (wash time or conductivity) or drying (temperature, drying time or belt speed).

Using the quality calls, plant personnel were able to visually compare time capsules in Seeq with the baseline capsules of the CTQs — and then identify links between quality calls and specific CTQ parameters. This was particularly beneficial because it gave personnel a better understanding of what conditions were causing quality deviations.

Over time, the patterns identified helped plant personnel predict the quality of batches by providing insight into which specific production phases needed more precise control to avoid problems.

The drying step exemplifies the value of knowing such patterns. During this phase, the product is exposed to high temperatures that can alter its color. Standard drying temperature was approximately 125°C; a lower temperature made the product darker while a higher one made it lighter. The patterns uncovered in Seeq were used to predict the color call. With this information, plant personnel now utilize blending to reach a desired color without impacting other CTQ parameters critical to overall quality.
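As a rough illustration of how such a pattern becomes a predictor, the sketch below fits a simple curve of color call versus drying temperature. The sample points and quadratic form are synthetic stand-ins; the plant’s actual model isn’t described here.

```python
# Synthetic illustration only: the sample points and quadratic form are
# stand-ins, not the plant's actual data or model.
import numpy as np

# (drying temperature degC, color call 0-5) pairs from linked batch records.
temps = np.array([118.0, 121.0, 124.0, 125.0, 127.0, 130.0])
calls = np.array([3.5, 2.0, 0.5, 0.0, 1.0, 2.5])

# Deviation from the ~125 degC standard in either direction shifts the
# color away from baseline, so fit the call against (T - 125) quadratically.
model = np.poly1d(np.polyfit(temps - 125.0, calls, deg=2))

# Predicted color call for a batch dried at 122 degC.
print(model(122.0 - 125.0))
```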

Empowering Personnel

Two common themes emerge in these uses of advanced analytics. First, engineers and production employees with expertise in assets and processes were able to investigate and discover insights on their own. In other words, the analytics software didn’t create a dependency on the IT department, nor did it require specialized skills to uncover these insights. Second, analyzing each scenario took hours, not weeks or months; thus, improvements could be made in near real-time to boost business outcomes.

These themes, empowering engineers and shortening time to improvement, are the foundation of the modern approach to investigating production data and discovering insights. Advanced analytics software must fit the user’s capabilities and enable faster impact.

In the coming years, as chemical makers continue to collect vast amounts of process data, the need for more-user-friendly applications to empower employees to make more-intelligent optimization decisions will continue to grow. These applications will play a critical role in tackling the DRIP problem by providing visibility into patterns and trends. This increased insight then can be used to predict production outcomes, improve end-product quality and raise profitability.

MICHAEL RISSE is a vice president of Seeq Corp., Seattle. Email him at [email protected].
