Advanced Analytics Empowers Engineers

Operations benefit by gaining faster and easier insight from production data

By Michael Risse, Seeq Corporation


Many chemical makers nowadays are all too familiar with the acronym DRIP, which stands for data rich, information poor (Figure 1). The data collection capabilities of process instrumentation, coupled with improved networking and storage, have created an environment in which companies accumulate vast amounts of time-series data from labs, suppliers and other sources. These data are stored in process historians, distributed across data silos, or lie “stranded” in comma-separated values (CSV) files or the hardware systems that created them. Together, they contain potential insights into the operation of virtually every major item of equipment and every important process in a typical chemical plant.

The challenge for many chemical makers has been converting those data into information and insight that plant personnel can use to optimize operations.

Traditionally, gaining insight from production data has involved manual effort using a spreadsheet-based approach, with engineers and managers relying on their eyes, technical expertise and high-level spreadsheet programming skills to identify meaningful trends in the data. However, the inability of spreadsheets to provide plant personnel with insight in a timely manner has prompted a growing number of facilities to turn to advanced analytics software that offers enhanced capabilities.

Unfortunately, full utilization of many of the “big data” solutions available today requires extensive programming expertise and knowledge of data science, to say nothing of information technology (IT) and other departmental costs to implement and manage. Further, the reliance on IT resources ultimately lengthens the time needed to extract insight and hinders optimization efforts. Consequently, demand has grown immensely for more user-friendly applications that empower personnel without requiring advanced programming skills.

In this article, we’ll discuss specific examples where an advanced analytics offering has helped chemical processors accelerate their decision-making by quickly and simply converting operational data into insights that improve production outcomes.


Improved Cycle Time

Cycle times in batch processes define product throughput. As a result, being able to detect deviations in the cycle times of assets, identify changes in process variables from batch to batch, and quickly address root causes can play a key role in profitability by reducing cycle times and increasing throughput. However, doing so can be difficult due to low signal-to-noise ratios within production cycles.
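To make that screening step concrete, here is a minimal sketch of how cycle-time deviations might be flagged with generic Python/pandas tooling; the batch records, column names and threshold are hypothetical and are not taken from Seeq's software.

# Hypothetical example: flag batches whose cycle time deviates from the norm.
# The data and column names are invented for illustration.
import pandas as pd

batches = pd.DataFrame({
    "batch_id": ["B101", "B102", "B103", "B104", "B105"],
    "cycle_time_min": [182, 178, 185, 231, 180],   # minutes per batch
})

median = batches["cycle_time_min"].median()
mad = (batches["cycle_time_min"] - median).abs().median()   # robust spread estimate

# Flag batches roughly three robust deviations away from the median cycle time.
batches["deviant"] = (batches["cycle_time_min"] - median).abs() > 3 * 1.4826 * mad
print(batches[batches["deviant"]])

In this made-up data set, batch B104 stands out; the real work is then tracing which process variable changed in that batch.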

Take, for example, steam consumption in a reactor. The time required to heat up a reactor depends upon the steam delivery system. In many cases, a single steam system serves multiple reactors and can become less efficient over time due to scale build-up and other factors. Operators don’t always know how much steam a reactor is using; therefore, additional heat-up time can look normal and go unnoticed.

Analysis of batches over time can provide insight that leads to optimization. This requires isolating and comparing specific time intervals across different dates so that underlying causes can be identified and addressed.

At one facility, heat-up of a particular phase of a reaction from 25°C to 45°C was taking longer than normal. Analyzing the data to identify why required isolating the steam addition phase, which begins when a steam valve opens after a certain amount of batch time has elapsed, with the reactor at roughly 30% of capacity and a starting temperature of about 25°C.

The capsules feature in Seeq advanced analytics software (Figure 2) enabled quick and easy retrieval of reactor data during the heat-up from 25°C to 45°C. Control personnel could measure the time the reactor took to heat up while simultaneously monitoring steam consumption.
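Conceptually, a capsule is a time interval during which a defined condition holds. As a rough, generic approximation of that idea (pandas code for illustration only, not the Seeq API, and the signal names are assumed), the sketch below isolates each 25°C-to-45°C heat-up from a temperature trend and reports its duration and steam use.

import pandas as pd

# df is assumed to have a DatetimeIndex and columns "reactor_temp_C" and
# "steam_flow_kg_min" sampled once per minute (hypothetical signal names).
def heatup_capsules(df, start_c=25.0, end_c=45.0):
    capsules, start, armed = [], None, False
    for ts, temp in df["reactor_temp_C"].items():
        if temp < start_c:
            armed = True                   # back below 25°C; ready for the next heat-up
        elif armed and start is None:
            start = ts                     # upward crossing of 25°C: a capsule opens
        if start is not None and temp >= end_c:
            window = df.loc[start:ts]      # one capsule: the 25°C-to-45°C heat-up
            capsules.append({
                "start": start,
                "end": ts,
                "duration_min": (ts - start).total_seconds() / 60.0,
                "steam_total_kg": window["steam_flow_kg_min"].sum(),  # 1-min samples assumed
            })
            start, armed = None, False
    return pd.DataFrame(capsules)

Comparing duration_min and steam_total_kg across the resulting capsules gives the same side-by-side view the control personnel used: heat-up time on one hand, steam consumption on the other.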

Analysis showed that from one batch to another, the heat-up time doubled from 15 to 30 minutes but steam consumption didn’t change. This ruled out scale build-up as the cause of the problem. Further investigation revealed a failing control valve as the root cause. After the valve was replaced, heat-up time returned to its normal duration of 15 minutes.
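That reasoning can be written down as a simple check on per-batch heat-up summaries. The numbers and thresholds below are invented; the point is only that a longer heat-up with roughly unchanged steam consumption argues against fouling and toward the steam delivery path, such as a failing control valve.

import pandas as pd

# Hypothetical per-batch summaries (e.g., the output of the capsule sketch above).
summary = pd.DataFrame({
    "batch_id":       ["B101", "B102"],
    "duration_min":   [15.0, 30.0],
    "steam_total_kg": [520.0, 525.0],
})

baseline, latest = summary.iloc[0], summary.iloc[-1]
slower = latest["duration_min"] > 1.5 * baseline["duration_min"]
more_steam = latest["steam_total_kg"] > 1.1 * baseline["steam_total_kg"]

if slower and more_steam:
    print("Longer heat-up and higher steam use: consistent with scale build-up.")
elif slower:
    print("Longer heat-up but flat steam use: check steam delivery, e.g., the control valve.")
else:
    print("Heat-up within the normal range.")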

Because steam consumption didn’t change from batch to batch, replacing the valve didn’t result in energy savings. However, shortening the time required for reactor heat-up boosted throughput.
