The adoption of edge technology — which involves processing data close to its source — poses cultural as well as technical challenges to chemical makers. However, the experiences of vendors such as GE Digital, Siemens and AspenTech show that operating companies can achieve multiple benefits if they fully understand the potential of the technology.
“The value of sensor data is well understood but the outcomes often differ throughout the value chain,” says Steve Pavlosky, principal product manager, GE Digital’s Proficy historian and data at the edge program, San Ramon, Calif.
For specialty chemical companies with a varied product mix, analyzing sensor data improves product quality by revealing historical process trends and identifying optimal process conditions, he points out. In contrast, commodity/petrochemical manufacturers mainly rely on the data to monitor critical process equipment and detect equipment anomalies before they impact operations or safety.
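By way of illustration, a minimal Python sketch of the second use case, flagging equipment anomalies in historian data, might look like the following; the file, tag name and three-sigma threshold are all hypothetical:

```python
import pandas as pd

# Hypothetical historian export: timestamped readings for one compressor tag.
df = pd.read_csv("compressor_vibration.csv", parse_dates=["timestamp"],
                 index_col="timestamp")

# Rolling baseline over a 24-h window of 1-min samples.
rolling = df["vibration_mm_s"].rolling(window=1440)
zscore = (df["vibration_mm_s"] - rolling.mean()) / rolling.std()

# Flag excursions beyond 3 sigma as candidate anomalies for review.
anomalies = df[zscore.abs() > 3]
print(f"{len(anomalies)} anomalous samples flagged for review")
```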
However, notes Pavlosky, chemical makers face a number of barriers to adopting the latest edge technology, including: old equipment without sensors; a lack of staff, dedicated resources and skills to effectively analyze sensor data; and an absence of proactive digital strategies for their operations. Moreover, he warns, some companies start data analytics projects without tracking business value — causing many of these pilot programs to lose sponsorship from key stakeholders.
“It’s interesting that moving data from edge to cloud is such a prevalent topic. Five years ago, I would get huge pushback when talking to (most) customers about moving data off premises. What’s happened during that time period is a combination of better standard technologies for moving data to the cloud and applications like GE’s APM and Manufacturing Data Cloud solutions which provide insights across multiple plants,” he explains.
The company’s recent release of Proficy Historian 8.0 focuses specifically on getting better information into the hands of every operator — the so-called democratization of data. It simplifies creating and maintaining an asset model, including mapping the vast number of tag names, or data elements, from a variety of business systems across chemical operations. A drag-and-drop HTML5 application development tool means anyone can create screens that combine data from different systems to spur better decision-making.
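The tag-mapping problem itself is easy to picture. A toy sketch, which is not the Proficy API and uses invented tag names, shows the kind of flat-tag-to-asset-model lookup such tools maintain:

```python
# Generic illustration (not the Proficy API): mapping flat historian
# tag names onto a hierarchical asset model.
asset_model = {
    "SiteA/Reactor1/Temperature": "A1_TI_101.PV",
    "SiteA/Reactor1/Pressure":    "A1_PI_102.PV",
    "SiteB/Reactor1/Temperature": "B1_TIC101.MEAS",  # same asset type, different naming scheme
}

def tag_for(asset_path: str) -> str:
    """Resolve an asset-model path to the underlying historian tag."""
    return asset_model[asset_path]

print(tag_for("SiteA/Reactor1/Temperature"))  # -> A1_TI_101.PV
```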
“We are headed towards a model which will allow any enterprise-wide user (with the right security credentials) to have visibility to data from any plant level historian, in context of the asset model coupled with IT [information technology] focused management tools to manage these distributed systems. These capabilities are extremely important to some of our specialty chemical customers, for example, who have thousands of servers collecting data at dedicated centers,” believes Pavlosky.
At the same time, hybrid data architectures will become the norm in the future, with the majority of data stored on premise and then a specific sub-set used by cloud-based applications, he reckons. Coupled with ongoing simplification of machine learning/artificial intelligence (ML/AI) analytics and more engineering graduates knowing how to code Python, the technology’s deployment is only going in one direction, he feels.
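A minimal sketch of that hybrid pattern, with hypothetical tag names and simple in-memory stand-ins for the historian and upload queue, might look like this:

```python
from typing import Dict, List

# All tags land in the on-premises historian; only a curated subset is
# replicated to cloud applications. Tag names here are hypothetical.
CLOUD_TAGS = {"RX1_TI_101.PV", "RX1_PI_102.PV"}

local_archive: List[Dict] = []   # stand-in for the plant historian
cloud_queue: List[Dict] = []     # stand-in for a cloud upload buffer

def route(sample: Dict) -> None:
    local_archive.append(sample)       # everything stays on premise
    if sample["tag"] in CLOUD_TAGS:
        cloud_queue.append(sample)     # sub-set used by cloud apps

route({"tag": "RX1_TI_101.PV", "value": 182.4})
route({"tag": "RX1_LI_999.PV", "value": 0.61})   # archived locally only
print(len(local_archive), len(cloud_queue))      # 2 1
```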
Japanese Success Story
A project carried out with Toray Plastics, Tokyo, a high-performance polymer film manufacturer, exemplifies the value already achievable by democratizing data, he notes. The company sought to monitor its processes more closely to ensure products meet the stringent quality standards demanded by the food industry.
Here, GE Plant Applications technology allowed the company to collect real-time data directly from edge devices and assets for critical key performance indicators as well as perform batch analyses to optimize operations.
“Plant Applications enabled operators to oversee manufacturing on a more granular level and reduce the production of defective film (first pass quality), which improved overall equipment effectiveness, quality and reduced material waste, thus helping to increase efficiencies and decrease costs,” says Pavlosky.
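Overall equipment effectiveness is conventionally the product of availability, performance and quality, so a gain in first-pass quality flows straight through to OEE. A short worked example with made-up figures:

```python
# Standard OEE arithmetic: availability x performance x quality.
# All figures below are invented for illustration only.
availability = 0.92   # run time / planned production time
performance  = 0.88   # actual throughput / ideal throughput
quality      = 0.95   # first-pass good film / total film produced

oee = availability * performance * quality
print(f"OEE = {oee:.1%}")   # ~76.9%

# Raising first-pass quality to 0.98 lifts OEE with no other change.
print(f"OEE = {availability * performance * 0.98:.1%}")   # ~79.3%
```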
By implementing other GE Digital edge technologies, including iFIX Workflow, Toray Plastics used data-driven information to gain visibility into potential production interruptions and downtime.
“Toray Plastics also leveraged Historian to optimize asset performance through its data archive and reporting capabilities. The company further developed its by-the-numbers approach by creating a downtime dashboard — which tracked each line by shift, downtime percentage and cost of downtime — to better align plant floor metrics to executive-level goals,” he adds.
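The arithmetic behind such a dashboard is straightforward. A small Python sketch, with an invented downtime log and an assumed cost rate, shows the line-by-shift rollup:

```python
import pandas as pd

# Hypothetical downtime log; the cost rate is an assumed figure.
events = pd.DataFrame({
    "line":  ["L1", "L1", "L2", "L2"],
    "shift": ["A",  "B",  "A",  "B"],
    "downtime_h": [1.5, 0.5, 2.0, 0.25],
})
SHIFT_HOURS = 8.0
COST_PER_HOUR = 4200.0   # assumed cost of lost production per hour

summary = events.groupby(["line", "shift"])["downtime_h"].sum().reset_index()
summary["downtime_pct"] = 100 * summary["downtime_h"] / SHIFT_HOURS
summary["cost_usd"] = summary["downtime_h"] * COST_PER_HOUR
print(summary)
```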
The Japanese company acknowledges significant savings in film manufacturing costs as a result of improved uptime and decreased defect rates. It now enjoys faster product traceability, too. Moreover, as more data get captured, it expects to further enhance correlation of asset, process and product information — with annual potential savings estimated in the hundreds of thousands of dollars.
Increased Insight
The chemical industry should leverage its existing control experience, counsels Alastair Orchard, vice president of digital enterprise for Siemens Digital Industries Software, Genoa, Italy.
“The technology may have changed, but the IIoT [Industrial Internet of Things], edge and cloud that are now considered so revolutionary and modern are not so far from the DCS [distributed control system] and dynamic matrix control solutions deployed in refineries and processing plants since the 1970s,” he says.
Even conservative-by-nature chemical manufacturers should recognize the benefits of low-cost, self-configuring IIoT sensors (Figure 1). “It means shedding light on the dark areas of the value chain without impacting the often-fragile legacy infrastructures traditionally responsible for generating, collecting and visualizing data. Similarly, modern cloud-based analytics have the power to deliver hugely important insights into processes that are often considered black arts,” Orchard notes.
Figure 1. Low-cost, self-configuring IIoT sensors enable getting data without impacting legacy systems. Source: Siemens Digital Industries Software.
Getting data from the sensor to an edge device for initial processing and, subsequently, to the cloud is a transparent exercise that modern IIoT platforms handle seamlessly, he also stresses. In a chemical plant, the challenge tilts toward gaining access to high volumes of existing data and merging them semantically with newly generated IIoT streams.
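A stripped-down edge gateway illustrates the pattern: buffer readings locally, then forward batches to a cloud ingestion endpoint. The endpoint URL and batch size here are hypothetical, and a production version would add retries, back-off and authentication:

```python
import json
import time
import urllib.request

# Minimal edge-gateway sketch: buffer readings, forward in batches.
CLOUD_ENDPOINT = "https://example.com/iiot/ingest"   # hypothetical endpoint
buffer = []

def on_sensor_reading(tag: str, value: float) -> None:
    buffer.append({"tag": tag, "value": value, "ts": time.time()})
    if len(buffer) >= 60:          # forward once a minute of 1-s data accrues
        flush()

def flush() -> None:
    payload = json.dumps(buffer).encode()
    req = urllib.request.Request(CLOUD_ENDPOINT, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)    # in production: retries, back-off, TLS auth
    buffer.clear()
```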
“Customers with a 2000s-era historian (e.g., PI from OSIsoft), or open DCS (e.g., from Siemens) will be able to leverage legacy-edge-to-cloud connectors that their suppliers have built in order to up-sell digital services — truly a win-win situation — but older, proprietary, non-networked systems will have to be opened and integrated before they can contribute to process optimization, predictive maintenance or other ML algorithms that modern cloud platforms have to offer,” he cautions.
Once the data are available, the “real” job of extracting actionable insights begins. “ML can reduce the effort of plowing through data for patterns, but real domain knowledge is needed to train, tune and maintain the algorithms. This is usually a great way to apply the most experienced operations staff,” Orchard adds.
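As a sketch of how that division of labor can look, the widely used scikit-learn library offers an isolation forest whose contamination rate, the expected fraction of anomalous samples, is exactly the sort of parameter experienced operations staff should set and maintain; the data below are synthetic:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic stand-in for historian data: temperature, pressure, flow.
rng = np.random.default_rng(0)
X = rng.normal(loc=[350.0, 12.0, 80.0], scale=[2.0, 0.3, 1.5], size=(5000, 3))

# The contamination rate encodes process knowledge: roughly what
# fraction of operation is genuinely abnormal?
model = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = model.predict(X)          # -1 marks candidate anomalies
print((labels == -1).sum(), "samples flagged for operator review")
```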
Choosing a technology partner to gather and use sensor data requires finding a balance between agility and stability. “There are literally thousands of IIoT platforms to choose from — most of which will offer OPC UA connectivity and cool drag-and-drop features, and most of which will be gone from the market within two years.”
“I think that the availability of [the] IIoT, intelligent edge and fully integrated cloud already has opened a significant opportunity that chemical companies should be looking to leverage. Future improvements in terms of AI will be backwards compatible with this infrastructure. My advice is to connect what you’ve got, illuminate what’s dark, go to work on the [data] lake you create, and push as many of the insights back to the edge where they can be effective in real time,” says Orchard.
The area of most interest in the near future will be an emerging ability to inject the digital twins — predictive models of equipment and processes — into the edge where they can augment real sensor information, compare expected against real results and drive processes towards a safety and financial optimum, he believes.
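A toy version of that idea, using an assumed first-order temperature model and an experience-based tolerance, shows the expected-versus-measured comparison such an edge-resident twin would run:

```python
# Toy digital-twin check at the edge: a first-order model predicts the
# next temperature; a persistent gap versus the sensor triggers an alert.
# Model coefficients and the tolerance are illustrative assumptions.

def twin_predict(temp: float, heater_duty: float, dt: float = 1.0) -> float:
    tau, gain, ambient = 120.0, 0.8, 25.0   # assumed first-order dynamics
    return temp + dt / tau * (ambient + gain * heater_duty - temp)

expected = twin_predict(temp=182.0, heater_duty=200.0)
measured = 184.7                             # live sensor reading
if abs(measured - expected) > 2.0:           # tolerance set from experience
    print(f"Twin/plant divergence: expected {expected:.1f}, got {measured}")
```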
Even so, Orchard urges process engineers to be aware of two important trends. First is the convergence between traditionally separate domains such as finance, process control, electrical engineering, mechanical engineering and more. “In the near future, it’ll be more important to understand the process than the specifics.”
Second is data science. The rise of ML and AI, and the deep process understanding they will provide, will place increasing importance on grasping statistics, trends and patterns, if not the actual programming languages, such as R and Python, that data scientists use. “Literacy in the basics will help the most experienced process engineers to extract the most value from these modern toolsets.”
Underappreciated Value
The chemical industry’s lag in adoption stems at least in part from a failure to appreciate the potential impact on processes, believes Keith Flynn, senior director, product management, Aspen Technology, Halifax, N.S. “Some industries prefer to wait and see how technology works in other fields first.”
Another important contributor to the lag is that chemical makers face many more situations where off-the-shelf sensors don’t meet the necessary requirements or standards, he notes. So, where IIoT sensors don’t yet exist, a plant must augment its standard sensors with soft sensors — inferential models that use available data to estimate process variables, for example, to close the gap on a mass balance. “In some situations, soft sensing will always be necessary due to access constraints. The use of soft sensors is not always dictated by infrastructure or cost; sometimes you just have no place to install a sensor,” he explains.
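The mass-balance case makes a simple illustration: if a purge stream has no physical meter, its rate can be inferred from the measured streams around it. Stream names and values below are invented:

```python
# Soft sensor via steady-state mass balance: the purge stream has no
# physical flow meter, so its rate is inferred from measured streams.
feed      = 1000.0   # kg/h, measured
product   = 820.0    # kg/h, measured
byproduct = 130.0    # kg/h, measured

purge = feed - product - byproduct   # inferred, not measured
print(f"Inferred purge flow: {purge:.0f} kg/h")   # 50 kg/h
```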
Then there is the question of which data to send to the cloud (Figure 2), says Flynn. “You have to identify the relevant data in different data sets such as workflows — and even subsets of that data may be enough. With bigger data sets, the challenge then becomes getting all the information onto the cloud and the sheer cost of the bandwidth involved.”
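One common tactic is to condense high-frequency data into summary statistics before upload. A minimal pandas sketch, using synthetic readings, turns an hour of 1-s samples into 1-min summaries, roughly a 60-fold reduction in rows sent:

```python
import numpy as np
import pandas as pd

# One hour of hypothetical 1-s readings condensed to 1-min summaries
# before upload, cutting bandwidth substantially.
rng = np.random.default_rng(1)
idx = pd.date_range("2024-01-01", periods=3600, freq="s")
raw = pd.Series(rng.normal(100.0, 2.0, size=3600), index=idx)

summary = raw.resample("1min").agg(["mean", "min", "max", "std"])
print(f"{len(raw)} raw samples -> {len(summary)} rows uploaded")
```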
Figure 2. Deciding which data to send to the cloud requires careful evaluation. Source: AspenTech.
AspenTech offers applications designed to aid in the collection, conditioning and enrichment of such data as well as in identifying the “best” datasets, he notes.
The IIoT is bringing much broader advances than those provided by earlier technologies such as PLCs [programmable logic controllers] and expert systems, declares Flynn. “It’s also amplified by the large heterogeneous list of players in this field that interoperate with each other and who all require their own training. For example, we see hardware suppliers of edge devices, edge software suppliers for connectivity, cloud providers, cloud apps and much more that can be put together into a solution requiring many different skillsets. Our goal is to simplify the approach into a full IIoT stack that helps accelerate the culture change and increase technology adoption rates.”
At the same time, on-premise data will remain relevant for the foreseeable future if the infrastructure supports the desired outcome, he believes. “This means that adequate instrumentation, networking, data storage and computing power must exist on site to do projects that involve AI and ML to solve problems. In some cases, there are hybrid options with some components on site and some in the cloud. However, for situations where advanced problems need to be solved using AI and ML and where there is no adequate infrastructure, [the] cloud may be the only option versus doing nothing.”
“As a field engineer with a DCS/PLC background originally, I think people have to embrace the IIoT and edge technology and understand it rather than react to it. When I first saw the IIoT, I realized that it would open doors. This doesn’t necessarily mean displacing existing services, rather changing the way that a service is carried out. It will maybe require retraining, courses in AI, etc. However, with AI, we will be able to reach more assets in the future and have more visibility,” he concludes.