How AI and Machine Learning Are Transforming Plant Maintenance and Operations
Key Highlights
- The Data Challenge: 84% of companies plan to increase maintenance spending, but many lack the historical data needed for traditional AI models—making synthetic data generation crucial.
- Hybrid Intelligence Approach: Siemens combines physics-based models with machine learning to create reduced-order models (ROMs) that deliver fast, accurate simulation insights during operations.
- AI-Human Collaboration: Co-pilot-style AI assistants are enabling plant operators to run complex simulations and make faster decisions without requiring deep technical expertise.
The promise of artificial intelligence in industrial operations has long been discussed, but for many in process industries, the practical application remains unclear. How do you extract real value from AI when your plant lacks sufficient historical data? What happens when the cost of building complex models outweighs their operational benefit? These aren't theoretical questions—they're daily challenges facing maintenance teams and plant operators trying to reduce costs while improving reliability.
According to Verdantix's 2025 global corporate survey, 84% of companies are planning to raise maintenance spend, with a majority looking to production optimization and predictive maintenance as solutions.
This rising interest is reflected in the experiences of major vendors and their customers in the chemical industry, but there are some caveats.
Ravindra Aglave, Director of Energy & Process Industries at Siemens Digital Industries Software, sets the AI scene.
“The basis of AI is in machine learning. Whenever one speaks of learning, the fundamental question is what is it that one is trying to learn and from whom? These are the fundamental aspects that define the utility of AI,” he said.
There are three considerations. First, when you have a lot of information or historical data, you can learn from it and use those insights to create value. “Mining data with specific goals, searches, etc., are easy and obvious tasks for AI,” Aglave added.
If such information and data don’t exist – often because of cost and time constraints – they first need to be generated, possibly using physics-based first-principles models and equations. If, however, they do exist, they need to be mined, prepared and contextualized to generate intelligence.
In fact, this three-part combination is the driving force behind Siemens’ AI efforts: generating synthetic data from physics-based models to learn from, providing tools that contextualize existing data, and using existing data to build intelligence that can be deployed during operations.
One development that Aglave is particularly excited about is its use in building operation-ready reduced-order models (ROMs). ROMs are used to simplify high-fidelity, complex simulations by incorporating essential features into other, more simplified models that run on standard hardware. These, in turn, enable users to have broader access to simulation insights and make faster, real-time decisions (Figure 1).
“ROMs consolidate physics-based and data-based models that can be deployed as executable digital twins,” Aglave adds.
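As a concrete illustration of the idea (a minimal sketch, not Siemens’ implementation), a reduced-order model can be built by sampling an expensive physics-based simulation across the operating envelope to generate synthetic training data, then fitting a cheap surrogate that runs in real time on standard hardware. The `high_fidelity_sim` function below is a hypothetical stand-in for such a simulation:

```python
import numpy as np

def high_fidelity_sim(temp, flow):
    """Hypothetical stand-in for an expensive physics-based simulation
    (e.g., a CFD run that might take hours per operating point)."""
    return 0.8 * np.exp(-((temp - 350.0) / 40.0) ** 2) * np.tanh(flow / 5.0)

# 1. Generate synthetic training data by sampling the design space.
rng = np.random.default_rng(0)
temps = rng.uniform(300, 400, 200)   # temperature, K (illustrative range)
flows = rng.uniform(1, 10, 200)      # flow rate, kg/s (illustrative range)
yields = high_fidelity_sim(temps, flows)

# 2. Fit a cheap polynomial surrogate (the reduced-order model)
#    by ordinary least squares on polynomial features.
def features(t, f):
    t, f = (t - 350.0) / 50.0, (f - 5.5) / 4.5   # normalize inputs
    return np.column_stack([np.ones_like(t), t, f, t * f, t**2, f**2,
                            t**2 * f, t * f**2, t**3, f**3])

coef, *_ = np.linalg.lstsq(features(temps, flows), yields, rcond=None)

def rom(temp, flow):
    """Reduced-order model: evaluates almost instantly, so it can be
    embedded in an executable digital twin for real-time decisions."""
    return features(np.atleast_1d(temp), np.atleast_1d(flow)) @ coef

# The ROM now approximates the expensive model at sampled operating points.
err = float(np.max(np.abs(rom(temps, flows) - yields)))
```

The same pattern scales to real projects: the surrogate family (polynomials here) would typically be replaced by whatever model class best captures the physics, and the samples would come from the actual high-fidelity solver.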
Another development that excites him is the eventual addition of co-pilot-like assistants to all software tools from Siemens.
Five Emerging Trends of Generative AI
The latest issue of Tech Trends 2030, one of the company’s foresight series of reports, focuses on the next era of generative AI. Five emerging trends are singled out for mention.
Agentic AI, the use of AI systems that possess a certain level of autonomy and decision-making capabilities in the industrial context, is cited for its potential to increase efficiency, reduce costs, improve safety and enhance decision-making capabilities.
Multimodal large language models (LLMs) could revolutionize the processing and generation of industry-specific data, such as time series, 2D and 3D models or machine vision in the same way that conventional LLMs have revolutionized text and speech processing.
Industrial Edge architectures address latency and bandwidth limitations, enabling industrial systems to respond promptly to critical events and make timely decisions using real-time data analysis and automation, and so reducing cloud-based processing and associated cyber threats.
Specialized hardware, such as graphics processing units (GPUs) or language processing units (LPUs), enables edge devices to support parallel processing and achieve accelerated performance, resulting in faster execution of complex AI tasks.
However, the trend that Siemens singles out as potentially the most important is Industrial Foundation Models (IFMs). Pre-trained on industry-specific data, an IFM enables faster and more accurate deployment of AI solutions. It is trained on the “language of engineering” and surpasses typical LLMs. It supports not only text and images but also 3D models, 2D drawings, and other complex structures, such as time-series data specific to the industry.
Siemens believes that having a standardized starting point saves time, resources and energy. “The models capture industry complexities, leading to informed decision-making. They also facilitate knowledge transfer and collaboration across sectors,” the report noted.
AI Utilization: Top Tips from Siemens
- Data is essential, but must always be backed up by physics-based understanding.
- AI is task- and knowledge-specific. A single tool or model cannot be applied beyond the scope for which it was created.
- AI is not a panacea. Understand the quality and infrastructure of your data: often, quality depends on what you have forgotten or cannot measure.
- Building a robust model is critical, but AI models can only learn from what is already known. Consider a physics-based complement to handle conditions and situations the AI has never encountered.
- Build and cultivate strategic partnerships with companies that have domain knowledge of processes.
- Utilize digital twins to deploy AI models, thereby monetizing investments and integrating both automation hardware and operations software throughout this process.
- AI utilization should always be looked upon as part of a broader digitalization strategy.
AI as a Dynamic Partner
Like Siemens, Emerson’s Aspen Technology business also contributed to the Verdantix report.
“AI is enhancing simulation in various ways, like closing the simulation-reality gap by enriching first principles models with data through hybrid models. The user experience of simulation is also changing, making AI more of a dynamic partner,” explained Dr. Heiko Claussen, chief technologist at Emerson’s Aspen Technology business.
A good example of this is when engineers create or improve plant designs. “They are faced with many options and variables, making plant design an overwhelming and time-consuming process,” he said.
This is where generative AI comes in. In Emerson’s case, this is reflected in the enhancements added to Aspen Optiplant 3D Layout in May. These include three automated plant designs, each based on early project data such as equipment lists, for faster and broader evaluation of options. “This allows for trade-off assessments based on non-functional requirements and accelerated time to value. What once took days or weeks can now be done in minutes,” said Claussen (Figure 2).
In fact, many Emerson customers are already using hybrid models in their simulations.
Claussen cites the example of SOCAR, the State Oil Company of Azerbaijan. The company is involved in chemicals production through its various downstream operations, producing petrochemicals such as ethylene, propylene, and other monomers that serve as building blocks for plastics, as well as polymers like polypropylene (PP) and polyethylene (PE). Other key chemical products include methanol and urea fertilizers.
Emerson’s involvement focused on creating an end-to-end model of the company’s acrylonitrile (ACN) process, including its operating limits and equipment design. SOCAR utilized this approach to evaluate variables simultaneously and determine the optimal way to operate the facility, thereby maintaining production while optimizing energy efficiency.
“SOCAR achieved 36% increased waste heat recovery by optimizing ACN production,” Claussen noted (Figure 3).
Simulation is also enabling the autonomous AI of the future, he adds: the advent of LLMs has demonstrated the power of AI to parameterize solutions and predict options automatically.
“However, the proposed results are not always reliable out of the box, which is not acceptable for many industrial applications. Simulation models can be used as guardrails to ensure AI-based decisions are safe, thus enabling a new generation of automated processes,” he cautioned.
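The guardrail idea can be sketched in a few lines (a hypothetical illustration, not Emerson’s implementation): an AI proposes a control move, and a simulation model predicts its consequence before anything is applied to the plant. All names and limits below are assumed for the example:

```python
def simulate_reactor_temp(setpoint):
    """Stand-in for a dynamic simulation that predicts peak reactor
    temperature (K) if the proposed feed setpoint were applied.
    A linear response is assumed purely for illustration."""
    return 500.0 + 18.0 * setpoint

MAX_SAFE_TEMP = 620.0  # hypothetical safety limit, K

def guarded_apply(ai_proposed_setpoint, fallback_setpoint):
    """Accept the AI's proposal only if the simulated outcome stays
    within limits; otherwise fall back to a known-safe setpoint."""
    predicted = simulate_reactor_temp(ai_proposed_setpoint)
    if predicted <= MAX_SAFE_TEMP:
        return ai_proposed_setpoint
    return fallback_setpoint

# A moderate AI suggestion passes; an aggressive one is vetoed.
safe = guarded_apply(5.0, fallback_setpoint=3.0)   # simulated 590 K, accepted
vetoed = guarded_apply(8.0, fallback_setpoint=3.0)  # simulated 644 K, rejected
```

In practice the simulation call would be a validated dynamic model and the check would cover many constraints at once, but the pattern is the same: the physics model vetoes unreliable AI output before it reaches the process.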
On the other hand, significant progress has been made in seamlessly integrating AI capabilities into simulation systems. Projects that once had to be custom-built and maintained for each plant are now made scalable by simulation solutions. For example, building a hybrid model from both first principles and field data can be natively enabled through workflows designed for chemical engineers.
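One common way to build such a hybrid model (a minimal sketch under assumed names and kinetics, not a specific vendor workflow) is to keep a first-principles model as the baseline and train a data-driven correction on the residual between the model and field measurements:

```python
import numpy as np

def first_principles_model(feed_rate, temp):
    """Idealized physics-based prediction of conversion (illustrative
    Arrhenius-style kinetics; not a real plant model)."""
    k = np.exp(-4000.0 / temp + 10.0)
    return k / (k + feed_rate)

# Hypothetical field data: the real plant deviates from ideal kinetics
# (fouling, heat losses, sensor bias), synthesized here for the example.
rng = np.random.default_rng(1)
feed = rng.uniform(1, 5, 300)        # feed rate (arbitrary units)
temp = rng.uniform(550, 650, 300)    # temperature, K
measured = first_principles_model(feed, temp) * 0.93 - 0.01 * feed

# Learn the residual between plant data and the physics baseline.
residual = measured - first_principles_model(feed, temp)
X = np.column_stack([np.ones_like(feed), feed, temp])
w, *_ = np.linalg.lstsq(X, residual, rcond=None)  # any ML model would do

def hybrid_model(feed_rate, t):
    """Physics baseline plus data-driven correction."""
    base = first_principles_model(feed_rate, t)
    corr = np.column_stack([np.ones_like(feed_rate), feed_rate, t]) @ w
    return base + corr
```

The hybrid prediction tracks the plant more closely than the pure physics model because the learned term absorbs the systematic mismatch, which is the "closing the simulation-reality gap" Claussen describes.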
“The key,” he said, “is to get started and refine the approach as progress is made. This will help overcome potential AI-related uncertainty in the workforce and move the organization to a path of continuous learning.”
And for those thinking of implementing AI, his advice centers on one thing: data.
“When implementing AI, it is important to recognize the importance of data. While data from the field is not required for all AI use cases, it is crucial for fine-tuning models and making predictions. Many of today's OT systems remain fragmented, resulting in data silos, high maintenance costs and cybersecurity risks. An OT data fabric overcomes these challenges and securely feeds the desired AI use cases with in-context data from throughout the organization.”
Autonomous Control Applications
According to Stephen Reynolds, industry principal of chemicals at AVEVA, AI is becoming increasingly central to simulation, with a strong focus on blending the two. Here, he notes, AI models serve as surrogates for first-principles simulations, accelerating scenario exploration.
For example, Reynolds cited an ongoing project utilizing NVIDIA’s Raptor deep reinforcement learning (DRL) engine in conjunction with AVEVA’s own Dynamic Simulation engine. The idea is to train agentic AI/DRL agents to handle transient events and work toward autonomous control applications across chemical, energy and power contexts — pointing toward industrial autonomy, closed-loop control and decision-making support.
On the R&D front, the company plans to increase its workforce at its Indian R&D centers by 5% as it expands AI model development. “The emphasis here is on agentic AI, predictive capabilities and generative AI applications in collaboration with partners like Microsoft and Databricks,” noted Reynolds.
AVEVA’s 2024 roadmap commits the company to advancing industrial AI assistant capabilities and embedding AI into Operations Control, PI Data Infrastructure, and its visualization and digital twin platforms. “This will enable operations and analysts to query AI assistants, produce dashboards, optimize throughput, quality and sustainability metrics,” explained Reynolds.
In February, AVEVA signed an MoU with Indian refining and petrochemical company HPCL Mittal Energy Limited (HMEL), Mumbai, to advance digital transformation in its operations.
HMEL plans to implement a comprehensive package of software solutions, including AVEVA’s cloud-based industrial platform, digital twins, advanced analytics, and AI-driven technologies, to support the Indian government’s plan to expand the country’s refining capacity from 250 million metric tons per year (MMTPA) currently to 450 MMTPA by 2030.
The project will focus on HMEL’s Guru Gobind Singh refinery in Bathinda, Punjab, which operates an 11.3 MMTPA crude oil refinery, a 1.2 MMTPA polyethylene plant and a 1 MMTPA polypropylene plant.
In particular, the MoU cites collaboration on initiatives such as the development of a Centre of Excellence and a next-generation Refinery Command Center, leveraging technologies including real-time operational intelligence, process optimization, predictive maintenance and supply chain enhancement.
“Simulation is very likely a core element of the AVEVA–HMEL project. All the technologies mentioned in the MoU naturally tie into simulation and digital twin models to drive decision-making. The planned Refinery Command Center and Center of Excellence further suggest a strong role for modeling, monitoring and analytics,” said Reynolds.
“In line with typical AVEVA deployments in refineries and petrochemicals, both steady-state and dynamic modelling simulations are likely embedded within the digital twin and optimization layers, supporting operational monitoring as well as proactive scenario analysis,” he added.
As another industrial simulation example, Reynolds points to a 2024 project with ISU Chemicals, Seoul, South Korea.
The companies conducted a case study focusing on an existing post-combustion carbon capture unit (CCU) that utilized monoethanolamine (MEA) technology.
The goal was to deliver a robust, fast ML model of the CCU process that could be appended to other models generating compatible flue gas streams. The main problems to overcome were the unit’s very high energy consumption – supplied by steam – and a slow AVEVA Process Simulation (APS) model.
In particular, the project focused on optimizing reactor performance and establishing a catalyst replacement plan, building feed and product component structure from sample assay data, and deploying a hybrid simulation of AVEVA Process Simulation and an AI reactor model to predict reactor yield and catalyst performance and decay.
The results included a 99.7% accurate prediction of reactor yield across different recipes and operating environments, improved catalyst performance predictions that enable an efficient replacement strategy, and an external HMI built in Excel that allows engineers and operators to proactively simulate the plant.
AI and machine learning are emerging as essential tools rather than experimental technologies. The path forward isn't about replacing physics-based models with data-driven approaches, but strategically combining both to create more efficient, operation-ready solutions. The question is no longer whether to adopt AI, but how quickly organizations can integrate these tools into their existing workflows.
About the Author
Seán Ottewell
Editor-at-Large
Seán Crevan Ottewell is Chemical Processing's Editor-at-Large. Seán earned his bachelor's of science degree in biochemistry at the University of Warwick and his master's in radiation biochemistry at the University of London. He served as Science Officer with the UK Department of Environment’s Chernobyl Monitoring Unit’s Food Science Radiation Unit, London. His editorial background includes assistant editor, news editor and then editor of The Chemical Engineer, the Institution of Chemical Engineers’ twice monthly technical journal. Prior to joining Chemical Processing in 2012 he was editor of European Chemical Engineer, European Process Engineer, International Power Engineer, and European Laboratory Scientist, with Setform Limited, London.
He is based in East Mayo, Republic of Ireland, where he and his wife Suzi (a maths, biology and chemistry teacher) host guests from all over the world at their holiday cottage in East Mayo.