AI: The Next Frontier of Performance in Industrial Processing Plants
AI is more accessible than ever. With a wealth of historical data and existing subject matter expertise, processing plants are well positioned to create new opportunities.
The successful application of AI across various industries has created a renewed focus on the robust economic value that AI can unlock. In fact, a recent McKinsey survey found that AI leaders outperformed their industry peers by a factor of 3.4. Globally, our estimates show that AI has the potential to deliver additional total economic activity of approximately $13 trillion by 2030, and approximately $1 trillion in value remains to be captured from the industrial sector.
Although AI adoption remains low in the industrial sector, value can be extracted today from existing infrastructure. According to our research, operators that have applied AI in industrial processing plants have reported a 10 to 15 percent increase in production and a 4 to 5 percent increase in EBITA.
Operators of industrial processing plants are particularly well positioned to capture the benefits of AI. Many already rely heavily on data-driven decision making using processing data (combined with commodity supply and demand) and pricing data. And most plants today have made significant investments in enablers of AI, such as network design, control systems, and historical data capture.
Looking forward, AI can help detect patterns and insights that are not readily apparent to humans, increasing processing plants’ productivity and competitive advantage. This article explains how AI technology and agile methodology can help organizations rapidly capture value, and how AI adoption can revitalize people and processes.
The limits of traditional process controls
A typical process plant uses sensors to collect thousands of process measurements, such as flows, temperatures, pressures, and levels, which feed the control logic behind the plant's various controls.
Yet most industry players do not have robust programs for managing the accuracy and reliability of critical process measurements. Maintenance requests are typically initiated only when plant operators notice a problem with a measurement. A systematic approach to maintaining the quality of critical measurements is rare, in large part because plants have no reliable way to track which of the thousands of measurements are most critical for effective operations. Prioritization instead rests on general understanding and experience, which leaves gaps in coverage.
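One way to make such prioritization systematic is a simple automated quality check: compare a critical sensor's recent readings against its own historical baseline and raise a flag when the mean drifts beyond a z-score threshold. The sketch below is illustrative only; the threshold, window sizes, and data are assumptions, not a description of any vendor's monitoring product.

```python
# Illustrative sketch: flag a process measurement whose recent readings
# drift away from its historical baseline. Threshold and data are invented.

def drift_alarm(baseline, recent, z_threshold=3.0):
    """Return True if the mean of `recent` deviates from the baseline
    mean by more than `z_threshold` baseline standard deviations."""
    n = len(baseline)
    mean = sum(baseline) / n
    std = (sum((x - mean) ** 2 for x in baseline) / n) ** 0.5
    recent_mean = sum(recent) / len(recent)
    z = abs(recent_mean - mean) / (std or 1.0)  # guard against zero spread
    return z > z_threshold
```

In practice such a check would run continuously for every measurement tagged as critical, turning ad hoc operator observations into scheduled maintenance triggers.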
Feedback from plant sensors is then processed by three types of controls, each with different levels of robustness: basic single-variable control loops, advanced regulatory controls (ARCs), and advanced process controls (APCs).
Basic single-variable control loops
Single-variable control loops hold a single process measurement, such as a flow, at a given set point by adjusting a manipulated variable, such as a control valve (Exhibit 1). A key limitation is that isolated loops do not take real-world interactions with other process variables into account. Furthermore, multiple single-variable loops within the same system are unaware of one another, so controllers can end up “fighting” to achieve their individual set points, to the detriment of other controlled process variables.
Exhibit 1
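The mechanics of such a loop can be sketched in a few lines: a discrete PID controller adjusts a valve command to hold a flow at its set point. The gains and the first-order process model below are invented for illustration, not tuning guidance.

```python
# Minimal sketch of a single-variable control loop: a discrete PID
# controller drives a toy first-order process toward a flow set point.
# Gains and process dynamics are illustrative only.

class PID:
    def __init__(self, kp, ki, kd, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, set_point, measurement):
        error = set_point - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def simulate(set_point=100.0, steps=200, dt=1.0):
    """Run the loop against a toy first-order process and return the
    final flow, which settles at the set point."""
    pid = PID(kp=0.8, ki=0.1, kd=0.05, dt=dt)
    flow = 0.0
    for _ in range(steps):
        valve = pid.update(set_point, flow)  # controller output
        flow += 0.2 * (valve - flow) * dt    # toy process response
    return flow
```

The limitation described above is visible here: the loop knows nothing about any other variable in the plant; it only ever sees its own measurement and set point.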
Advanced regulatory controls
To overcome the limitations of single-variable controls, some control schemes integrate multiple single-variable controllers using logic strategies such as feed-forward, ratio, cascade, and auctioneering control. Although each loop still acts on one input and one output, the overall scheme accounts for real-world interactions and attempts to mitigate negative effects on critical process variables. The advantage of this approach is that it can be implemented directly in the control system without additional tools. The trade-off, however, is that many processes require complicated ARC schemes, and as these schemes grow larger and more complex, they become increasingly exposed to instrument failures.
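One of these strategies, cascade control, can be sketched as an outer temperature loop whose output becomes the set point of a faster inner flow loop, rather than driving the valve directly. All gains and process dynamics below are invented for illustration.

```python
# Illustrative cascade (ARC) sketch: the outer temperature controller sets
# the set point of the inner flow controller instead of moving the valve.

class PI:
    def __init__(self, kp, ki, dt=1.0):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, set_point, measurement):
        error = set_point - measurement
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

def cascade(temp_sp=350.0, steps=400):
    outer = PI(kp=0.5, ki=0.02)   # temperature loop -> flow set point
    inner = PI(kp=0.8, ki=0.2)    # flow loop -> valve position
    temp, flow = 300.0, 0.0
    for _ in range(steps):
        flow_sp = outer.update(temp_sp, temp)  # outer output is a set point
        valve = inner.update(flow_sp, flow)    # inner loop tracks it
        flow += 0.5 * (valve - flow)           # fast inner dynamics
        temp += 0.05 * (300.0 + 2.0 * flow - temp)  # slow outer dynamics
    return temp
```

Because the fast inner loop absorbs flow disturbances before they reach the temperature, the scheme accounts for an interaction that isolated single loops would ignore.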
Advanced process controls
APCs rely on fundamentally different control algorithms to coordinate the regulatory controllers beneath them (Exhibit 2). They are typically implemented above the regulatory process controls and work toward specific objective functions, such as higher efficiency, throughput, or quality, by balancing the interactions between basic process control loops within a process unit. Many APCs build process models and control multiple process variables; others use a rules-based approach, such as fuzzy logic, to imitate the knowledge and actions of human operators. Either way, all APCs require ongoing maintenance to keep up with changing process conditions. At many sites, APC usage drops significantly over time, eroding the return on the initial capital and engineering investment; in some cases, less than 10 percent of implemented APCs remain active and maintained.
Exhibit 2
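At its core, an APC pairs a process model with an objective function and searches the allowed operating window for the best control moves. The toy sketch below uses an invented linear model, hypothetical limits and constraints, and a brute-force search in place of the rigorous optimization a real APC package would perform.

```python
# Toy APC sketch: maximize throughput subject to a quality constraint,
# using an invented linear process model and a brute-force search.

def predict(feed_rate, steam_rate):
    """Hypothetical linear model, as if fit from historical plant data."""
    throughput = 0.9 * feed_rate
    quality = 98.0 - 0.04 * feed_rate + 0.06 * steam_rate
    return throughput, quality

def apc_step(min_quality=94.0):
    """Scan the allowed operating window and return the moves
    (throughput, feed, steam) that maximize throughput while
    honoring the quality constraint."""
    best = None
    for feed in range(50, 151):      # allowed feed range
        for steam in range(0, 101):  # allowed steam range
            throughput, quality = predict(feed, steam)
            if quality >= min_quality and (best is None or throughput > best[0]):
                best = (throughput, feed, steam)
    return best
```

Real APCs use multivariable models and proper solvers rather than grid search, and they re-optimize continuously, which is one reason they degrade when the underlying model is no longer maintained.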
One of the main challenges in maintaining and improving existing regulatory and APC systems is their sheer size. With thousands of process measurements, control loops, and highly complex APC systems, many engineering initiatives are unable to systematically prioritize and focus on areas in which improvement would have the most impact. This is where AI can help unlock additional potential.
A new approach to managing process controls at the organization level
Some APC vendors have started integrating elements of advanced analytics (AA) and AI to improve the accuracy of their process models. So far, however, AA and AI usage within APCs is limited to refining the APC models themselves. Building AA and AI capabilities within the organization and applying them across a process plant at scale represents a far greater opportunity.
AI solutions shine where huge amounts of process data must be sifted to identify the most powerful opportunities. Whereas conventional computing systems rely on explicitly programmed rules, AI systems can discover the relevant rules themselves through supervised and unsupervised learning on large amounts of process data. Instead of subject matter experts identifying every rule and relationship governing a process, AI can detect patterns and insights that are not easily visible to humans. Subject matter experts can then integrate these insights into their operating recipes to improve performance. In this way, AI can help owners and operators optimize traditional process control and operations (Exhibit 3).
Exhibit 3
Optimizing traditional process controls through AI
AI can serve as a decision support system, helping to identify opportunities that may not be visible to humans in highly complex industrial processes, such as plant performance for various ore types. It can also help improve data quality and detect sensor malfunctions. For example, AI technology has been used to identify the following issues and opportunities, which would otherwise require experienced subject matter experts to uncover:
- control schemes that are performing poorly and have otherwise not been identified
- previously unidentified interactions between process variables that have an impact on production but are not included in control schemes
- process measurements that have a large impact on production but are not included in APCs
- the modeling of relationships between process variables and production to assess the configuration of existing APCs
- new candidates for additional APCs that will have the most impact on production
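Several of the items above reduce to a screening exercise: rank candidate process measurements by how strongly they co-move with production. The sketch below uses plain Pearson correlation as a stand-in for the richer supervised models used in practice; the tag names and data shapes are invented.

```python
# Illustrative screening step: rank process measurements by the absolute
# strength of their correlation with production. Real analyses use richer
# models and guard against spurious correlation.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)  # assumes neither series is constant

def rank_tags(history, production):
    """`history` maps tag name -> readings aligned with `production`."""
    scores = {tag: abs(pearson(vals, production))
              for tag, vals in history.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

The top-ranked tags become candidates for inclusion in control schemes or APCs, focusing scarce engineering effort where it matters most.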
As an example of underused controllers, a zinc smelter was not achieving the expected recovery in its fumer process. An investigation revealed a dual root cause: operators did not trust the controllers, and the hard operating limit at which the slag froze, 1,100°C, led them to run the temperature controls manually with a large buffer. AI helped pinpoint the root cause of the past underperformance and flagged the process controls that needed improvement. The result was high utilization of the improved controls, a 22°C reduction in average temperature, and a significant uptick in recovery.
In another example, a large ethylene player struggled to consistently reach production targets while keeping the conversion rate of the process high. Only a limited number of process variables were being used in the existing control schemes; analyzing a much larger set revealed nontrivial interactions that had previously gone unnoticed.
Optimizing operations through AI
Along with improving process controls, AI technology enables players to identify and optimize operating recipes under different process conditions.
Operators rely on their experience and intuition to run plant processes. Many have deep expertise in operating their plants, but human bias and the tendency to maintain the status quo limit the potential for continuous improvement. AI technology can detect additional patterns and insights that are hard to see because of the highly complex and variable nature of chemical processes.
As an example, operators at a large open-pit copper mine believed there were only three ore types to process, one of which was by far the most prominent. As a result, they rarely changed the processing set points (or the “recipe”). However, machine-learning tools revealed that there were in fact seven distinct ore types, and the types feeding into the mill were changing far more frequently than the operators realized. This presented an opportunity to use sensors, analytics, and process controls to identify these changes in near-real time and to update the processing recipe to suit the ore types. Overall, the new recipe led to a production increase of more than 10 percent in less than six months.
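At its core, the ore-type finding is an unsupervised clustering result. The sketch below shows the idea with plain one-dimensional k-means on invented "hardness" readings; the actual analysis used richer assay data, and the number of clusters would be chosen by comparing within-cluster variance across candidate values of k.

```python
# Toy unsupervised-clustering sketch: invented one-dimensional readings
# separate into three distinct bands, mimicking the ore-type discovery.

def kmeans(points, k, iters=20):
    """Plain 1-D k-means; the first k points seed the centers."""
    centers = list(points[:k])
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: (p - centers[i]) ** 2)
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical ore-hardness readings; they fall into three clear bands.
readings = [1.0, 5.0, 9.0, 1.1, 5.1, 9.1, 0.9, 4.9, 8.9, 1.05, 5.05, 9.05]
```

Run on feed data in near-real time, the cluster assignment becomes the trigger for switching the processing recipe as the ore type changes.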
Robust AI solutions can also help identify the optimal operating recipes under varying process conditions, such as across changes in the feed quality or in the desired product mix. These AI systems continue to learn and improve over time. AI can also help identify and document insights related to the intricacies of operating process plants. This helps to create consistent performance across operators and retains the expertise within the organization as older operators retire.
How to create competitive advantage using AI
Process plants that maximize the insights from their data by using AI solutions can improve performance and continuously adapt to changing market conditions. In industries with volatile margins and a multitude of supply chain and regulatory pressures, the organizations that make the best use of data and remain flexible will hold a significant competitive advantage over their peers. Because the market sets the prices of the commodities these plants produce, the main levers for improving profitability are reducing costs and increasing efficiency. AI can help on both fronts by increasing production from the same amount of energy, which lifts output while lowering the cost per ton produced.
As discussed in a recent McKinsey publication, a successful adoption will work only through an efficient interplay of people, processes, and technology (Exhibit 4). Moving forward, three actions can help shift traditional thinking and enable value creation using AI: adopting AI to revitalize people and processes, enabling rapid value capture with AI tech and agile methodology, and accelerating AI adoption with generative AI.
Exhibit 4
Adopting AI to revitalize people and processes
Agile implementation of AI solutions can revamp how traditional teams work in processing plants. Traditionally, upstream and downstream processes operate as islands. By making a clear connection between these processes and grounding performance discussions in the interactions that AI detects between them, leaders can markedly improve the way plants operate. Additionally, planning, operations, and maintenance teams no longer need to work in silos with rigid, formal channels of communication. Instead, small multidisciplinary teams focused on specific goals can combine their insights to find quick and effective strategies for impact.
The agile approach also helps establish a culture of close collaboration and continuous improvement, in which teams work to capture additional value in weeks rather than over months and years. Quick wins achieved through the efficiencies that AI tools provide build excitement in the team and establish a norm of frequent uplifts in performance. Over time, these energized cultures help organizations stay at the forefront of their industry in terms of technological advancements and profitability.
Enabling rapid value capture with AI tech and agile methodology
Traditional waterfall project management required months or years before a solution was ready to use. Even today, a new APC implementation can take more than a year before impact can be observed and evaluated. Agile methodology focuses on rapid deployments and iterations to ensure that the team discovers the right criteria for success and meets them. This method has transformed businesses by allowing them to operate more efficiently and capture value in a matter of weeks rather than years.
AI solutions can be deployed in an agile manner, in which a proof of concept is quickly deployed to establish feasibility and potential impact. For example, at the aforementioned mine, the team built and tested data models in two-week-long sprints. Improvements were added to a backlog, which was then used to determine the tasks for subsequent sprints.
This and similar methods can help reduce the risks involved in undertaking huge capital-intensive projects without a clear picture of potential impact. Agile methodology is the dominant approach in the tech industry, where it has proved its impact, and it is now increasingly common in other industries.
Accelerating AI adoption with generative AI
Although both traditional and generative AI have individual value, operational benefits can multiply when the two are deployed in tandem. The complexity of the insights and nonobvious relationships discovered by AI-based simulations or optimization algorithms can, however, create resistance among front-line operators, who may perceive the system as a “black box.” Here, generative AI can explain recommendations in easy-to-understand language, increasing confidence and boosting adoption. Even so, generative AI should be seen as a supplement to traditional change management, not an outright substitute.
This is an exciting time for process controls, and many innovations are on the horizon. Intriguing work is being done in self-tuning process loops, drift- and fault-tolerant control, early anomaly detection, model-free reinforcement learning, AI-driven control strategy design, and learning from broader information networks. The organizations that invest in building their AI capabilities now will be best positioned to take advantage of these advancements.
Written by: Filipe Barbosa, senior partner in McKinsey’s Houston office; Kane Blay, consultant in the Perth office; Mike Doheny, senior partner in the Atlanta office; Usman Farooq, consultant in the Denver office; Milan Korbel, partner in the Melbourne office; Soenke Lehmitz, Richard Sellschop, and Dan Swan, senior partners in the Stamford office; Andy Luse, partner in the Washington, DC, office; Lapo Mori, partner in the Denver office; and Xavier Morin, consultant in the Luxembourg office.
The authors wish to thank Gervasio Briceno, Sean Buckley, Shakeel Kalidas, Gerhard Nel, Avinash Tripathi, Asad Ul Haq, Eben Van Niekerk, and Charles Ying for their contributions to this article.