The Fourth Industrial Revolution Emerges from AI and the Internet of Things
Big data, analytics, and machine learning are starting to feel like anonymous business buzzwords, but they're not just overused abstract concepts. Those terms represent huge changes in much of the technology we deal with in our daily lives. Some of those changes have been for the better, making our interactions with machines and information more natural and more powerful. Others have helped companies tap into consumers' relationships, behaviors, locations, and innermost thoughts in powerful and often disturbing ways. And the technologies have left a mark on everything from our highways to our homes.
It's no surprise that the concept of "information about everything" is being aggressively applied to manufacturing. Just as they transformed consumer goods, smart, cheap, sensor-laden devices paired with powerful analytics and algorithms have been changing the industrial world over the past decade. The "Internet of Things" has arrived on the factory floor with all the force of a giant electronic Kool-Aid Man exploding through a cinderblock wall.
[Photos: a robot performing carbon fiber layup, exactly the kind of time-consuming task we want robots to be doing ("It's a robot! It's stealing my job!"); the brains of a wind turbine, which contain more industrial sensors than you can shake a stick at; and Ars' Lee Hutchinson standing in front of the creel cabinet that feeds carbon fiber to the robot that took all of our carbon fiber layup jobs.]
Doing something with all the data these systems generate to predict and prevent failures has become increasingly important. As MathWorks Industry Manager Philipp Wallner explained, the mounting urgency is due to "the growing complexity that we're seeing with electronic components in assets and devices, and the growing amount of software in them." As industrial systems provide more data about their operations on the plant floor or in the field, that data needs to be processed to be useful to the operator, not just to predict when maintenance needs to occur but to optimize the way equipment is operated.
[Photo: An airplane being assembled at an Airbus facility. The company is developing "smart tools" that use local and network intelligence as part of its own Industry 4.0 "factory of the future" initiative.]
Predictive maintenance systems, such as IBM's Maximo, General Electric's Predix, and MathWorks' MATLAB Predictive Maintenance Toolbox, are an attempt to harness machine learning and simulation models to make that level of smartness possible. "Predictive maintenance is the leading application in making use of that data in the field," Wallner said, "especially in areas where components are really costly, such as wind energy. For equipment operators it's a no-brainer."
It's a harder sell to equipment manufacturers in some cases, especially because implementing the concept often involves providing detailed (and therefore proprietary and closely guarded) modeling data for their products. Some manufacturers may also see predictive maintenance as a threat to their high-margin sales and maintenance business. Others, however, such as General Electric, have already begun building their own lines of business around predictive maintenance.
GE first used Predix for internal purposes, such as planning maintenance of its fleet of jet engines—using "data lakes" of engine telemetry readings to help determine when to schedule aircraft for maintenance to minimize its impact on GE's customers. Using a library of data for each piece of supported equipment and a stream of sensor data, GE Software's data scientists built models—"digital twins" of the systems themselves—that can be used to detect early signs of part wear before things progress to part failure.
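To make the "digital twin" idea concrete, here is a minimal sketch in Python of the underlying pattern: compare what a model of the equipment says a sensor should read against what it actually reads, and flag sustained drift. The expected_bearing_temp() model, its coefficients, the data, and the threshold are all invented for illustration; a production twin built from real fleet telemetry would be vastly richer.

```python
import numpy as np

# Hypothetical "digital twin": a toy physics-style model of bearing
# temperature as a function of shaft speed (rpm) and load fraction.
def expected_bearing_temp(rpm, load):
    return 40.0 + 0.004 * rpm + 12.0 * load  # degrees C, invented coefficients

rpm = np.array([1800, 1805, 1790, 1810, 1802])
load = np.array([0.70, 0.71, 0.69, 0.72, 0.70])
measured = np.array([55.9, 56.2, 66.8, 67.5, 68.1])  # later readings run hot

# Residual = measured minus modeled. Sustained positive drift suggests
# wear (degraded lubrication, say) well before outright failure.
residuals = measured - expected_bearing_temp(rpm, load)
ALERT_THRESHOLD = 5.0  # degrees C, set from historical noise levels

for t, r in enumerate(residuals):
    if r > ALERT_THRESHOLD:
        print(f"sample {t}: residual {r:+.1f} C exceeds threshold, flag for inspection")
```

The point of the pattern is that the alarm fires on divergence from modeled behavior rather than on a fixed absolute limit, which is what lets it catch wear early instead of waiting for a hard trip point.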
But GE has also applied the same technique to other, less mechanical inputs, including using models of weather and tree-growth data to predict when trees might become a threat to Hydro-Québec's power lines. GE has also expanded the role of Predix into the energy market, modeling power plant output and other factors to give energy traders a tool to help them make financial decisions. Predictive systems are already having an impact on logistics as well; Amazon, for example, uses predictive models to power Amazon Prime's pre-staging of products closer to potential purchasers.
There are other approaches to prognostication, some of which bleed into managing the overall operation of the plant itself. IBM's Maximo APM, for example, built on IBM's Watson IoT platform, establishes its baseline from sensors and other data from equipment on the factory floor and continuously refines its maintenance algorithms. Another Maximo package focuses on overall plant operations, identifying process bottlenecks and other issues that could drive up operating costs. (L'Oréal has had success implementing Maximo and the Watson IoT platform as part of its own Industry 4.0 effort.)
BRIDGING THE GAP BETWEEN DATA AND KNOWLEDGE
But there are several challenges that companies face in making predictive systems effective—the old computing proverb of "garbage in, garbage out" definitely still applies. MathWorks' Wallner noted that the main challenge is bridging the gap between the two knowledge domains needed to make predictive maintenance work. "How do you really enable the domain experts to work closely with the data scientists, or have one person do both? That's quite often the tension," Wallner explained. "You have two silos of knowledge, with one group having the pure data scientists and the other having domain experts with knowledge of the equipment they build, not talking to each other." The tools to create the models needed for operation must facilitate collaboration between those two camps, he said.
Even when there's good collaboration, there's another problem for many predictive models: while there's plenty of data available, most of it is about normal operations rather than failures (which is how it should be—a smoothly running plant shouldn't be suffering a lot of failures). "Often there's not enough failure data to train algorithms," Wallner said. "How do you train algorithms that need lots of data with a lack of failure data?"
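One common workaround is to flip the problem around: rather than learning what failure looks like, train a model only on healthy operation and flag anything that deviates from it. Here is a minimal sketch of that one-class approach using scikit-learn's IsolationForest; the sensor features and numbers are invented, and this is a generic illustration of the technique, not how any particular vendor's product works.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Invented training data: vibration RMS and casing temperature sampled
# from healthy operation only. No failure labels are required; the
# model simply learns the envelope of "normal."
healthy = np.column_stack([
    rng.normal(0.5, 0.05, 1000),  # vibration RMS (mm/s)
    rng.normal(60.0, 2.0, 1000),  # casing temperature (degrees C)
])

model = IsolationForest(contamination=0.01, random_state=0).fit(healthy)

# New readings: the last one drifts well outside the healthy envelope.
new_readings = np.array([
    [0.52, 61.0],
    [0.49, 59.2],
    [0.95, 71.5],
])
print(model.predict(new_readings))  # 1 = looks normal, -1 = anomaly to inspect
```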
[Photo: A time-sensitive networking switch used to carry industrial control traffic.]
In some cases, manufacturers perform "run to fail" tests to collect data about how their equipment acts as components start to push outside of their normal operating parameters. But "run to fail" tests involve creating failures, and purposefully breaking costly and complicated manufacturing hardware is uncommon. "You don't want to run a scenario where you break your wind turbine," Wallner explained. "It's too expensive and dangerous." In these cases, the manufacturers' domain experts may have already built simulation models to test such conditions computationally—and those models can be incorporated into predictive maintenance systems with a bit of adaptation.
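When such a simulation model exists, it can be run to failure cheaply and repeatedly to generate the labeled examples a real fleet can't safely provide. Below is a hypothetical sketch of that idea: a toy degradation model produces synthetic run-to-fail vibration histories that could augment scarce real failure data. The functional form, parameters, and failure threshold are all invented; a manufacturer's actual simulation models are far more detailed.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy stand-in for a manufacturer's simulation model: bearing vibration
# grows roughly exponentially once a seeded fault begins to propagate.
# Running it many times with varied parameters yields the labeled
# failure trajectories that healthy real-world fleets rarely produce.
def simulate_run_to_failure(hours=2000, step=10):
    fault_onset = rng.uniform(500, 1500)      # hour when degradation begins
    growth = rng.uniform(0.002, 0.006)        # degradation rate (invented)
    t = np.arange(0, hours, step)
    vib = 0.5 + rng.normal(0, 0.03, t.size)   # healthy baseline (mm/s)
    failing = t > fault_onset
    vib[failing] += 0.1 * np.exp(growth * (t[failing] - fault_onset))
    failed = vib > 3.0                        # invented failure threshold
    return t, vib, failed

# Generate a small synthetic "fleet" to augment scarce real failure data.
fleet = [simulate_run_to_failure() for _ in range(100)]
t, vib, failed = fleet[0]
if failed.any():
    print(f"first simulated exceedance at ~{t[failed][0]} hours")
```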
The last gap to be bridged is how and where to process device data. In some cases, for safety or speed of response, data needs to be analyzed very close to the industrial equipment itself, even having algorithms run on the embedded processor or programmable logic controller (PLC) that drives the machine. Other analysis that is real-time but not directly safety-oriented might run on hardware nearby. But longer-term predictive analysis usually requires substantial computing power and access to lots of supporting data, which typically means complex applications running in a company's datacenter or an industrial cloud service. Both GE's and IBM's predictive systems run in the cloud, while MathWorks' algorithms can run locally or in various clouds (including GE's Predix cloud).
In some cases, companies may run combinations of all the above methods, or start off with "edge" systems handling predictions until they're more comfortable with cloud solutions. "It makes sense to have some of the algorithm as close as possible to the equipment, to do things like data filtering," explained Wallner, "but have the predictive algorithm in the cloud." The aim is the best of both worlds.
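A minimal sketch of that split might look like the following: an edge process reduces a high-rate raw sensor stream to compact per-window summaries, and only those summaries travel upstream to wherever the predictive model runs. The window size, feature set, and upload-queue stand-in are illustrative assumptions, not any vendor's actual edge API.

```python
import math
import random
import statistics

# Edge-side filter: condense a high-rate raw sensor stream into
# per-window summary features so far less data crosses the uplink.
WINDOW = 100  # raw samples per summary (e.g., one second at 100 Hz)

upload_queue = []  # stand-in for an MQTT/HTTPS publish to a cloud endpoint
buffer = []

def summarize(window):
    return {
        "mean": statistics.fmean(window),
        "peak": max(window),
        "stdev": statistics.pstdev(window),
    }

def on_sample(value):
    buffer.append(value)
    if len(buffer) >= WINDOW:
        upload_queue.append(summarize(buffer))  # ~100x less data upstream
        buffer.clear()

# Feed a fake vibration signal through the edge filter.
for i in range(1000):
    on_sample(0.5 + 0.1 * math.sin(i / 8) + random.gauss(0, 0.02))

print(f"{len(upload_queue)} summaries queued instead of 1000 raw samples")
```

Filtering at the edge also leaves a local decision loop available if the uplink drops, which matters for the safety-oriented checks described above.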
THE DANGERS OF DIGITIZING
While there is vast potential in the combination of information technology and operational technology that makes Industry 4.0 concepts like predictive maintenance possible, realizing that potential doesn't come without risk, especially if proper security measures aren't taken. Credible cyber-threats to industrial systems have so far been rare, but new ones are emerging, including the "Triton" malware attacks that aimed to disable safety systems at multiple industrial sites and the "BlackEnergy" cyber-attacks in Ukraine that briefly took portions of the power grid down.
Predictive modeling systems pose a lesser risk than systems with direct control over equipment, but there's still reason for concern about access to raw analytics data from the factory floor. Such data won't immediately yield the blueprints for proprietary parts, but subjected to "big data" analytics techniques it could give an adversary (or a competitor) a wealth of information about the patterns of manufacturing operations, plant efficiency, and manufacturing process details that could be used for other purposes, including outright industrial espionage. Officials from Germany's Federal Ministry of Education and Research noted in the ministry's Industry 4.0 report that "The most prevalent concern, especially among [subject matter experts], is that Industry 4.0's data is not secure, business secrets are lost, and companies' carefully guarded knowledge is revealed to the competition."
There are much greater threats, however, that could come from mixing operational technology with traditional IT, especially as autonomous systems are connected to existing industrial networks. Ransomware and other destructive malware can bring down control networks, as happened in Baltimore when a ransomware attack destroyed data from automated red light and speed camera sensors and shut down the CityWatch camera network. There's also the threat that the controls themselves could eventually be manipulated, subverted, or sabotaged.
Much of what has protected operational technology from attacks thus far has been "security through obscurity." Industrial control protocols vary widely across equipment manufacturers, but blending the Internet of Things and other information technology with operational tech will require a great deal more attention to security—especially in applications where there's a threat to human lives. A malicious attack on safety systems could have "cyberphysical" ramifications beyond lost productivity or broken equipment in chemical, energy, and other industries where a failure could put the public at risk.
GE and others have tried to protect networks by isolating control systems from sensor data networks and by placing firewalls in front of older systems to block unwanted network traffic. Industrial cloud computing is generally partitioned from the Internet by virtual private networks and other measures. But before industries hand over more jobs to autonomous software and hardware robots, a full assessment of the security for data and commands flowing to and from them is probably a good idea.
Written by Sean Gallagher, IT and National Security Editor at Ars Technica.