Operationalizing Analytical Insights
The key to dealing with disruption is agility. The key to agility is having a strategic approach to harnessing data flows in real time using today’s digital tools.
Agility is essential for manufacturers to deliver solutions that suit consumers and brands alike. Indeed, Industry 4.0 describes an agile manufacturing ecosystem that can work with a lot size of one, with a digital thread that defines the entire supply chain and manufacturing process.
The numerous disruptions of the last five years (trade war, pandemic, component shortages, geopolitics, etc.) have proven just how important it is that the industry achieve this agility. And while many companies, particularly those in Electronic Manufacturing Services (EMS), have talked a good game, few have built agility into their business or operational strategy.
In fact, the opposite could be true. Over decades, the EMS industry sought to drive economies of scale and an increasing dependence on low-cost labor, all while moving inventory out of the supply chain. This has created rigid manufacturing footprints that thrive on little change and have little ability to adapt at speed. These just-in-time supply chains have proved brittle, leaving no allowance for the just-in-case scenarios we’ve seen recently.
We’ve also had a decade of talking the talk around Industry 4.0, while very few are walking the walk! In short, few have seen a measurable digital dividend as yet. We need to use digital tools to create a more agile work environment that is less dependent on labor and hence more efficient, adaptable, and reliable.
What has that got to do with operational analytics? A lot, it turns out. The role of operational analytics is to gain intelligence that drives insights, leading to better and faster decisions. And that eventually drives better outcomes, which means better quality and reliability, greater efficiency and profitability, and a more robust and agile operational model. The process is simple: data, insight, value.
Data Everywhere
One thing we have done well in the first decade of the fourth industrial revolution is to figure out how to connect machines and harvest the massive amounts of data available — indeed, some might say we now have too much data.
If our process starts with data, it ends with value:
- Data (contextualized) produces intelligence
- Intelligence drives insights
- Insights drive value through better decision-making
An example might be the closed loop between an SPI (Solder Paste Inspection) machine and a solder paste printer on a typical SMT line. In the past, many errors occurred because of the print quality of the solder paste printer. These would result in poor quality after reflow, low first-pass yield, and inefficiencies. Introducing SPI to the process allowed us to stop the line if there was an issue, but it only acted as a stop signal.
By creating a closed loop, the SPI can use images of the board with the printed solder paste (data) to determine whether the right amount of solder paste is present in the right place (insight). It then uses that insight to decide how much to adjust the printer’s speed, alignment, and pressure to get the best result, reducing waste and increasing yield and reliability to drive incremental value.
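To make the loop concrete, here is a minimal sketch of that data-to-decision step in Python. All class and field names (SpiMeasurement, PrinterSettings, the gain value) are hypothetical; a real line would use the machine vendor’s interfaces and tuned, vendor-specific control models rather than a plain proportional correction.

```python
# A minimal sketch of the SPI-to-printer closed loop described above.
# Names and thresholds are illustrative, not a vendor API.
from dataclasses import dataclass

@dataclass
class SpiMeasurement:
    volume_pct: float   # measured paste volume, % of nominal
    offset_x_um: float  # paste-to-pad misalignment in X, microns
    offset_y_um: float  # paste-to-pad misalignment in Y, microns

@dataclass
class PrinterSettings:
    pressure_kg: float
    align_x_um: float
    align_y_um: float

def adjust_printer(m: SpiMeasurement, s: PrinterSettings,
                   gain: float = 0.5) -> PrinterSettings:
    """Turn an SPI insight into a printer decision.

    A simple proportional correction: nudge squeegee pressure toward
    nominal paste volume and shift alignment against the measured offset.
    """
    volume_error = (100.0 - m.volume_pct) / 100.0
    return PrinterSettings(
        pressure_kg=s.pressure_kg * (1.0 + gain * volume_error),
        align_x_um=s.align_x_um - gain * m.offset_x_um,
        align_y_um=s.align_y_um - gain * m.offset_y_um,
    )

# Example: SPI reports 92% paste volume and a +15 um X offset.
current = PrinterSettings(pressure_kg=6.0, align_x_um=0.0, align_y_um=0.0)
print(adjust_printer(SpiMeasurement(92.0, 15.0, -4.0), current))
```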
In this case, we know the outcome or value we are trying to achieve and can work our way back to understand the dataset we need to gain insight and make a good decision promptly. We could collect numerous parameters from the SMT line; some deliver value, and some may not. Hence, the first phase of any operational analytics strategy is to understand what data we need and how to use it to deliver value.
The quality of the data is also critical in this example. Not all SPI systems are created equal, and if the image is not accurate or accurately processed, errors can slip through, and false calls can slow the entire process.
Analyzing Data With AI Just Got Easier
Like Industry 4.0, we’ve been talking up artificial intelligence (AI) for some time. Right now, everyone’s being dazzled, and occasionally disappointed, by the skills of OpenAI’s ChatGPT and other chatbots from Google and Microsoft. Undoubtedly, AI has a massive role to play in our future, whether doing our children’s homework or figuring out how to optimize a factory — or even an entire manufacturing ecosystem.
What these AI systems show us, often vividly, is the importance of the learning derived from datasets. If you use unreliable data, you’ll get unpredictable insights, driving flawed decisions.
Let’s return to the example of the inspection system used to adjust the line. AI could be used to manage the enormous amount of data the system generates, but we need to be especially careful about the datasets we use to train that AI. Hence, we need to ensure that those developing these systems have deep domain knowledge of the manufacturing process, enough to understand what is good data and what is not. We must also ensure we use the best possible inspection solution, with the highest-definition and most accurate image.
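That domain knowledge has to be encoded somewhere. Below is a minimal sketch, in Python, of the kind of training-data hygiene the paragraph argues for; the record fields and thresholds are hypothetical, and the point is only that domain rules, not raw volume, decide what the model learns from.

```python
# Illustrative training-set filter; field names are assumptions.
from typing import Iterable

def clean_training_set(records: Iterable[dict]) -> list[dict]:
    """Keep only inspection records fit for training.

    Domain rules encoded here (illustrative thresholds):
    - drop images a human reviewer flagged as a false call,
    - drop records whose label was never confirmed downstream,
    - drop low-resolution captures that can hide true defects.
    """
    return [
        r for r in records
        if not r["false_call"]
        and r["label_confirmed_at_test"]
        and r["image_resolution_um"] <= 15  # fine enough to see defects
    ]

raw = [
    {"false_call": False, "label_confirmed_at_test": True,  "image_resolution_um": 10},
    {"false_call": True,  "label_confirmed_at_test": True,  "image_resolution_um": 10},
    {"false_call": False, "label_confirmed_at_test": False, "image_resolution_um": 10},
]
print(len(clean_training_set(raw)))  # 1: only the first record survives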
There is no doubt that AI will be a game-changer for the use of data, particularly on the factory floor. Like many, we are working hard in this area. The factory floor can give us hundreds of signals at any moment. AI will help us process, prioritize, and manage those signals to generate better insight, outcomes, and value. We are on a fast ramp in the performance of AI and its application, but we will need to be careful about how we train our AI systems and how we monitor and manage their performance.
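As a rough illustration of that triage, the sketch below ranks live signals by how far the latest reading sits from its own history. The signal names are invented, and a production system would use trained models rather than a plain z-score, but the prioritize-before-you-act pattern is the same.

```python
# Illustrative signal triage; signal names and data are assumptions.
import statistics

def prioritize(signals: dict[str, list[float]], top_n: int = 3) -> list[str]:
    """Rank signals by how anomalous their latest reading is."""
    scores = {}
    for name, history in signals.items():
        mean = statistics.fmean(history[:-1])
        stdev = statistics.stdev(history[:-1]) or 1e-9  # guard against zero spread
        scores[name] = abs(history[-1] - mean) / stdev
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

line_signals = {
    "oven_zone3_temp_c": [248, 249, 248, 247, 262],   # sudden spike
    "placement_cph": [31000, 30900, 31100, 31050, 30980],
    "spi_volume_pct": [99, 100, 98, 99, 91],          # drifting low
}
print(prioritize(line_signals))  # the oven spike surfaces first
```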
Data Management Best Practices
It’s worth thinking about the best practices for the management of data. With the volume of data generated in a factory, it is easy to see how data volumes can quickly become unwieldy and expensive in terms of storage. Here are a few best practices and things to consider in terms of data management:
- Backup and recovery — it is essential to have a regular backup plan and backups in multiple locations to ensure that if and when a data breach or failure occurs, recovery can happen quickly and seamlessly.
- Data locations — consider whether you plan to store data on premises, in the cloud, or perhaps both. This decision carries considerations around data security and access. Multiple locations add protection against loss but also add cost.
- Security and access — encryption is vital, as much of the stored data may be confidential or include your or your customers’ IP. The same is true of access: ensure it is restricted to those who need the data and have the appropriate security clearance. As a rule of thumb, any data being transmitted should be encrypted (a minimal sketch follows this list). Managing access is part of any data management security system.
- Data management — using the correct tools and systems can help optimize the data storage needed while creating more efficient workflows.
- Compliance and regulation — over the last decade, various laws and regulations have been implemented to protect privacy and ensure data is properly collected, stored, and shared. Ensure you are current and compliant with the rules and legislation of the regions where you operate and store or transfer data.
- Stay up to date — regular audits of your data process should allow you to stay on top of what is happening regarding technology and regulation, especially that which is specific to your data. It’s essential to know what data is being accessed and used and what is accumulating without providing insight.
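To make the encryption rule of thumb from the security bullet concrete, here is a minimal Python sketch using the third-party cryptography package’s Fernet interface. The payload is invented, and key handling is deliberately simplified; in practice keys live in a key-management service, never alongside the data.

```python
# Minimal sketch: encrypt anything transmitted or stored.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in production: fetch from a KMS
cipher = Fernet(key)

report = b'{"line": "SMT-2", "first_pass_yield": 0.987}'
token = cipher.encrypt(report)     # opaque token, safe to transmit

assert cipher.decrypt(token) == report  # round-trips with the same key
```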
Whose Data Is It Anyway?
Now that we can use big data, we must consider in whose hands the power should rest and who needs which data. In a typical brand/EMS relationship, the data required by the brand will differ from what the contract manufacturer needs. Typically, the brand needs product data affecting traceability, reliability, recalls, and supply chain transparency, while data concerning manufacturing performance is used mostly by the manufacturer. But sometimes these lines are blurred.
Hence, sharing data in an open and safe environment is important. Trusted data must be available to drive custom dashboards, reports, and notifications for every stakeholder. In some cases, data access must be gated so only those needing sensitive information can view it. Lastly, the ability to drill down into each data field can be extremely valuable in helping you to understand an issue’s root cause and find the right solution.
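A minimal sketch of that gating, under assumed roles and field names, might look like the Python below. Real deployments enforce this in the data platform rather than in application code, but the shape of the idea is the same: one trusted record, different views per stakeholder.

```python
# Illustrative role-gated view of a shared production record.
# Roles, fields, and values are assumptions, not a real schema.
RECORD = {
    "serial": "SN-004217",
    "test_result": "pass",          # brand: traceability and recalls
    "component_lots": ["LOT-88a"],  # brand: supply chain transparency
    "line_oee": 0.84,               # EMS: manufacturing performance
    "operator_id": "op-113",        # EMS only: internal detail
}

VIEWS = {
    "brand": {"serial", "test_result", "component_lots"},
    "ems":   {"serial", "test_result", "line_oee", "operator_id"},
}

def gated_view(record: dict, role: str) -> dict:
    """Return only the fields a stakeholder's role entitles them to see."""
    return {k: v for k, v in record.items() if k in VIEWS[role]}

print(gated_view(RECORD, "brand"))  # no operator or OEE data exposed
```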
Five years ago, everyone was talking about the glass factory concept, where customers and the operational team could see exactly what was happening on each line and for each product. Now, thanks to recent component shortages and supply chain disruptions, people are more excited about the idea of a glass pipeline, which provides real-time transparency into each part of the supply chain.
Focus on the End Game
The bottom line is that data needs to serve the business’s strategy rather than the other way around! If we can design and plan the outcomes we need, such as a more efficient and sustainable supply chain and manufacturing ecosystem, we can map them to the data we need to collect to drive those outcomes.
And if we take a more open-minded approach in that design, we can create analytics that are as adaptable and agile as our businesses need to be.
Written by Adam Montoya, VP of Industrial Solutions at Bright Machines, for the Manufacturing Leadership Council.