November 7, 2019 by Jonathan Gross
According to a 2019 World Economic Forum and McKinsey study, 70 per cent of manufacturers want to implement Industry 4.0 projects to diversify business models, increase market penetration and improve efficiencies.
However, according to that same study, only 15 per cent of manufacturers have a responsive strategy in place.
So, why the disconnect?
Because most don’t know where to start. The idea of using artificially intelligent and integrated systems to control your plant and warehouse sounds daunting and unapproachable, even scary. Who’s going to architect the solutions? Implement and maintain them? Secure them?
If you’ve read one of my previous articles on the subject, you’ll know that I advocate a deliberate approach that breaks down a seemingly impossible journey into manageable stages. First, build a strong digital twin base that’s adaptable to an unknown future state. Second, implement a proof-of-concept for an easy win. Third, take a bigger leap – one that’s capable of delivering significant value. Fourth, continuously enhance, optimize and improve.
Here’s a real-world example.
Achieving ROI with Industry 4.0
My firm works with a manufacturer that extrudes resins used in a variety of applications – from highly regulated aerospace products to sporting equipment.
At its primary production facility, the company was wasting $350,000 annually because of poor product quality. The losses included direct costs of excessive scrap and indirect costs of suboptimal customer service. The company was routinely re-running work orders, causing it to juggle production schedules and miss its promised delivery dates. Customers weren’t happy.
So, we helped our client architect an environment that would allow it to detect and react to quality issues much more quickly. We designed a program intended to reduce unplanned scrap by 80 per cent and improve its perfect order rate KPI (a weighted measure that accounts for quality, lead times and fill rates) to 90 per cent. And, since no company is perfect, we designed a contingency process that gave customer service personnel systematized warnings, prompting them to proactively notify customers as soon as unanticipated issues are detected.
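To make the KPI concrete, here is a minimal sketch of a weighted perfect order rate calculation. The component weights and input values are illustrative assumptions; the article does not disclose the client’s actual weighting formula.

```python
def perfect_order_rate(quality_rate, on_time_rate, fill_rate,
                       weights=(0.4, 0.3, 0.3)):
    """Blend quality, lead-time and fill-rate performance (each 0-1)
    into a single weighted score. Weights are hypothetical."""
    components = (quality_rate, on_time_rate, fill_rate)
    return sum(w * c for w, c in zip(weights, components))

# Example: strong quality, weaker on-time delivery.
score = perfect_order_rate(0.95, 0.88, 0.92)
print(f"{score:.1%}")  # -> 92.0%
```

A weighted blend like this lets management tune which dimension of customer service the organization is pushed to improve first.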
An integrated edge, fog and cloud computing architecture
The technology environment involved implementing and interfacing various information technologies (IT) and operational technologies (OT) that include enterprise resource planning (ERP), a warehouse management system (WMS), manufacturing execution system (MES), laboratory information management system (LIMS), programmable logic controllers (PLCs), production equipment and artificial intelligence (AI).
The solution was designed using a three-tiered architecture that includes edge computing, fog computing and cloud computing. What differentiates these tiers is the proximity of computing to the data source and whether that computing is centralized.
Edge computing pushes computing applications, data and services to the logical extremes of a network and away from centralized nodes. A company implements edge computing when it needs extremely low latency, when transferring data to the cloud is costly, when connectivity is unreliable, or when compliance demands local processing.
Fog computing is a superset of edge computing that bridges the continuum between cloud and edge computing. There’s still a need for data-dense, low-latency processing, but there’s also a need to compute across multiple edge solutions.
Cloud computing is at the end of the spectrum opposite edge, where broadly sourced data is centrally processed. Companies move computing to the cloud when they want centralized computing horsepower that might otherwise be too expensive or complex to set up and manage themselves.
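The placement criteria described above can be sketched as a simple decision rule. The thresholds, field names and ordering here are illustrative assumptions, not part of any particular product or standard.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    max_latency_ms: float       # how quickly a response is needed
    data_rate_mbps: float       # volume the workload pushes onto the network
    reliable_connectivity: bool
    must_process_locally: bool  # e.g., a compliance requirement

def choose_tier(w: Workload) -> str:
    """Pick a computing tier using the criteria from the article:
    latency, transfer cost/volume, connectivity and compliance."""
    if w.must_process_locally or w.max_latency_ms < 10:
        return "edge"   # keep processing at the data source
    if w.data_rate_mbps > 100 or not w.reliable_connectivity:
        return "fog"    # aggregate across edges, close to the plant
    return "cloud"      # centralized horsepower for everything else

print(choose_tier(Workload(5, 500, True, False)))    # latency-critical
print(choose_tier(Workload(200, 10, True, False)))   # batch analytics
```

In practice the decision is rarely this clean, but the sketch shows why a single application portfolio naturally spreads across all three tiers.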
Implementing the three-tiered architecture
The edge-fog-cloud solution proved to be a perfect fit for our resin manufacturing client.
At the edge, the company leveraged its existing investments in modern manufacturing equipment control systems.
The architecture included a new fog computing tier that allowed our client to realize big-time Industry 4.0 benefits by automating, interfacing and systematizing what were previously manual, inefficient and costly processes.
Previously, shortly after the start of every work order, an operator would run a product sample to the lab to determine whether the extruders were properly set up to produce product of acceptable tensile strength, melt point, colour and a host of other quality attributes. If the results were outside of customer specifications, machine operators would manually adjust various process controls to get the product within acceptable tolerances.
We designed a system that closed the loop among the LIMS, MES and the equipment control systems using the Internet of Things (IoT) and application programming interfaces (APIs). When lab quality results are posted, the results are capable of automatically triggering changes to equipment process controls to yield product at appropriate quality standards – updating parameters such as flow rates, temperatures and screw RPMs.
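A hedged sketch of that closed loop: a posted lab result is compared against the customer specification, and any out-of-spec attribute triggers a proportional setpoint correction on the extruder. The attribute names, specification windows and gain values below are illustrative assumptions; a real deployment would use the plant’s own quality model and controller interfaces.

```python
# Hypothetical customer specification windows (low, high) per attribute.
SPEC = {"tensile_mpa": (45.0, 55.0), "melt_point_c": (160.0, 170.0)}

# Hypothetical mapping from quality attribute to the process parameter
# that most influences it, with a simple proportional gain.
CORRECTIONS = {
    "tensile_mpa":  ("screw_rpm", 0.5),
    "melt_point_c": ("barrel_temp_c", -1.2),
}

def setpoint_changes(lab_result: dict) -> dict:
    """Return parameter deltas for every attribute outside its spec window."""
    changes = {}
    for attr, value in lab_result.items():
        lo, hi = SPEC[attr]
        target = (lo + hi) / 2
        if not lo <= value <= hi:
            param, gain = CORRECTIONS[attr]
            changes[param] = round(gain * (target - value), 2)
    return changes

# Tensile strength is below spec; melt point is fine.
print(setpoint_changes({"tensile_mpa": 42.0, "melt_point_c": 165.0}))
# -> {'screw_rpm': 4.0}
```

The point of the loop is not the arithmetic, which is trivial, but that no human has to carry a sample result back to the line and guess at the adjustment.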
The purpose of this fog computing solution is to use these powerful systems for what they’re good at: quick, powerful data analysis and efficient process execution. The automation is far more efficient and scientific than the previous “thumb-in-the-wind” process of human operators guessing how much to adjust the process controls. The cloud tier, meanwhile, hosted ERP, machine learning and business intelligence (BI).
ERP would manage all standard back and front office functions, master scheduling and MRP. Bi-directional interfaces were designed to release warehouse and production orders from ERP to the warehouse, laboratory and manufacturing systems. Those systems would close the loop by reporting actuals back to ERP for inventory, costing, financial accounting, customer service, supply chain and other purposes.
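The bi-directional interface pattern can be illustrated with two message-building functions: one that releases an ERP production order to the shop floor, and one that closes the loop by reporting actuals back. The field names and message shapes are assumptions for illustration only; real ERP and MES integrations each define their own schemas.

```python
def release_order(erp_order: dict) -> dict:
    """Translate an ERP production order into a shop-floor work order."""
    return {"work_order": erp_order["order_no"],
            "item": erp_order["item"],
            "qty_planned": erp_order["qty"]}

def report_actuals(work_order: dict, qty_good: int, qty_scrap: int) -> dict:
    """Build the closing message the shop floor sends back to ERP
    for inventory, costing and customer service purposes."""
    return {"order_no": work_order["work_order"],
            "qty_good": qty_good,
            "qty_scrap": qty_scrap,
            "yield_pct": round(100 * qty_good / (qty_good + qty_scrap), 1)}

wo = release_order({"order_no": "PO-1001", "item": "RESIN-X", "qty": 500})
print(report_actuals(wo, qty_good=480, qty_scrap=20))
```

Keeping both directions systematized is what lets ERP remain the single source of truth for inventory and costing while execution happens elsewhere.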
Business intelligence was structured to sit atop a data lake into which data from multiple sources would flow.
This three-tiered solution isn’t our client’s end game. Rather, it’s the foundation for its Industry 4.0 program. The company has positioned itself to take advantage of machine learning innovations that its technology vendors are routinely releasing. These innovations will ultimately drive further improvements to product quality, process efficiency and equipment performance.
Industry 4.0 is more approachable than you think
If you’re among the 85 per cent of manufacturers that don’t yet have an Industry 4.0 strategy in place, don’t let the above example scare you. The process isn’t as complex as it may seem. The various technology systems are all commercially available – from control systems and ERP to LIMS and AI.
The challenge, if anywhere, lies in properly architecting, acquiring, planning and implementing the solutions. Even here, though, the barriers aren’t insurmountable. A growing number of companies are pursuing Industry 4.0 programs, which means you’ll be able to find people with the right expertise to fill whatever gaps you have.
Jonathan Gross is the managing director at Pemeco Consulting. He helps his clients architect and implement technology environments that integrate ERP with the edge.
This article originally appeared in the November/December 2019 issue of Manufacturing AUTOMATION.