Communications & Networks
How much big data do you need?
Seven questions for manufacturers to diagnose their need for data-driven analytics
May 24, 2019 by Jeff McBee
I sit down for discussions all the time with managers and executives who question how Industry 4.0 is relevant to their operations.
They are busy, budget-conscious and understandably reluctant to take on the risk of adopting new technologies.
The good news for them is that the first step toward Industry 4.0 doesn’t have to be a giant leap. Industry 4.0, at its core, is about data. Not just collecting data – that’s old news – but organizing and analyzing it in a timely and integrated manner to improve efficiency, quality and time to market.
Of all the datasets available to a manufacturer, one that offers incredible value but is still often overlooked and underused is final assembly part data. This encompasses all the data from each sub-assembly test and process as the part/assembly moves along the assembly line.
Making more effective use of this data with modern collection and analytics tools can have a big impact on a manufacturer’s operations. And it doesn’t require a high-risk or high-cost approach.
Here are seven questions to consider that can help your team decide if it is time to take the first step.
1. Is your production line serialized?
If the answer is yes, half the battle is won already. If a line is producing parts that are serialized, it’s easy to create a “birth history record” for each part, indexed by its serial number. All data from every process and test that touches that part can then be collected into this single record for easy retrieval and analysis. And if your line isn’t serialized, don’t worry: modern data collection and analytics can also be applied to batch manufacturing.
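As a minimal sketch of the idea (the class and station names here are illustrative, not from any particular vendor’s system), a birth history record can be as simple as a map from serial number to the ordered list of process and test results collected along the line:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class StationResult:
    station: str          # e.g. "press_fit", "leak_test"
    passed: bool
    measurements: dict    # raw readings captured at the station
    timestamp: datetime

@dataclass
class BirthHistory:
    serial_number: str
    results: list = field(default_factory=list)

    def record(self, station, passed, measurements):
        # Append one process/test result to this part's history.
        self.results.append(StationResult(
            station, passed, measurements,
            datetime.now(timezone.utc)))

# One record per serialized part, indexed by serial number.
line_history = {}

def log_result(serial, station, passed, measurements):
    history = line_history.setdefault(serial, BirthHistory(serial))
    history.record(station, passed, measurements)

log_result("SN-0001", "press_fit", True, {"force_N": 412.0})
log_result("SN-0001", "leak_test", False, {"leak_rate_sccm": 3.2})

# Retrieval for analysis is then a single lookup by serial number.
part = line_history["SN-0001"]
failed = [r.station for r in part.results if not r.passed]
```

Every downstream use discussed in this article, from traceability reporting to recall scoping, is ultimately a query over records like these.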
2. Are customers pushing for improved traceability?
This question is most relevant to manufacturers that operate as part of the supply chain of a larger OEM. Clearly, the time for change is at hand if customers are beginning to ask for quality and compliance assurances that the status quo simply cannot deliver. Accountability and transparency are both hallmarks of Industry 4.0.
Was a product built to spec – and if so, where is the data trail to show it? Were there any quality issues that might raise a red flag – again, where is the data trail to prove there wasn’t? If a defect has been identified, can your team triage the situation and trace the parts and final products that are suspect?
In-depth analysis of parts data is key to addressing these questions. An OEM’s reputation is defined by the market’s perceptions about the quality, safety and reliability of its products. No supplier wants to be considered the weak link in that value chain because it lacks the data capability to deliver an acceptable level of quality assurance.
3. Are quality issues eroding profitability?
Maybe it’s high scrap and rework rates due to failures at test stations along the line. Or an issue that doesn’t show up until an end-of-line test where the only recourse is a costly teardown and a production halt. Maybe that trailer full of product about to pull out of the loading dock should be held back. Then there are the warranty issues that raise the need for a recall.
Wherever the quality issue arises, it is going to cost you time and money. Your reputation may take a bruising, too. Suppliers can no longer hide behind their OEMs in a recall situation. When something like a car gets recalled due to a faulty airbag or steering system, it’s the supplier of those components that ends up in the media spotlight having to account for itself.
Having the data analytics capability to catch quality issues as soon as possible, as close to their point of origin as possible, is crucial. If current methods of quality assurance and inspection are failing to catch defects before they impact the bottom line, it is time for a change.
4. Are there process and test stations that are persistently problematic?
Every production line has trouble spots that give operators and quality engineers grief. It could be a finicky leak test, a poor-performing press-fit operation, difficulty connecting the dots to trace a leak failure due to a flaw with an upstream process, or a need to reduce cycle times to boost output. Additional challenges come from model variance, when a production line has to produce different models of the same product and maintain the same standard of quality across all of them.
Modern data collection and analytics are key to gaining the timely insight needed to manage all these scenarios, and to improve and maintain quality, efficiency and yield.
5. Are data silos making it difficult for you to use your data in this way?
Few manufacturers that I encounter would say “no” to this question. Their Industry 4.0 challenge isn’t collecting new sources of data – it’s to do a better job of integrating and analyzing the data they have.
Their ability to do so remains handcuffed by process and test station equipment on the line that is of different vintages from a variety of suppliers. Each will claim some measure of data analysis capability, but the data isn’t easily accessed, and the functionality just isn’t there because these machines don’t communicate with each other. Data is trapped in silos, perhaps tracked by part serial number in one machine, and by date or time stamp in another. These scenarios make timely and efficient data analysis all but impossible.
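The join problem these silos create can be sketched as follows (the data shapes and station names are hypothetical): one station logs results against a serial number, an older machine logs only a timestamp with each cycle, and the two can only be reconciled by matching each time-stamped record to whichever part was in the station during that window:

```python
from datetime import datetime

# Silo A: a test station that logs by serial number.
silo_a = [
    {"serial": "SN-0001", "leak_rate": 0.4},
    {"serial": "SN-0002", "leak_rate": 3.1},
]

# Silo B: an older press that logs only a timestamp per cycle.
silo_b = [
    {"timestamp": datetime(2019, 5, 1, 9, 0, 12), "press_force": 410.2},
    {"timestamp": datetime(2019, 5, 1, 9, 1, 47), "press_force": 398.7},
]

# Line-control schedule: which serial was in the press, and when.
# Without something like this, silo B's data can't be tied to a part.
schedule = [
    (datetime(2019, 5, 1, 9, 0, 0), datetime(2019, 5, 1, 9, 1, 0), "SN-0001"),
    (datetime(2019, 5, 1, 9, 1, 0), datetime(2019, 5, 1, 9, 2, 0), "SN-0002"),
]

def serial_at(ts):
    # Map a timestamp to the serial occupying the station at that moment.
    for start, end, serial in schedule:
        if start <= ts < end:
            return serial
    return None

# Merge both silos into one record per serial number.
merged = {row["serial"]: dict(row) for row in silo_a}
for row in silo_b:
    serial = serial_at(row["timestamp"])
    if serial in merged:
        merged[serial]["press_force"] = row["press_force"]
```

Doing this by hand across dozens of stations and thousands of parts is exactly the kind of spreadsheet drudgery an integrated data platform is meant to eliminate.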
6. What would you be able to achieve if you could make better use of your data?
The impact is far-reaching:
- Optimize your test stations to achieve a reliable pass/fail in the shortest cycle time possible.
- Continuously monitor and improve the performance of your production stations to reduce unexpected downtime and more readily adapt them for product variances.
- Boost first-time yield by reducing scrap and rework rates.
- Provide proof of compliance to customers.
- Boost initial quality to reduce the frequency of warranty claims.
- Maintain a data archive for root cause analysis and traceability when warranty or quality issues do arise.
- Narrow the scope of a recall to only those parts/units that the data flags as suspect. Recall just those serial numbers.
Then use this insight to refine and tighten the limits of the relevant process and test stations, to prevent the same flaw from happening again.
Further to this, work with that historical data at any time in a simulated test bed for continuous improvement, to pre-empt future quality issues.
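As one illustrative example of the recall point above (the field and lot names are invented for this sketch), narrowing a recall becomes a simple query over the birth history records: find exactly the serial numbers whose history includes the suspect component lot, and recall only those units:

```python
# Each part's birth history notes the component lots consumed on the line.
birth_histories = {
    "SN-0001": {"airbag_lot": "LOT-A17", "steering_lot": "LOT-S03"},
    "SN-0002": {"airbag_lot": "LOT-A18", "steering_lot": "LOT-S03"},
    "SN-0003": {"airbag_lot": "LOT-A17", "steering_lot": "LOT-S04"},
}

def recall_scope(histories, component, suspect_lot):
    # Return only the serial numbers built with the suspect lot.
    return sorted(sn for sn, lots in histories.items()
                  if lots.get(component) == suspect_lot)

# A defect traced to airbag lot LOT-A17 recalls two units, not all three.
suspect_serials = recall_scope(birth_histories, "airbag_lot", "LOT-A17")
```

Without lot-level traceability, the safe answer is to recall everything; with it, the recall shrinks to the serial numbers the data actually flags.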
7. Do you need to provide better reporting, faster?
For most manufacturers that I encounter, the status quo means having to take days, even weeks, to pull reports to trace the root cause of a quality issue. Data silos are one part of the problem. Having to wade through piles of spreadsheets to manually compare and crunch numbers is the other. This is a delay you can ill afford if a quality issue has halted production or made you doubt whether those pallets on the loading dock are okay to ship.
Given how often we hear horror stories about failed digital transformation initiatives by large OEMs or Fortune 500 companies, it’s only to be expected that small to mid-sized manufacturers don’t want to rock the boat. But migrating the production line to Industry 4.0 doesn’t demand risky wholesale change or a costly rip and replace.
Instead, the required technology investment can be additive, through vendor-agnostic data platforms that take data from various sources. Best of all, they are designed to start small and scale.
Jeff McBee is a regional sales manager for Cincinnati Test Systems.
This article originally appeared in the May 2019 issue of Manufacturing AUTOMATION.