Waveform versus scalar data
It’s simply no contest in the quality race
Mar. 21, 2017 - What is the difference between a waveform — or digital process signature — and scalar data? Take a hockey game with your favourite team. A contentious goal is scored and the referees go upstairs for a review of the play to decide if the goal will count. Now, what would you prefer as the basis for that decision: an instant replay video or snapshots of only a few isolated points in time as the puck passed through the goalie’s crease? A process signature is that full replay, while scalar data offers only a few snapshots.
What happens when you rely on snapshots alone?
We’ve seen, time and time again, the outcome of relying on scalar data alone to govern quality assurance on a manufacturing line: units that fail end-of-line tests, and productivity and profitability losses that mount as first-time yields drop and the cost of scrap and rework climbs. In these scenarios, a test station determines a part’s pass or fail based on only a few isolated data points.
The plant’s engineers and operators may be relatively confident that faulty units are being caught, but because only a handful of data points are recorded from each process, there are gaps in time. These gaps make it all but impossible to obtain a clear view of what happened throughout a process or test. If an anomaly occurs during one of these gaps, staff may never know.
For example, a bolt may be stripped during a rundown operation because of a faulty nut or foreign debris. At one point during the operation, more force than normal was needed to twist the nut home. If that moment is not captured by the scalar data, the flaw passes undetected.
The part may only reveal its fault further down the line or after it’s in the hands of the customer. There is no way to go back and trace the root cause because engineers lack that video instant replay of what happened through every millisecond of assembly and test.
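The gap problem can be made concrete with a small sketch. Assuming a hypothetical rundown torque trace with a brief anomaly, a scalar-only station that records a handful of snapshots can miss a spike that a full waveform check catches (all signal shapes and limits below are illustrative, not from any real station):

```python
import numpy as np

# Hypothetical rundown torque trace: 10,000 samples over a 2 s operation,
# with a brief 20 ms spike (e.g. cross-threading) around t = 0.51 s.
t = np.linspace(0.0, 2.0, 10_000)
torque = 5.0 + 2.0 * t                      # nominal ramp as the nut seats
torque[(t > 0.50) & (t < 0.52)] += 8.0      # transient anomaly

LIMIT = 12.0                                # assumed maximum acceptable torque

# Scalar check: a few isolated snapshots, as a scalar-only station records.
snapshots = torque[::2_000]                 # 5 samples across the cycle
scalar_flagged = bool(np.any(snapshots > LIMIT))

# Waveform check: every sample of the full process signature.
waveform_flagged = bool(np.any(torque > LIMIT))

print(scalar_flagged, waveform_flagged)     # spike hides between snapshots
```

The spike falls between snapshot times, so the scalar check passes the part while the waveform check correctly flags it.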
Process signatures take quality control to a new level
By capturing and using the entire waveform — or process signature — generated during the test cycle, much more accurate and reliable pass/fail decisions can be made by the test station. Instead of just a few data points, pass/fail can now be determined from hundreds of thousands of data points.
Every signature from every test cycle and process on the line can then be collected into a single database. Consider it a digital video library versus a collection of photo albums, which is all you have with scalar data alone. With such a complete and detailed historical record, free of any gaps in time, it’s possible to search, correlate and visualize the data with algorithms to a degree that’s just not possible with scalar data. Users can find patterns that reveal how and why yields are decreasing.
By analyzing process data, a part failure can easily be distinguished from a test malfunction. The quality team can spot anomalies that require further investigation, pinpoint where problems occur during the process, even optimize the test station by understanding how to shorten the test.
Best of all, you have complete traceability right down to the specific parts and their serial number. Track down a few dozen defective units without having to scrap, rework or recall thousands. The root cause of an issue can be tracked down in minutes or hours, rather than days or weeks.
This level of insight is simply not possible with scalar data alone. It elevates quality assurance into the realm of big data analytics.
How signature analysis works
Every manufacturing or test process generates a process signature that can be recorded, interpreted and visualized. Signature analysis, in its simplest form, is the comparison of a specific process response or signature against an acceptable, pre-determined response.
By equipping a process or test station with appropriate sensors, such as load cells, temperature, position and pressure sensors, highly consistent and repeatable signals can be obtained that directly indicate the consistency and quality of the process or product.
These signals can be physical measurements or computed values, such as horsepower or efficiency, based on specific measurements and mathematical formulas. Assuming the measured variables and sensors are chosen with care, the characteristic signature will be consistent as long as the operation is under control and product quality levels are maintained.
Signature analysis is the process of comparing these waveforms in a systematic and automatic manner to detect defects or manufacturing operations that are out of tolerance.
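One common form of this comparison is an envelope check: a reference signature defines an acceptable band, and a test-cycle signature passes only if every sample stays inside it. The sketch below assumes hypothetical pressure-decay signatures from a leak test; the function name, curves and tolerance are illustrative:

```python
import numpy as np

def signature_passes(signature, reference, tolerance):
    """Return True if every sample lies within reference +/- tolerance."""
    lower = reference - tolerance
    upper = reference + tolerance
    return bool(np.all((signature >= lower) & (signature <= upper)))

# Hypothetical pressure-decay signatures (normalized units).
t = np.linspace(0.0, 1.0, 1_000)
reference = np.exp(-0.5 * t)                     # expected decay curve
good_part = reference + 0.01 * np.sin(40 * t)    # small, in-band variation
leaky_part = np.exp(-2.0 * t)                    # decays too fast: a leak

print(signature_passes(good_part, reference, 0.05))
print(signature_passes(leaky_part, reference, 0.05))
```

Production systems typically use more sophisticated features and statistically derived bands, but the principle is the same: the whole curve, not a few points, drives the pass/fail decision.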
Increased yield by 170 engines a month
One automotive manufacturer experienced electronic throttle failures at the end-of-line engine test, an expensive stage at which to address them.
By using signature analysis, the manufacturer identified the two reasons for the failures and fixed the upstream process issues to eliminate them. Seventy-seven per cent of the failures resulted from a stuck or sluggish throttle. The remainder turned out not to be failures at all, but false rejects.
The manufacturer used this deeper insight to check and verify which parts under quarantine were in fact faulty, or not.
What was the outcome? A 1.27 per cent failure rate was reduced to 0.07 per cent, increasing yield by 170 engines a month. The manufacturer also saved $330,000 annually on stripbacks and repairs.
Set the right limits for your in-line production tests
By capturing all of the station’s process signature data, you have a complete historical record that allows quality engineers to manage processes based on scientific pass/fail limit management. They can reduce false failures and optimize cycle times and production output, using data models and simulations that remove the guesswork. Historic data from the line can be revisited and reanalyzed at any time to spot trends and patterns, and to understand where tweaks can be made to help optimize a station’s performance and drive continuous improvement.
For example, a quality engineer can run a simulation using thousands of stored test records with process signatures, using new control limits to see the effect on yield. If a change is warranted, the new control limits can be sent down to the test stations. This is where process signature technology complements a traditional statistical process control (SPC) system that collects only scalar data. SPC can verify initial control limits and then signature analysis can be used for fine tuning.
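That kind of offline what-if run can be sketched simply: replay a feature extracted from archived signatures against candidate limits and compare the yield each limit set would have produced. Everything below (the feature, the distribution, the limit values) is an illustrative assumption, not data from a real line:

```python
import numpy as np

# Pretend feature: peak torque extracted from 5,000 archived signatures.
rng = np.random.default_rng(0)
peak_torque = rng.normal(loc=10.0, scale=0.5, size=5_000)

def simulated_yield(values, lo, hi):
    """Fraction of stored cycles that would pass with limits [lo, hi]."""
    return float(np.mean((values >= lo) & (values <= hi)))

current = simulated_yield(peak_torque, 8.5, 11.0)   # today's limits
proposed = simulated_yield(peak_torque, 8.5, 11.5)  # widened upper limit
print(f"current yield {current:.3%}, proposed yield {proposed:.3%}")
```

If the simulated yield improves without admitting known-bad signatures, the new limits can be pushed down to the test stations with confidence rather than guesswork.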
Scalar data alone just isn’t enough
The benefits of relying on process signatures and signature analysis for quality control instead of scalar data alone are obvious. The only limits are the type and quantity of available data. It all comes down to cutting manufacturing costs, increasing profitability and adding impact to your organization’s quality reputation in the marketplace.
Use SPC and scalar data as you always have to monitor and track the health of your production line. But when problems arise, trust in process signature analysis to quickly find and address the root cause.
Robert Ouellette is a product launch manager with Sciemetric Instruments. He works with customers to solve pressing technical challenges in leak test and other technical areas of manufacturing.
This article was originally published in the March/April 2017 issue of Manufacturing AUTOMATION.