Statistical process control (SPC) has been used for many years to continuously monitor and improve manufacturing processes. However, there is a limit to the level of understanding traditional SPC quality control can provide. In many cases, users can determine that something has changed, and often even what has changed, but many engineers struggle to determine and prove the why, and diagnosing the problem can take significant time. In many highly regulated industries, such as medical device manufacturing, the standards bodies are pushing toward extremely high levels of process understanding. Key to this understanding is the concept of a "process signature."

What is a process signature?
Imagine a mechanical press that takes half a second to insert one part into another. During that half second, there will be variation in the distance travelled by the press, the force it applies and its rate of movement. If sensors are connected to collect this data, a complex multi-dimensional waveform can be measured. This is an example of a process signature: thousands of data points gathered over a short time period. Across the plant floor, process signatures are constantly being generated by every manufacturing operation. The amount or type of data does not matter. The key is that the process signature data is tied to both the part under manufacture and the testing environment. The origin of the data does not matter either; it could come from analog sensors, digital I/O measured states, remote sensors, data in a PLC memory or other controller, or even sensors on the product under test itself. A process signature captured for one part gives us a scientifically objective and complete representation of our test or process data for that specific part.
As you run more parts through the test, you capture more signatures, and eventually you have a library of signatures that characterizes the process and the product. Patterns begin to emerge. Quality and manufacturing engineers can visualize the control limits needed for proper quality control and compare them against their calculations. At this point, we can apply process signature verification technology. Process signatures provide the most complete understanding of the manufacturing process, and they also provide a solid platform for the innovation demanded in today's competitive marketplace. In this article we describe the role of process signatures in quality manufacturing and the characteristics of a successful process signature implementation. The benefits of process signature technology impact production in multiple ways, from consistent quality of manufacture to the ability to locate bottlenecks and improve yield.

What's process signature verification?
Not all parts will have the same process signature. Process signature verification (PSV) is the comparison of a process signature against a set of control limits: control waveforms with upper and lower bounds that create a "pass" envelope for a good operation. The verification process ensures that a pass or fail decision is based on the process signature being completely within the bounded envelope. Most quality engineers would agree that process signatures provide the best solution for quality manufacturing. Where many solutions fail, however, is in implementation. Applying the techniques of process signature verification to the real world is a complex task. As an example, Ottawa-based Sciemetric Instruments has spent many years of R&D effort in real-world customer environments with the ultimate goal of letting operators control the power of PSV with the single press of a button.
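To make the envelope idea concrete, here is a minimal PSV sketch in Python. The per-sample mean ± 3-sigma band learned from known-good signatures is an illustrative choice only, not Sciemetric's actual envelope-generation algorithm.

```python
from statistics import mean, stdev

def learn_envelope(good_signatures, k=3.0):
    # Build one (lower, upper) bound per sample index from a set of
    # known-good waveforms, all resampled to the same length.
    # The +/- k*sigma band is an illustrative rule.
    lower, upper = [], []
    for column in zip(*good_signatures):
        m, s = mean(column), stdev(column)
        lower.append(m - k * s)
        upper.append(m + k * s)
    return lower, upper

def verify(signature, lower, upper):
    # Pass only if every point of the signature lies inside the envelope.
    return all(lo <= x <= hi for x, lo, hi in zip(signature, lower, upper))
```

A waveform that strays outside the band at even one sample fails, which is exactly the "completely within the bounded envelope" criterion described above.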
A successful test station
Sciemetric Instruments has found that the best PSV solutions originate from a blend of hardware support and flexible software design. Hardware support involves the reliable collection and storage of thousands of data points per second. Data can come from plant networks over digital I/O, through protocols such as OPC, from digital encoders or from analog sensors. A test solution must be able to synchronize all of these inputs and collect data at the resolutions needed for proper signature verification. For example, Sciemetric's sigPOD PSV provides a 16-bit A/D converter, which can sample at high frequency in order to capture every nuance of a process. Remember, a process signature is generated by any manufacturing or test process; a successful solution must support custom connection equipment (such as a fastening machine) or any other test station that can generate data.

From the software standpoint, there are two important measures of success. The quality engineer must have the tools to dive as deeply into the scientific modeling environment as needed in order to find manufacturing problems and make corrections. And the operator must be able to leverage the power of PSV as simply as possible. Let's illustrate these points with a mechanical press example:

• Our test station is connected (all sensors and PLC connections made) and ready for calibration for our first press. The operator runs through a series of "good" presses so that the test station can "learn" the process signature of a good press. All control waveform envelopes are automatically generated (see Figure 1) and ready for PSV. After ten minutes of setup, the operator is ready for full production.

• The quality engineer is now able to analyze the press operation from afar and graphically view the process signatures. The first analysis may be SPC verification.
The ability to convert process signature information to SPC data sets (median values, histogram data, Cpk values and sigma levels) is key to understanding the design process. We now have a successful test station implementation. We could end our story here, but a process signature is a powerful thing. It is a reflection of a period of time for a specific part under test at a specific test station. We can now analyze multiple parts under test in more detail, in different combinations and scenarios.

Streamline production, increase yields
Figure 2 shows process signatures from 1,000 parts captured at a specific leak test station. Four types of signature groupings can be seen:

1. Compliant: well-behaved signatures, tightly grouped and normal. These parts are a pass.
2. Not compliant: failures that are obvious because they are so far from the norm.
3. Almost compliant but still a failure: true failures that deviate only slightly from the norm, impossible to catch with classic systems.
4. Almost compliant and flagged as a failure: marked as defective, but are they really defective parts? These parts are perfect candidates for control limit tuning (described below).

To determine whether they are true fails, a quality engineer can run a simulation using only Group 4 parts to see their unfiltered process signatures across the entire production line on all test stations. Did they fail other tests? Does this specific test station have too tight a control limit? One of the parts could be pulled and destructively tested for confirmation. If a change is warranted, new control limits can be sent down to the test stations. Increasing quality means increasing our understanding of the why behind any defect. Process signatures give us that why.

Trends in production processes now become visible. Suppose a part that passed test is deemed a failure in the field. Finding the root cause of the failure becomes an objective science.
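As an illustration of the signature-to-SPC conversion, the sketch below reduces each signature to a scalar feature (its peak value) and computes a Cpk value against specification limits. The feature choice and the limits are hypothetical; a real system derives many such metrics per signature.

```python
from statistics import mean, stdev

def peak(signature):
    # Reduce a full waveform to a single scalar feature (here, its peak).
    return max(signature)

def cpk(values, lsl, usl):
    # Process capability index against lower/upper specification limits:
    # Cpk = min(USL - mu, mu - LSL) / (3 * sigma)
    mu, sigma = mean(values), stdev(values)
    return min(usl - mu, mu - lsl) / (3 * sigma)
```

Trending such per-part feature values over time is what lets the quality engineer compare signature data against classic SPC control limits.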
Was a control limit set too loosely? Is a production machine beginning to show signs of failure? The quality engineer can isolate and highlight any anomalies. Once the root cause is found, the resolution of the problem opens new doors of quality control, such as precisely targeted inventory recalls.

A platform for innovation
Competition demands that innovation in the design lab be matched by innovation on the factory floor. The use of process signatures provides confidence that design changes are being properly reflected in manufacture. The benefits to manufacturing systems are evident, but process signature technology is dynamic and limited only by the type and quantity of available data. Quality managers with an eye on the future will see ways to manage today's risk while introducing new capabilities at times of their choosing.

David Mannila is a senior product manager at Sciemetric Instruments Inc. in charge of the sigPOD PSV and QualityWorX product lines. Randy Martin is a technical writer at Sciemetric with more than 15 years of experience with embedded real-time software in the process control industry.
Engineers at Moxa have been designing communication network solutions that satisfy the strict, multifaceted requirements of industrial automation for over twenty years. Their equipment enables power utilities to deliver uninterrupted, reliable electric power to the public, even under harsh environmental conditions. But Moxa's latest power substation automation system, an IEC 61850-3-certified, 18-port embedded computer, presented some new challenges for the company's designers. Moxa wanted to build a platform for substation automation that could handle a large number of LAN and serial ports while withstanding high temperatures in a fanless, 1U standard rack-mount form factor. "We also had to meet rigorous electromagnetic interference (EMI) testing requirements for IEC 61850-3, a specification governing communication networks and systems in substations," says Moxa European business development manager Hermann Berg. "Our EMC/RFI shielding technology and purpose-built L-type heat sink, which takes heat to the side rather than the top or bottom, combined with our prior experience with Intel Architecture processors, enabled us to develop this stackable computer."

The new DA-681 Series rack-mount embedded computer is designed to service the communications traffic generated by as many as six Ethernet ports and a mix of twelve RS-232/RS-485 ports. This high level of I/O capacity and flexibility is needed as power substations transition from analog to digital, which requires integrated communications and control systems for managing the various equipment inside a power substation. Moxa is a leader in industrial serial communication, using its own serial technology to serve the most diverse and demanding requirements. Using Intel's CPU technology, Moxa was able to build an "industrial off-the-shelf" computer system that stands up to the extreme environmental conditions of the power substation.
Faster processing with less heat
The Moxa design team chose Intel's mobile product line to power the DA-681 rack-mount embedded computer because it offers high levels of computing performance while enabling a fanless solution. To further decrease power consumption, the DA-681 automatically throttles (reduces) the operating frequency of the processor if the system runs hot, through Moxa-designed BIOS features. "With Intel processors, our energy customers have the computing headroom to run pre-installed operating systems, like Linux, WinCE 6.0 or Windows XP Embedded, in addition to executing the many protocol stacks, protocol conversion routines and data pre-processing algorithms needed to monitor and control power systems," says Mark Liu of Moxa.

Measures of Success
• IEC 61850-3 certification demonstrates compliance with energy industry-standard requirements, such as environmental and EMI immunity, for communications networks and systems in substations.
• The fanless, high-temperature design increases power system reliability and stability, based on Moxa's extensive experience in building industrial-grade networking and computing equipment.
• Moxa's five-year product warranty assures utility operators of reliable performance for years to come, facilitated by Intel's seven-year life cycle support.
• EMC (electromagnetic compatibility) protection ensures critical functions experience no delays or data loss when exposed to various EMI disturbances, enabled by Moxa's EMC/RFI shielding technology.

Integrating multiple control units
Utility operators are looking for reliable monitoring solutions that perform many control functions in a single, secure box such as the DA-681. The DA-681 can be used to automate power distribution and monitor substations and service cabinets.
"Instead of dedicated communication units, some power substations still use separate control units with proprietary, non-integrated data acquisition, analysis and handling mechanisms," says Berg. "These aging units can be highly susceptible to frequent communication shutdowns and complicated maintenance procedures, and may not maintain stable and reliable operations."

Simplifying energy application development
Designed to meet the real-time demands of energy substation applications, the DA-681 runs Linux, WinCE 6.0 or Windows XP Embedded (pre-installed) and provides a friendly environment for developing sophisticated application software. "We offer a ready-to-run software platform, based on energy industry standards, with easy-to-use serial communication technology to significantly reduce system development effort and time," says Liu. "This is particularly helpful for power automation system integrators, as they no longer need to develop the network from the basic hardware layer."

Renewable energy trends
"The move from traditional coal-fired power plants to renewable energy sources is well underway and is expected to accelerate considerably over the next decade," says Berg. "In particular, solar power has been recognized as a viable alternative, and in recent years a number of regions in both North America and Europe have enacted so-called FiT (feed-in tariff) legislation that allows individuals to sell solar-generated power to their local power utility." At the same time, industry experts predict that a number of large-scale solar power plants will emerge and sell power to consumers through existing power grids. Moxa's embedded computers are expected to enable both efficient wind and solar power plant operation and the integration of power substation equipment into the electricity network operator's smart grid.
Implementation of a comprehensive automation and control solution from Emerson Process Management is contributing to dramatic operational improvements at the Porcheville, France, thermal power plant. The solution incorporates the company's PlantWeb digital plant architecture with the Ovation expert control system, Scenario simulation, AMS Suite predictive maintenance software and Smart Wireless technology, as well as its SureService customer support program. The measurable improvements – including a 50 percent improvement in the plant's ability to ramp up and down on demand – are translating into economic benefits for the plant's owner, EDF Group, while also helping to maintain grid stability in France.

The Porcheville plant is located in the Yvelines region along the Seine, roughly 30 miles west of Paris. The plant entered service in 1968 as a four-unit, 2,400 MW (4 x 600) oil-fired peaking plant. More than a decade ago, EDF Group, the top electricity producer in Europe, shuttered units 1 and 2 due to the high cost of fuel and surplus capacity in France's electricity market. Several years ago, in response to growing demand for electricity in France, EDF Group began an ambitious renovation program at Porcheville designed to boost the plant's generating capacity and responsiveness. Improving the plant's ability to ramp up and down swiftly and accurately over a broader megawatt range enables Porcheville to compete for ancillary service contracts from the government. This not only translates into additional generating revenue, but also enables EDF Group to avoid the financial penalties that would be incurred if it were unable to fulfil its contractual obligations.
From a broader perspective, the ability of the plant to nimbly meet the nation's growing appetite for power helps maintain grid stability during swings of demand, frequency and voltage.

The automation and control portion of the project called for Emerson to replace obsolete Control Data analog controls with its Ovation control system for all four units. Each unit's Ovation system consists of a dedicated controller to manage individual operations of the Alstom steam turbine and three controllers for monitoring critical data associated with the boiler and balance-of-plant processes. The Ovation system will also receive additional tank level measurements from Emerson's Smart Wireless network, soon to be installed. Smart Wireless extends PlantWeb predictive intelligence into areas that were previously out of physical or economic reach, opening the door to new possibilities in process improvement.

Despite the project's requirements, which included modernizing two units that had been mothballed for more than a decade, Emerson was able to install, test and start up the control systems within the separate 18-day windows allotted for each unit. Meeting this aggressive timetable was important, as it helped EDF avoid financial penalties associated with project delays.

The results of the control system upgrade have been dramatic. Prior to the upgrade, EDF was only able to ramp the Porcheville units by +/- 60 MW when requested by the grid. Since commissioning, Ovation has boosted Porcheville's maneuverability to +/- 90 MW, giving EDF the opportunity to win additional ancillary service contracts that positively impact its bottom line. The increased demand for power production capacity is EDF's greatest concern; even though two of the units had been out of service for a decade, Emerson's automation solution enabled EDF to put the Porcheville units back online in record time.
After more than two years of operation, EDF is confident that it can reliably produce power when called upon during peak demand periods. This capability has translated into increased revenue potential for EDF, as well as vastly improved plant performance and equipment safety.

At the Porcheville plant, Emerson's AMS Suite: Intelligent Device Manager works with the Ovation system to manage smart field devices, resulting in improved production reliability. Maintenance personnel can access predictive diagnostics on all HART instruments, including Emerson's Micro Motion mass flowmeters and Rosemount OXYMITTER 4000 transmitters, as well as Fisher® DVC 6000 digital valve controllers installed throughout the plant.

As part of the comprehensive project, EDF is also taking advantage of Emerson's Scenario simulation solution. When it is up and running later this year, the simulation will be used to train new and existing operators, strengthening their skills and ability to respond to varying situations, which will further enhance plant performance.

To help ensure the plant continues to take full advantage of the latest technologies, EDF is also utilizing Emerson's SureService customer support program, which offers access to comprehensive technical support and Ovation software upgrades for five years.

"A project of this scope and complexity demonstrates how Emerson is uniquely positioned to provide power generators with a comprehensive plant automation and control solution that incorporates not only the control system itself, but also simulation, field instrumentation, wireless technologies – and more," said Bob Yeager, president of the Power & Water Solutions division of Emerson. "It's this ability that sets Emerson apart from other vendors who lack the technology, expertise or resources to successfully complete such a complex project within an aggressive timeframe."
Located on the picturesque shores of Lake Superior, Thunder Bay, Ont., is a growing community. And since it was recently ranked one of the top ten cities for business in Canada, the population is likely to continue to grow from the 120,000 citizens who live there today. Providing safe drinking water is a municipal priority. To do that, and to protect the environment, Thunder Bay set a goal of implementing "lake-to-lake" water management: taking water from Lake Superior through the treatment process to the distribution system, and then back through the pollution control plant before returning it to the environment. In less than a decade, Thunder Bay has succeeded.

A New Plant with an Entirely New Process
To achieve "lake-to-lake" water management, Thunder Bay constructed an entirely new facility, the first of its kind. While the previous plant used direct filtration with sand filters and disinfectants, the unique Bare Point Water Treatment Plant uses an advanced ultrafiltration system to purify the city's water, while expanding daily capacity from 14 million gallons to 25 million gallons.

With an all-new facility and an aggressive timeline, the City of Thunder Bay called on Wonderware Canada East, the local Wonderware distributor, and Canadian system integrator Automation Now to assist. Challenges included integrating an existing pumping station with the new plant equipment, as well as planning for future expansions. The initial facility had 12 PLCs, with 20 additional remote pumping stations to come that would incorporate PLCs from different manufacturers. Communications between the local PLCs and the remote locations would be vital to the success of the project.

The Clear Choice
Without the ability to closely monitor and control this complicated system, the quality of Thunder Bay's water would be at risk.
So it was critical to find the right SCADA (supervisory control and data acquisition) system – one versatile enough to meet the needs of the new facility plus its future expansion. Bare Point required accurate, real-time data gathering to ensure reliable control of the plant's equipment, regardless of location. Recording and logging the data, sounding alarms for threshold conditions and securely storing information were also priorities. The new system needed to be easy to use as well as provide comprehensive reports for informed decision-making by management.

After evaluating the options, the Wonderware solution was recommended and approved. The Bare Point plant is controlled by a Microsoft Windows-based system utilizing Wonderware Terminal Services software located in the operations center of the main plant. Redundant servers with UPS backup systems log over 5,000 points of data, 24 hours a day, seven days a week. The award-winning Wonderware InTouch human-machine interface (HMI) software forms the core of the Bare Point solution. In the application design phase, it provided power and flexibility as well as connectivity for the broad range of devices in the local and remote plant locations. And now the InTouch software enables operators to closely monitor pumps and control valves, with graphics that let them visualize the water moving through the plant. Working with the InTouch software, the Wonderware Historian provides a high-performance, real-time and historical database to integrate the operations center with the plant floor. As an extension of Microsoft SQL Server, Wonderware Historian collects comprehensive Bare Point operating statistics while reducing the volume of data that must be stored. And it integrates this information with event, summary, production and configuration data.
Its scalability is ideal to accommodate Bare Point's plans for growth. For desktop-based analysis and reporting, Wonderware ActiveFactory software – part of the Wonderware ArchestrA architecture – was designed into the system. With the ActiveFactory software, Bare Point's process engineers can spot specific trends in real time and prepare historical reports that can be exported to Microsoft Excel. Simple point-and-click dialogs mean that plant operators can troubleshoot problems and identify operational inefficiencies easily and quickly.

Results in Record Time
Wonderware software and its intuitive interfaces made the design, installation and testing move forward rapidly. Once Automation Now was on board for the project, with support provided by Wonderware Canada East, Bare Point was operational within one year. Today, engineers enjoy end-to-end control of plant processes. The easy-to-learn graphical interface enables employees in the operating center to see a real-time representation of the water moving through the facility, and they can control the process and monitor error and fault codes from all of the PLCs. And when they leave the operating center, SCADA terminals throughout the plant provide access to the Wonderware system wherever they may be working.

Plant operators rely on Wonderware SCADAlarm as an indispensable tool for maintaining water quality. If an instrument takes a reading outside a predetermined range, an alarm sounds, both on the SCADAlarm screen and on a plant-wide alarm system. Redundant servers secure plant data and store it for retrieval in the event of a failure. And the Wonderware Historian software's reporting capabilities enable management to maximize plant efficiency and accelerate expansion plans.

One of the unique features of the new Bare Point plant is its training facility. Instructors project live views of operations, providing a highly productive environment for learning, group analysis and troubleshooting.
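The out-of-range check at the heart of an alarm package like SCADAlarm can be sketched in a few lines. The tag names and limits below are illustrative placeholders, not Bare Point's actual configuration.

```python
LIMITS = {
    # tag: (low, high) alarm thresholds -- illustrative values only
    "turbidity_ntu": (0.0, 0.3),
    "chlorine_mg_l": (0.5, 2.0),
}

def check_reading(tag, value, limits=LIMITS):
    # Return an alarm message if the reading falls outside its
    # predetermined range, or None if the reading is normal.
    low, high = limits[tag]
    if not (low <= value <= high):
        return f"ALARM {tag}: {value} outside [{low}, {high}]"
    return None
```

In a real deployment the same comparison would also trigger the plant-wide annunciator and be logged to the historian for later reporting.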
Proper Planning Ensures Payback
Return on investment has come in record time. Real-time reporting has enabled more effective regular maintenance for reduced downtime, and historical trending reports have led to greater visibility and increased operational efficiencies. But the biggest ROI is anticipated to come as remote stations are added. Automation Now expects that development time for these additions will be cut in half. This means that efficiencies will be realized during expansion, and the money saved is projected to provide payback within the next two years. With a forward-looking team and the Wonderware solution, the new Bare Point Water Treatment Plant has quickly established itself as a technologically advanced and environmentally conscious facility bringing clean water to the City of Thunder Bay.
The requirement for some products to be "leak-proof" can become quite burdensome for manufacturing engineers without access to adequate leak testing expertise. Far too often, leak testing technology is poorly integrated into assembly operations as an afterthought to the assembly line design. As a result, those assemblies rarely meet Gauge Repeatability & Reproducibility (Gauge R&R) requirements for leak detection, the measure of whether a tester produces stable, consistent results each time the same part is measured. Poorly integrated technology also slows manufacturing operations down considerably, with significant, though often unrecognized, bottom-line impact. The following is a review of the basic approaches to leak testing, including the pros and cons of each method.

Helium testing
Whenever there is a need to segregate highly noxious or otherwise hazardous substances, the cost of leak testing becomes secondary to the exacting standards the testing process must achieve. Many aerospace components, for example, contain highly flammable liquids and gases, and their manufacturing operations need to verify that each and every part meets the tightest tolerances for leaks. In applications where the leaks to be detected are below 0.001 standard cubic centimetres per second (sccs), the use of a helium mass spectrometer is often advised. Helium testing usually involves pressurizing a test part with helium or a helium/air mixture inside a test chamber. The chamber is then evacuated and a mass spectrometer samples the vacuum chamber to detect ionized helium. The biggest plus of this method, and usually the one that compels the use of this technology, is its accuracy and reliability. Helium mass spectrometers can measure leakage as slow as 10⁻⁶ sccs. This accuracy is largely due to the sensitivity of the mass spectrometer sensor itself, which operates under hard vacuum conditions and is essentially counting ions.
These units offer reliable and consistent leak detection for leak rates below 10⁻³ sccs. There is, however, a downside to this method. Helium testing is quite costly, often double the cost of other methods. Test chamber and test circuit components are very expensive, especially for testing large part volumes. And compared with plain air, the helium itself adds an expense. These costs are usually prohibitive for any application that does not require tests for leaks below 0.01 sccm.

Pressure decay testing
Another leak testing option is a method built on measurement of pressure decay rates. In this method, a reference volume is pressurized along with the part to be tested, and a transducer reads the pressure differential between the non-leaking reference and the test part over a period of time. There is no direct leakage rate detection with this method; rather, time/pressure readings are converted into leak rate calculations. The biggest plus of pressure decay technology, especially compared with helium testing, is its significantly lower cost. For one thing, the cost of helium gas is eliminated. However, the lower cost of pressure decay instruments can be extremely misleading, as the overall testing time may cause a drag on assembly line speed. This is because a pressure decay method inherently requires two measurements and elapsed time between them. In typical assembly operations, pressure decay testing can increase per-part assembly time by 20 to 40 per cent. In addition, the time lapse between measurements can be far more troublesome for reasons beyond assembly speed. A two-step measurement process coupled with the required time lapse significantly increases the potential for measurement error.
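The conversion from a time/pressure reading to a leak rate follows from the ideal gas law. A minimal sketch, assuming the total test volume is known and the pressure values share the same units:

```python
def leak_rate_sccs(volume_cc, delta_p_kpa, delta_t_s, p_atm_kpa=101.325):
    # Pressure-decay leak rate: Q = V * dP / (P_atm * dt), giving
    # standard cc/s. volume_cc is the total test volume (part plus
    # test circuit), which is why that volume must be known accurately.
    return volume_cc * delta_p_kpa / (p_atm_kpa * delta_t_s)
```

The formula also makes the error sensitivity visible: any uncertainty in the volume, or in the small measured pressure drop, propagates directly into the computed leak rate.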
Because measurements are highly vulnerable to changes in plant conditions, such as drafts or temperature fluctuations, there are difficulties in determining the actual volume of the test parts and test circuits, both of which must be known in order to calculate results. For similar reasons, pressure decay methods are impractical for leak testing parts with very large flow leaks: if the pressure drops too rapidly, it cannot be measured accurately.

Mass flow testing
The downsides of the aforementioned methods make them an unlikely choice in the lion's share of test-centric assemblies, where cost, accuracy and speed are paramount and issues of liquid/gas toxicity are not present. Mass flow sensing offers the most accurate, reliable and cost-effective leak testing method for nearly all applications where leaks greater than 0.5 sccm must be detected, as well as many applications with tolerances in the 0.3-0.5 sccm range. Unlike pressure decay methods, mass flow methods use a single-step direct measurement: leakage flow is directed across a heated element, and the heat transferred to the flowing gas is measured. Temperature-sensitive resistors measure the temperature of the incoming and outgoing flow, and the transducer creates an output voltage proportional to the mass flow, yielding the leak rate measurement. In the mass flow method, a part is pressurized and any leakage is compensated naturally by air flowing into the part from the source, which can be a reference volume reservoir pressurized along with the part or an air-supply line with pressure controlled by a regulator.

Testing tips
Knowing which method to use is the first step in building optimized test-centric assemblies. However, the specific way one implements the selected testing method is also critical. It is quite common, for example, to select leak testing instruments with generic sensing devices that are not customized to an application. This approach is inherently flawed.
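The single-step mass flow measurement described above reduces to a linear transducer model: one voltage reading gives one leak rate and one pass/fail decision. The calibration constants here are illustrative placeholders, not values from any particular instrument.

```python
def mass_flow_sccm(v_out, v_zero=0.0, sccm_per_volt=10.0):
    # The transducer outputs a voltage proportional to mass flow;
    # calibration maps volts to sccm. Constants are hypothetical.
    return (v_out - v_zero) * sccm_per_volt

def leak_test_pass(v_out, limit_sccm=0.5):
    # Single reading, single decision -- no waiting period between
    # two pressure samples as in the decay method.
    return mass_flow_sccm(v_out) <= limit_sccm
```

This one-reading structure is why the mass flow approach avoids the per-part time penalty of the two-step pressure decay test.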
One needs to remember that it is not the cost of the testing instrument that is important, but rather the cost of the overall testing process. The Gauge R&R of a testing instrument in isolation is not an especially meaningful measurement; it is the Gauge R&R of the testing during real assembly that counts. The fixturing of the parts being tested is usually of equal or greater importance than the leak testing instrument itself. For that reason, reputable manufacturers of leak testing technology will provide in-house engineering expertise that can customize instruments and optimize tooling and fixturing design for test-intensive operations. Perhaps the most important consideration in building test-centric assemblies is sourcing design and test engineers with proven expertise in leak testing methods. Engineering firms that specialize in leak testing will usually provide free application evaluations with recommendations for the best assembly and test methods in terms of speed, accuracy and cost. It is not uncommon for re-engineered test-centric assemblies to cut testing time by orders of magnitude. This potential for throughput gains cannot be ignored.

Jacques Hoffmann is founder and president of InterTech Development Company, which specializes in automated leak and functional testing. Jacques can be reached at 847-679-3377.
At the foot of Nursewood Road in Toronto's historic Beaches neighbourhood lies a unique art deco-style building, which Canadian author Michael Ondaatje called a "palace of purification" in his novel In the Skin of a Lion. Passersby often mistake this "palace" for a museum or an old university. It's actually the R.C. Harris Filtration plant, Toronto's largest water treatment facility, producing almost half the drinking water Torontonians consume daily.