A distributed I/O network can provide a universal and modular way to connect a wide range of signal inputs and control outputs. Hosted by journalist and industry expert Peter Welander, this video shows the benefits of using a distributed I/O network to send information between instrumentation devices in the field and control elements in a control room or on a factory floor. Connecting field devices to the network saves the time and expense associated with installing and repairing wiring. The modular nature of distributed I/O networks makes it easy to expand operations or integrate legacy process sensors. In addition, peer-to-peer systems are redundant, meaning that a break in a wire pair will not affect signal transmission.
Process reliability is one of the principal factors governing the economic operation of high-value micro assembly machines that produce miniature electronic and optical components. German manufacturer Amicra Mikrotechnologie GmbH learned that first-hand when it achieved major improvements with Renishaw’s future-oriented absolute measuring system, Resolute, which allows Micro Assembly Cells to be commissioned more quickly and, in particular, independently of the operator’s vigilance. This is particularly important for microelectronics and micro-optics applications in the automotive sector and the telecommunications and IT industries.

Fast feed and very precise positioning

The components to be bonded and mounted, such as active/passive semiconductors, lenses, MEMS and processors, are picked up from feed stations with the aid of linear axes and special grippers. The components are then positioned on boards or wafers, where they are bonded with adhesives, soldered conventionally or soldered with a laser beam. Surface mounting on wafers and stacked die technology make particular demands on mounting and production technology. Stacked die technology is used to construct three-dimensional memory and computer structures; semiconductors are not only mounted horizontally next to each other and connected (SoC), but also vertically in several planes (TSV). Further miniaturisation is achieved as a result of a higher packing density. Amicra develops and manufactures Micro Assembly Cells designed for this purpose. These machines are known in particular for their high levels of accuracy and reliability. Using a large array of horizontal and vertical linear axes, they surface mount wafers with practically no unproductive downtime. While a component is being positioned, a second handling unit is already picking up the next component from a buffer store.
At the same time, the work tables and other linear axes with lasers and UV lamps move to the positions required for the soldering and bonding processes. In addition, other axes position the integrated process monitoring cameras.

Collisions absolutely ruled out

As Horst Lapsien, managing director of the German company, explained, the high process reliability of these installations is particularly instrumental in their economic operation. A crash between the alternating positioning grippers and linear axes must be avoided. “This can be achieved through the extremely precise programming of the motions,” Lapsien says. “Furthermore, the measuring systems on the linear axes have to detect the current position of the slides reliably and very precisely.” This could only be achieved to a limited extent with the incremental measuring systems used previously. “In the past, starting up the production and mounting cycle after a stoppage was unsatisfactory because the readheads of all the linear axes had to travel to a reference position first. Only in this way was the control system able to detect the actual position of the axes, but that took an unnecessarily long time,” Lapsien adds. Also, starting a reference run from an undefined slide position represented a significant source of error. If the operator had not analyzed the crash paths first and selected the reference travel cycle accordingly, the installations could suffer considerable damage from collisions of grippers or even of the gantries. In the event of a crash, this leads to unnecessarily long unproductive times, unnecessary costs and uneconomical downtime of the entire machine.

Absolute measurements without reference run

Thanks to the absolute encoders, Amicra’s mechatronic engineers have been able to improve their machines considerably.
The main advantage of these measuring systems, according to Lapsien, is that they detect the absolute position immediately at switch-on, without a reference run. Consequently, the Micro Assembly Cells can start their automated cycle more quickly, without relying on the operator’s vigilance, after a stoppage or on commissioning for the first time. Process reliability is improved, unproductive times are reduced and expensive crashes are prevented. The RTLA absolute measuring scale can be bonded directly to the substrate or inserted into a special guide, both made from stainless steel. As a result, it installs easily and compactly inside the Amicra machines, achieving ±5 µm/m accuracy. The tough stainless steel tape scale is highly resistant to damage, but if necessary, the guide allows easy scale replacement at any time. The encoders feature a unique position detection method, analogous to a very high speed digital camera, capturing very high resolution images of the scale. These images are then analyzed by a powerful digital signal processor (DSP) that applies cross-checking and error rejection to determine position to 1 nm. Combined with a built-in position-checking algorithm, the encoder has high immunity to contamination. Lapsien says that since their introduction into three-shift production operation, his machines have recorded zero stoppages on account of reading errors caused by dirty measuring tapes. The advanced detection technique also enables the absolute encoders to achieve high levels of accuracy, with just ±40 nm cyclic error and jitter lower than 10 nm RMS. The result is excellent positional stability and a very low noise level. The Micro Assembly Cells equipped in this way therefore benefit from higher levels of reliability and performance.
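The practical difference absolute encoding makes can be illustrated with a small sketch: each point on the scale carries a self-describing code plus redundancy, so a single read yields both the position and a validity check, with no homing move. This is a deliberately simplified illustration, not Renishaw's actual detection algorithm; the frame layout and the toy checksum are invented for the example.

```python
def check_nibble(value):
    """Toy 4-bit XOR checksum, standing in for the DSP's error rejection."""
    c = 0
    while value:
        c ^= value & 0xF
        value >>= 4
    return c

def make_scale(n_frames):
    """Model the scale as frames that each carry their absolute index
    plus a check nibble (a hypothetical encoding for illustration)."""
    return [(i, check_nibble(i)) for i in range(n_frames)]

def decode(frame):
    """One read gives the absolute position; a failed check (e.g. from a
    contaminated scale region) is rejected rather than returned as a
    wrong position."""
    index, check = frame
    if check_nibble(index) != check:
        raise ValueError("read rejected: possible contamination")
    return index

scale = make_scale(4096)
position = decode(scale[437])  # position known at switch-on, no reference run
```

With an incremental system, by contrast, the slide's position is unknowable until a reference mark is crossed, which is exactly the startup delay and crash risk the article describes.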
Scott Hale, IMAGINiT Technologies director of Consulting Services, Manufacturing Solutions Division, describes the 4/50 rule at Autodesk University 2012. The 4/50 rule states that 4% of existing business processes, including design and manufacturing, typically account for 50% of the inefficiencies. Hale discusses how to recover productive design time and streamline workflow by identifying and eliminating wasteful processes.
The process safety systems market has rebounded after the world economic downturn, especially in the developed economies of North America and Western Europe, according to a new market research study from ARC Advisory Group. “Process safety systems suppliers continue to reduce the cost of their hardware offerings and integrate their safety solutions with basic process control systems. Suppliers offering a truly integrated offering of process and safety are saving end users substantial project costs in engineering and lifecycle expense,” said Barry Young, the principal author of ARC’s “Process Safety Systems Global Market Research Study.” The main automation contractor (MAC) business model for executing projects is finding growing acceptance with integrated process safety systems. The shortage of process safety experts requires creative project team sourcing among end users, suppliers and specialty safety engineering firms. New technology trends include smart configurable I/O and distributed safety system architectures. New software tools are making the systems increasingly easy to implement. The past emphasis on controller architecture (dual, TMR, QMR) has diminished, with more emphasis now placed on the overall process safety system solution and certification. Cyber security’s importance cannot be overestimated, particularly for the safety and security of industrial processes. Closer integration of safety and security in process safety systems is required.
Electronics manufacturers share a common goal: to produce the highest-quality product possible. What they don't always share are the processes required to achieve that goal. To ensure that quality is maintained from start to finish, the cost of marginal quality is constantly monitored and evaluated, especially with regard to internal process controls. Recently, component "tombstoning," a recorded internal process control defect, has emerged as a matter of growing interest. Clearly, circuit card assemblies (CCAs) that exhibit tombstoning require special attention, due in part to the high risk of unacceptably long delivery times they can cause and the lengthy rework times associated with printed circuit board (PCB) repair and the need to replace parts. Because no direct cause of — or resolution to — any single aspect of tombstoning had ever been conclusively identified, an in-depth investigation was initiated to identify the cause, or perhaps causes, and ultimately resolve tombstoning. After diligent study, both cause and resolution have now been identified. As our effort began, several key questions were identified: Are process defects the responsibility of manufacturers? If so, or if not, why? Is changing a CAD library for each manufacturer always sufficient to avoid the problem of tombstoning? Aren't quality, on-time delivery and cost solely the responsibility of manufacturers?

'Tombstone' compromises solder connection

To adequately discuss these and other issues associated with tombstoning, a working definition is needed. In essence, a tombstone is a defect in which a two-leaded, wrap-around-lead chip component fails to lay down properly and allow the solder to simultaneously make the required electrical and mechanical connection to the target pads. As a result, only one side of the two-leaded chip component may be soldered to its target pad, while the sister termination may not come in contact with its target pad at all.
These results are unacceptable, both to the manufacturer and to the customer. In such instances, it is fair to assume that with one side of a two-leaded component not adequately soldered, both sides will be problematic. The end result of such a circumstance is often an additional defect — a missing part. This necessitates that special attention be paid to ensure that additional defects on the component or on the PCB pads do not occur. To prevent this from happening, the assembly should be rerouted from its intended next step to a rework station following surface mount placement and reflow, where inspection, rework and repair can occur.

Procedural missteps a detriment

Unfortunately, procedural missteps can occur at any point in the manufacturing process, and a defect may not be reported to the customer. Determining whether the defect is process- or design-related is a challenging task. Preliminary assumptions can range from poor manufacturer process controls, to failure to ensure sufficient solder volume and deposition, to target pad insufficiency, to the need for CAD designers to create specialized pad geometry (known as a home-plate design). Each assumption must be thoroughly evaluated if the cause of tombstoning is ultimately to be eliminated. Certainly, surface-mount technology (SMT) process controls, pad geometry and stencil design may all be factors in the development of a tombstone effect in products. However, essential questions still remain: If the process controls are tuned, what causes the chip component to tombstone? And, if the stencil and pad geometry are of home-plate design, why is the tombstone effect still present?
We felt these questions needed to be answered.

Prior investigations inconclusive

In hopes of eliminating the tombstone problem altogether, studies have been initiated that looked at many potential factors, including solder composition — specifically, varying grain sizes; tin/lead (SnPb) vs. lead-free; and dual-reflow solder paste, in which one alloy reflows and secures the chip component in position before the second alloy reflows to make the electrical connection. Over time, IPC studies have turned up some interesting data, ultimately finding that IPC-SM-782A (1993/1999) was in error on many fronts. As a result, the standard was rewritten and is now known as IPC-7351B (2005/2010), Generic Requirements for Surface Mount Design and Land Pattern Standard. To ensure consistency and stability throughout the production process, an equation designed to calculate the proper pad geometry was included in the document; three acceptable density levels, each based upon the component population density of the PCB, were also added. These important changes all helped to mitigate the tombstone effect. However, the overriding goal is to eliminate this lingering issue altogether, which despite best efforts has remained a significant concern across the industry. Until now, despite exhaustive investigative efforts, the cause of chip component tombstoning has remained largely unsolved — and therefore unresolved. In retrospect, one aspect appears to have been overlooked in the investigation process: the physical component itself. Although physical components may comply with a standard — i.e., the EIA 0402 standard — the physical characteristics of a component must be well understood before they can be ruled out as the culprit.

Multiple root causes likely

To test for this, a dimensional comparison was performed on the components from the seven capacitor and six resistor manufacturers most commonly used by our own customers.
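The pad geometry equation the standard introduced combines component, fabrication and placement tolerances statistically rather than as a worst-case stack-up. The sketch below shows that general root-sum-square form; the parameter names, defaults and sample values are illustrative assumptions, not the normative IPC-7351 tables.

```python
import math

def land_span(l_min, l_tol, j_toe, fab_tol=0.05, place_tol=0.05):
    """Outer (toe-to-toe) land span in mm, in the IPC-7351 style:
    the minimum body length, plus a toe goal at each end, plus a
    root-sum-square combination of the component, fabrication and
    placement tolerances. All values here are illustrative only."""
    rss = math.sqrt(l_tol**2 + fab_tol**2 + place_tol**2)
    return l_min + 2 * j_toe + rss

# Example with assumed numbers for a 1.0 mm nominal body (EIA 0402 scale):
span = land_span(l_min=0.9, l_tol=0.2, j_toe=0.3)
```

The statistical combination is the key design choice: tolerances rarely all stack the same way at once, so the RSS term keeps pads large enough to accept real parts without the excess copper a worst-case sum would demand.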
The analysis, which examined the component body and its terminations, revealed that six out of seven (86 per cent) capacitor and three out of six (50 per cent) resistor manufacturers used different body and termination dimensions and tolerances. This finding is significant. To validate that discovery, the same comparison was performed using the identical component manufacturers with respect to the EIA 0201 and EIA 0603 package types. In all cases, each manufacturer produced an identical part, indicating that EIA 0402 is only one facet of the root cause of the festering tombstone issue. In light of existing research data, it is now apparent that the tombstone effect is clearly not limited to one root cause. In fact, many factors may have influenced its occurrence over the years. With that understanding, we believed that a proper solution could logically be found. To proceed further, it should be noted that three major factors influence tombstoning: component, layout and solder flow behavior. Understanding the relationship between these three factors is critical to ultimately understanding and resolving this important issue. Note that component covers mass, body/termination size and associated tolerances, while pad geometry covers size/spacing, the IPC standard and the CAD library. Additionally, copper exposure covers stencil/solder volume and solder mask clearance/coverage. Finally, copper density covers solder mask-defined pads versus metal-defined or non-solder mask-defined pads with respect to signal trace and planar connections.
Via-in-pad designs and balance are also considered.

Component differences should be considered

While physical component dimensions may differ from manufacturer to manufacturer, and should certainly be considered part of the ultimate equation, so should the fact that the mass of the component is also a factor: the lower the mass, the more susceptible the component is to the influences of solder flow behaviours. EIA 0402 is considered a low-mass component, thus solder flow behaviours must be controlled and implemented from a layout perspective. It was noted earlier that IPC has rewritten the old IPC-SM-782 standard, resulting in IPC-7351. While this standard appears to have the best layout for EIA 0402s, when the physical body and termination dimensions of components were superimposed onto the IPC-7351-recommended pad geometries, it was found that in some cases the recommendation did not accept the component as fabricated. In fact, the pad geometries appeared to have been focused strictly on the nominal component body dimensions and did not take certain other important tolerances into account. In contrast, IPC rightly focused strictly on the termination tolerances and variations with regard to part length. The width, however, was found to be in error, with only the nominal conditions stated. This error was uncovered during evaluation of the component manufacturers and should not be disregarded.

Comparative equality a necessity

For additional clarity in understanding the IPC-7351 standard, the pad geometry is only good for the chosen component. If other manufacturers and components are selected, the physical dimensions must be identical for relevance to be achieved. Any variation in the component, when the pad geometry is not designed for that component, may result in the dreaded tombstone effect. Clearly, proper pad geometry is critical in controlling the tombstone effect.
Since IPC has taken the initiative and rewritten the standard, it is likely that designers' CAD libraries lag in incorporating this important change, and all libraries are therefore suspect of remaining non-compliant. To ensure industry compliance, validation of compliant pad geometries is ultimately necessary. An EIA 0402-recommended pad geometry was developed to maximize the exposed surface area so that all component variances are accepted, while minimizing the amount of surface needed to achieve that critical goal. Our own EIA 0402-recommended pad geometry, as one example, works well for multiple components, as alternates are typically selected on bills of materials. With a clear understanding of the criticality of a proper pad geometry and its relationship to the EIA 0402 physical component, the focus must shift to solder flow behaviours and their effects on low-mass components.

Copper density often overlooked

Copper density is the amount of copper that must be heated to a temperature sufficient to reflow solder with respect to a component termination and its target pad. In the simplest approach, this consideration is often overlooked or even sacrificed in favour of electrical considerations. However, with respect to low-mass components, the thermal balance of the pad geometry is a key factor in the tombstone effect. A metal-defined (MD) pad has a much lower copper density than a solder mask-defined (SMD) pad; the MD pad therefore heats sooner than the SMD pad. In other words, the connecting trace acts as a thermal relief to the MD pad, confining the necessary heat to the target pad. The SMD pad has no thermal relief; thus, the heat needed at the target pad is sunk away to the plane. For low-mass components, thermal balance across the pad geometry is a key factor.
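The balance rule described above can be expressed as a simple design-rule check: the two pads of a low-mass chip component should see nearly equal copper exposure and copper density, so both terminations reach reflow at roughly the same time. The dictionary keys and the 10 per cent tolerance below are assumptions for illustration, not values taken from the article or the IPC standard.

```python
def pads_balanced(pad_a, pad_b, tolerance=0.10):
    """Sketch of a pad-pair balance check for a two-leaded chip
    component. pad_a and pad_b are dicts with hypothetical keys;
    the relative tolerance is an assumed threshold, not an IPC value."""
    for key in ("copper_exposure_mm2", "copper_density_mm2"):
        a, b = pad_a[key], pad_b[key]
        if abs(a - b) / max(a, b) > tolerance:
            return False  # one pad will reflow first: tombstone risk
    return True
```

A check like this would flag, for example, one termination tied to a plane through an SMD pad while its sister pad hangs on a thin signal trace, which is exactly the thermal imbalance the article blames for one end lifting.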
Each pad must be considered as part of a group and not independently. With the clear understanding that many factors contribute to tombstoning, including component, pad geometry, copper exposure, copper density and solder flow behavior, balance is essential to solving the low-mass component issue.
• Pad geometry: Must accept parts from many component manufacturers.
• Layout: Each pad must be treated as part of a group rather than independently.
• Copper exposure: Each pad must be nearly equal.
• Copper density: Each pad must be nearly equal.
Therefore, a three-step approach, with all three steps used in concert to resolve the tombstone effect, is recommended:
1. Use an optimal EIA 0402-recommended pad geometry to provide the best layout solution, accepting multiple EIA 0402 component manufacturers with a minimal amount of PCB real estate.
2. Set solder mask clearance to 2 mils.
3. Use a connecting trace between pad and plane — or a very wide trace — equal to the sister pad.
This investigation successfully resolved an age-old issue that has long lacked resolution. With an understanding of each aspect and their relationships to one another, it is apparent that balance is the key to providing consistency from solder joint to solder joint. The implementation of optimal pad geometry and layout provides a range of benefits, including:
• Acceptance of multiple EIA 0402 component manufacturers while using only one pad geometry.
• Minimal use of real estate.
• Balanced copper density.
• Balanced copper exposure.
• Reduction of the tombstone effect to virtually zero, including associated defects such as PCB damage and missing components.
• A layout concept that extends beyond EIA 0402.
• Improved solder joints regardless of component termination.
Eric Reno is a product engineer at Suntron Corp. Learn more at www.suntroncorp.com.
By definition, the off-the-shelf modular instruments (VME, PXI, VXI, CompactPCI, PCI, etc.) used to create automated test and measurement systems are designed to be general purpose, programmable and flexible enough to handle a variety of input ranges and types, speeds and functions. At first blush, these modular, off-the-shelf instruments may seem ready to go for instrument manufacturers’ or test system designers’ needs. However, 100 per cent COTS solutions are generally inadequate for complex, commercialized measurement systems. Often these instruments, built into systems within a chassis or PC, are used in a laboratory or R&D environment, with techniques or information that was not previously available. “We build flexibility into our cards so they can deal with multiple conditions,” says Steve Krebs of KineticSystems, a company that manufactures CompactPCI/PXI and VXI data acquisition (DAQ) modules for test and measurement applications. “But we can’t anticipate everything, particularly in cutting-edge applications, so that’s where customization still enters into the equation.” Although nearly 90 per cent of the company’s revenue comes from off-the-shelf module sales, Krebs notes that an increasing number of customers require some level of hardware, software or firmware modification to fit the application. “With modular instruments there is this idea that you can buy different pieces from Vendor A, B and C and stick them together in a system and that’ll be it,” says Krebs.
“But once you put these modules together in a chassis, there can be issues with interoperability, input ranges, synchronization, signal amplitude or conditioning, and other performance characteristics.” And while an engineering staff may be able to handle some customization, modifications can be a time-consuming, expensive activity that consumes resources and detracts from a company’s core business. That’s where some customization from the product manufacturer can come in handy. KineticSystems recently installed an automotive component testing application. The system was for the automotive component-level development and testing division of a large company that supplies products to many different industries. The project involved an upgrade of the company’s existing data acquisition system for testing component-level assemblies for automobiles, in this case accelerator pedals. The company required a COTS solution to replace its existing instrumentation at lower cost and with the same performance. The ATE system was to be installed in multiple plants throughout the United States, and later fully standardized for export to other countries. The testing process involves temperature-controlled test chambers that simulate worst-case component environments. The accelerator pedals are mechanically cycled 24 hours a day for up to several months while the data acquisition system records the position and monitors the motion profile to ensure the pedal is performing as designed. In some instances, stress is also measured on the component under test. The original ATE system developed for this purpose was a proprietary, non-standards-based system no longer supported by its manufacturer. Later, the system was updated, but remained largely proprietary. “Every time they purchased a new system they made incremental improvements, but never came up with a standardized solution they could easily duplicate,” says Krebs.
“Now they were interested in a more economical solution that took less real estate and that they could standardize on globally.” The answer was a PXI-based solution, using four modular off-the-shelf instruments in a rack-mounted enclosure. Each instrument was a flexible eight-channel CompactPCI/PXI module with signal conditioning and an ADC. The company also wanted to perform more frequent calibration of the instruments in its own metrology labs. Typically, calibration is performed annually at the instrument supplier’s facility. However, calibrating in house would minimize downtime and expense, so the team developed a standalone software application to perform periodic calibration with pre-calibration and post-calibration report generation for NIST traceability. The customer also required the ATE to have the built-in flexibility to perform ad hoc data acquisition experiments without having to write any code. This was achieved through configurable data acquisition software, which provides access to all of an instrument’s capabilities and features through a simple point-and-click GUI. The software was further customized to include the ability to designate modules as master or slave, allowing simultaneous acquisition of multiple transducer signals across multiple modules.
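The in-house calibration workflow described above amounts to recording applied-versus-measured points before and after adjustment and reporting whether every point sits inside a tolerance. The sketch below illustrates that flow; the record layout, 1 mV limit and report fields are assumptions for illustration, not KineticSystems' actual software.

```python
from dataclasses import dataclass

@dataclass
class CalPoint:
    channel: int
    applied_v: float    # reference level from the metrology lab standard
    measured_v: float   # what the DAQ module actually reported

def within_tolerance(points, limit_v=0.001):
    """Pre- or post-calibration check: every point must sit within the
    assumed 1 mV limit for the instrument to pass."""
    return all(abs(p.measured_v - p.applied_v) <= limit_v for p in points)

def report(points, limit_v=0.001):
    """Minimal pre-/post-calibration record of the kind a traceability
    report would be built from (layout is hypothetical)."""
    return {
        "points": len(points),
        "worst_error_v": max(abs(p.measured_v - p.applied_v) for p in points),
        "passed": within_tolerance(points, limit_v),
    }
```

Keeping both the pre- and post-calibration reports is what makes the result traceable: the pre report shows the drift found, the post report shows the instrument was returned to tolerance.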
Modern HMIs are more than just high-tech replacements for push buttons. They are tools with advanced capabilities that allow them to increase equipment productivity at little or no increase in the cost of the HMI. However, achieving that added productivity depends, in large part, on choosing the right HMI for the job.
Touchscreen monitors are everywhere. The great thing about them is that they are extremely easy to use. A touchscreen functions like an invisible keyboard, but it displays only as much data and as many button choices as users need to complete a task. That explains their popularity in devices from ATMs to mall kiosks, and from hospital operating rooms to complex industrial machinery. The most important decision in selecting the best touchscreen monitor for your application is the type of touchscreen technology to use. There are several types, each with its own advantages and disadvantages. The three most common types are:
• Resistive technology: A resistive touchscreen monitor is composed of a glass panel covered with thin conductive and resistive metallic layers, separated by a thin space. When a user touches the screen, the two layers make contact at that point. The computer detects the change in the electrical field and calculates the touch point. Resistive touchscreens are generally the most affordable, but they offer only approximately 75 to 80 percent image clarity. The touch can be activated with nearly any type of object (stylus, gloved finger, etc.), but the outer surface can be damaged by sharp objects. Resistive touchscreen panels are not affected by dust or water on the surface; they are the most common type used today.
• Capacitive technology: In a capacitive touchscreen monitor, a layer that stores a continuous electrical charge is placed on top of the monitor's glass panel. When an exposed finger touches the screen, some of the electrical charge transfers to the user. This decrease in capacitance is detected and located by circuits at each corner, and the computer then determines the touch point. Capacitive touchscreens are a durable technology often used in kiosks, point-of-sale systems and industrial machinery.
Capacitive touchscreens have higher clarity than the resistive type (88 to 92 percent) and greater endurance (up to 225 million touches). However, capacitive screens can only be activated with an exposed finger (no gloves, pointers, etc.).
• SAW technology: SAW (surface acoustic wave) touchscreen monitors use a series of transducers and reflectors along the sides of the monitor's glass plate to create an invisible grid of ultrasonic waves on the surface. When the panel is touched, a portion of the wave is absorbed. The receiving transducer locates the touch point and sends this data to the controller. SAW touchscreens have no layers on the screen, enabling more than 90 percent image clarity, and can display high-detail graphics. They can be activated by a finger, gloved hand or soft-tip stylus. However, SAW panels are the most expensive of the three, and contaminants on the surface (moving liquids or condensation) can cause false triggers. Solid contaminants on the screen create non-touch areas until they are removed.
Other considerations
Other factors to consider in your selection process include:
Interface: Touchscreen panels must communicate with the computer. The most common interface types are RS-232 and USB. New HID-compliant touchscreen monitors eliminate the need for drivers.
Mounting: Options include panel mount, rack mount and free-standing. If free-standing, be sure the monitor uses a heavy-duty stand designed for touchscreens; standard tabletop bases will topple over.
Environment: Touchscreen monitors are available in standard, stainless steel and waterproof enclosures for a variety of environments.
Screen size: Touchscreen monitors are available from 3.5 to 52 inches. The most common sizes are 15 to 19 inches, and 32 to 42 inches for large control rooms. The aspect ratio (4:3 or 16:9) should also be considered.
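For the resistive type described above, the controller's "calculates the touch point" step amounts to scaling raw ADC readings into screen pixels using calibration values captured at the screen corners. The sketch below assumes a 12-bit controller and invented calibration numbers; real touch controllers and drivers differ.

```python
def touch_to_screen(raw_x, raw_y, cal):
    """Map raw resistive-touch ADC readings to screen pixels.
    'cal' holds hypothetical corner calibration values captured during
    a touch-calibration routine (x_min/x_max, y_min/y_max are the raw
    readings at the screen edges; width/height are in pixels)."""
    x = (raw_x - cal["x_min"]) * cal["width"] // (cal["x_max"] - cal["x_min"])
    y = (raw_y - cal["y_min"]) * cal["height"] // (cal["y_max"] - cal["y_min"])
    # Clamp to the visible area in case of noise at the edges.
    return (max(0, min(cal["width"] - 1, x)),
            max(0, min(cal["height"] - 1, y)))
```

A calibration routine exists precisely because the raw range rarely spans the full ADC scale; the assumed `cal` values below illustrate a panel whose edges read 200–3800 and 300–3700 counts.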
The type of touchscreen monitor you select will depend on many factors, including the type of data to be displayed (video, graphics, text), the intended users, the operating environment, and where and how it will be mounted. Chosen correctly, a touchscreen monitor will be an excellent addition to your system. Written by Herb Ruterschmidt, TRU-Vu Monitors, Inc.
"Tastiest Canadian beer." "Ontario's best micro brewery." "Best locally produced beer." These are just some of the accolades that Steam Whistle Brewing has earned since the first beer came off of its production line 11 years ago. With such praise, it's no wonder that the brewery's communications platform is to "do one thing really, really well" - a slogan describing the company's singular focus of making one beer of exceptional quality that Canadians can be proud of.
What are the latest trends in HMI? What should you look for or be aware of when you browse the market in search of new HMI solutions? The purpose of this white paper, from Beijer Electronics, is to outline and describe some of the most recent key trends in software-driven industrial HMI solutions.

Trend 1: HMI as an integrated part of a user experience

The significance of user interfaces has become increasingly clear over recent years. Think of Apple's products (e.g. the iPod or the iPhone) as iconic examples of how appealing and intuitive user interfaces have completely changed the perception of particular product types. The success of Apple's products and other consumer-oriented merchandise clearly shows that a common look and feel across products, graphics and environments contributes to brand distinction and consistent customer experiences. Many industrial corporations have reached the same conclusion and are starting to focus more on the quality of the user interfaces in their products. In many ways, the HMI is the front of a machine or process. The higher the level of functionality and interaction embedded in the HMI, the more the user interface reflects the essential experience of a machine or process. Tomorrow's successful HMI will lift the concept of an HMI solution from merely being a functional add-on to becoming an integral part of a user experience by adding the right look and feel. Design features will include the use of WPF (Windows Presentation Foundation) objects, scalable to any size without loss of picture quality, and the use of .Net objects found or purchased on the Internet. The use of templates and object styles facilitates efficient ways to ensure consistent, reusable design. Embedding all functional objects, including Windows media objects, in the desired screen design will further enhance a positive user experience.
Besides the competitive advantage for a machine builder, there are solid arguments even for end users to justify the investment in well designed, intuitive user interfaces. The added value of tomorrow's intuitive HMI solutions is reflected in ease of use, higher efficiency and productivity, reduced time to complete tasks, improved user satisfaction, trust in systems, and fewer user errors.

Trend 2: Innovation based on modern best-practice software technologies

The HMI evolution is driven by continuous software development backed by robust, high-performance panel hardware. Today, the panel hardware is considered a vehicle for the HMI software platform, allowing OEM design engineers to add value to their corporation's products with a variety of options for functionality and design features. The software platform is, therefore, a crucial element of an HMI solution. HMI software development is a costly and complex matter, and innovative HMI manufacturers will need to base their software platforms on modern, widespread technologies such as .Net to gain access to a sufficiently broad variety of tools and functionality. The same argument applies to the future maintenance and development of the technology platforms of HMI solutions. The resources behind .Net are enormous, which will be reflected in the continuous development of new functionality in HMI software. Dependence on proprietary technologies, or on technologies from smaller vendors, must be considered a risky strategy. HMI solutions based on Microsoft's .Net framework or similar technologies are likely to guarantee an innovative, future-proof tool with continuous updates and service support - something highly appreciated by OEMs with long-term strategies for their own products and external suppliers.
Trend 3: Open platform architecture for integrated solutions

HMI essentially integrates the operation of a machine or a process with feedback to and from the operator. One aspect is the quality of the graphic user interface and, connected to this, its usability. Another important aspect is the openness of the HMI solution. Is it easy or difficult to exchange essential information with different systems or controllers? Is the application code locked against customization of functions or objects? Will the runtime software be able to operate on different hardware platforms? Are design engineers able to use standard .Net objects in their projects? These are issues frequently discussed between customers and vendors. The open platform architecture of tomorrow's HMI solutions will offer a wide range of opportunities for OEMs to enhance the look, functionality and connectivity of applications in order to create unique products with substantial integrity. HMI solutions will be less proprietary and will offer increased freedom in the choice of runtime platform, from compact operator panels to industrial PCs from different manufacturers. It will be possible to create a scalable master project that can be applied to different controller brands and panel resolutions, with the advantage of only having to maintain one project. Engineers will demand the ability to use scripting tools (e.g. C# scripts) to customize the look or functionality of objects. The design tool will offer the possibility to import third-party objects and .Net controls. Freedom in connectivity and communication is the hallmark of a truly open HMI solution, and will include a variety of options ranging from simple real-time exchange of data between controllers up to SQL and OPC communication with other equipment and IT systems.

Summary

HMI solutions are in a state of change.
Industrial user interfaces take inspiration from consumer-oriented products such as mobile phones and MP3 players, with advanced 3D-style graphics and icon-based navigation and controls, resulting in user-friendly, intuitive user interfaces. Trend-setting HMI solutions will support this mindset with state-of-the-art graphics and functionality fully embedded, providing well designed, intuitive user interfaces based on flexible, widespread, modern software technologies and a truly open platform architecture. Graphic user interfaces do not necessarily have to include advanced graphic solutions; simplicity and consistency often beat complexity and overly artistic solutions. However, the design process very often benefits from co-operation between graphic designers and application engineers. www.beijerelectronics.com
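The white paper's point about SQL connectivity - an open HMI platform logging live tag values to a database that other IT systems can query - can be sketched in a few lines. This is purely an illustration: the platforms discussed are .Net/C#-based, and the tag names and in-memory SQLite database here are hypothetical stand-ins.

```python
# Illustrative sketch only: logging HMI tag samples to a SQL table so that
# other equipment and IT systems can query them. Python's stdlib sqlite3
# stands in for a plant database; tag names are hypothetical.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")  # a real system would target a plant database
conn.execute("CREATE TABLE tag_log (ts TEXT, tag TEXT, value REAL)")

def log_tag(tag: str, value: float) -> None:
    """Append one timestamped tag sample to the log table."""
    conn.execute(
        "INSERT INTO tag_log VALUES (?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), tag, value),
    )
    conn.commit()

# Hypothetical tag values as they might arrive from a controller
log_tag("Line1.TankLevel", 73.4)
log_tag("Line1.PumpSpeed", 1480.0)

rows = conn.execute("SELECT tag, value FROM tag_log ORDER BY tag").fetchall()
print(rows)  # [('Line1.PumpSpeed', 1480.0), ('Line1.TankLevel', 73.4)]
```

The same pattern - timestamped tag rows in a shared table - is what lets reporting and MES tools consume HMI data without any proprietary interface.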
EAO Switch Corporation, a Milford, Conn.-based global supplier of HMI components and systems, has released a white paper on the effective design of HMI systems. Entitled Human Machine Interface Systems, the paper reviews the importance of good HMI system design and its link to the success of products in all market segments. It also covers the different factors that should be considered in creating an HMI system, including how to define HMI functionality and identify who the operator will be; selecting the appropriate control technologies (from pushbuttons to touchscreens); selecting the appropriate communications technology (from hard wiring to bus implementation); identifying and meeting relevant standards; and applying HMI best practices to applications ranging from medical to industrial. The paper reviews the importance of a well-designed HMI system, which gives an operator active functions to perform, feedback on the results of those actions, and information on the automatic functions of the system as well as its performance. Also reviewed is the way in which HMI control panel assemblies and subsystems integrate a wide variety of switching, data input, indicator, alarm, communications, and other electromechanical components and technologies. For more information, visit www.eao.com.
Automation technicians are constantly challenged to keep instrumentation loops and I/O working at peak efficiency in the least amount of time possible. While multiple tools have typically been needed to perform various troubleshooting tasks, today's multi-function instruments, such as mA process clamp meters, allow technicians to perform a wide range of tests with outstanding accuracy and efficiency, while cutting down on the number of instruments needed to do the job.
Migrating an agrochemical producer's existing installed base of pH analyzers to Endress+Hauser's Memosens and Liquiline technology platforms increased pH data reliability, decreased downtime, and increased production, according to Endress+Hauser. Maintenance labour was reduced by 50 percent and consumable usage was cut by 60 percent, producing savings of more than $450,000 US per year. The agricultural chemical manufacturer produces and supplies agrochemicals for worldwide markets. The manufacturing plant uses pH measurements to monitor and control reactions at more than 30 locations in utilities, incinerators, scrubber effluents, pH quench tanks and wastewater final effluents. Additional pH measurement and control locations are used in chemical synthesis reactors on recirculation lines to confirm and/or control pH for chemical synthesis. Depending on the criticality of processes, some pH measurements required closer attention and more maintenance than others. For example, the 80 units installed in the phosphatization and sulfonation processes required calibration twice a week. Typically these calibrations required about 45 minutes each to perform, resulting in approximately 120 hours per week, or 6,240 hours each year, to maintain calibration. The agrochemical producer made immediate improvements by replacing existing pH electrodes with Endress+Hauser Memosens pH electrodes with Teflon reference junctions. Included with the new electrodes were new electrode holders and cabling technologies that eliminated previous reliability problems, such as poor connections and resulting electrical faults. The Liquiline platform, which supports electrode calibration in the lab away from the field, was also installed. With this platform, the customer was able to rotate electrodes every two weeks with pre-calibrated electrodes. These electrodes also lasted longer - several months as opposed to a few weeks.
The Liquiline platform extends to the lab, so calibration no longer needs to be performed in the field. Electrodes can now be continually cleaned, calibrated and even regenerated under controlled conditions in the lab. As a result, technicians can replace installed electrodes with calibrated ones in a fraction of the time. Endress+Hauser's Memosens technology increased the plant's average usage per electrode from 748 hours to more than 2,500 hours. With these changes in place, yearly maintenance hours decreased from 6,240 to 3,200, for a net savings of 3,040 hours. Through proper electrode, holder and cabling selection, and the use of Memosens and Liquiline technologies, the customer saved more than $450,000 US per year in labour and materials to maintain its pH systems. Numerous opportunities for saving time, cost and materials in many other processes were identified throughout the plant, and implemented by replacing outdated analog pH loops with Endress+Hauser's inductively coupled Memosens and Liquiline technology platform.
A new era of the human-machine interface (HMI) is upon us. Stricter safety and compliance standards have persuaded many operating companies to replace their "traditional" graphics with high-performance HMIs - a move that is sure to make life easier for plant operators everywhere.
Make IT Secure 2019: Cybersecurity in Manufacturing
April 25, 2019
Partners in Prevention 2019
April 30 - May 1, 2019
Advanced Design & Manufacturing (ADM) Canada
June 4-6, 2019
PDTA Canadian Conference
June 5-7, 2019
APMA Annual Conference & Exhibition 2019
June 12, 2019
Avnet IoT Workshop
June 16, 2019