Dec. 3, 2014 - Growth in the global product lifecycle management (PLM) software market continues to depend strongly on regional economic and industrial trends and is experiencing significant variability. Currently, business activity is concentrated in certain global regions, most notably North America, and in certain industrial sectors, most notably upstream oil and gas. While overall growth in the PLM market slowed slightly from 2012 to 2013, offsetting factors and technology trends will drive growth in the coming years, according to a recent ARC Advisory Group report, Product Lifecycle Management Global Market Research Study.
Oct. 24, 2014 - If you want to be a programmer, but don’t want to spend the money on the tools, you have some options.
Sept. 29, 2014 - AutomationDirect’s “Point of View” (POV) Windows-based software is a feature-rich industrial HMI with SCADA tendencies, designed for machine control as well as plant-floor enterprise use. To be clear, AutomationDirect has licensed the core of a third-party technology and added value to it to create POV.
Dream Report, from France-based Ocean Data Systems, is marketed as a real-time report generator for industrial automation. Interestingly, the company is trying to get Dream Report recognized as the global standard for reporting in automation. This is a new development and, while it is unclear to me what this really means, the technology is worth examining.
My last column dealt with Rockwell Automation’s Connected Components Workbench (CCW). But there is much more to say, so this is Part 2 of my review.
I have often wondered, “What’s the big deal about safety? There can’t be that much to it!” In fact, I’ve actually said those words to Kevin Pauley, the Canadian manager for Pilz Automation and Safety, whom I have known forever. And that’s when he educates me.
I have been in the PLC and software game since 1977. I have seen a lot of things transpire between then and now when it comes to programming software.
In past columns I have covered many HMI/SCADA products that have a web browser interface for various devices. And, with the advent of the cloud, there may be many more reasons to have a browser interface available to you and the plant floor.
Well, the HMI marketplace may have taken another turn. We have always had definite-purpose devices in the past, but in recent years the industry has moved away from that toward a more flexible, can-do-anything approach.
This isn’t a software column as such. But it is. I had just finished writing a four-part series on technology and its effects on our economy when, suddenly, a thought struck me. We have been using PLCs for 40+ years. Tons of things have changed, and some things haven’t. Any PLC or PAC is doing the same job it did years ago, but now it’s for a different reason: business. The business of technology used to be fun. In my last column, I talked about the Raspberry Pi and how much fun it was going to be to get back into bit-banging mode. So far it’s not all that much fun, because I am realizing that I am far behind the curve.

I grew up with PLCs, as did most people my age in my industry. The changes that occurred were phenomenal. We went from an expensive, slow relay-replacement appliance to a compact, fast, “I can do anything” device/node. Before laptops, the industry used definite-purpose devices for interfacing. Remember that, all along, PLCs used a programming language; you just needed a device to enter it.

There are varying accounts of how it all got started, but Dick Morley and his company Modicon (Bedford Associates at the time) created a technology appliance that could do what the big boys were asking: replace relays. The technology was sold on troubleshooting and on the fact that anyone in your maintenance department who knew relay logic could program and work with PLCs. Why? The language used to program them was ladder logic, the same representation that electricians had seen for years on D-size drawings.

In the ’90s, individual software companies were popping up everywhere. But after so many years, permutations and false starts on successors, the PLC (and its ladder logic) is still the king. Morley wrote in the mid ’90s that ladder logic would erode in time. It seems it hasn’t. It works, plain and simple. There are more than 200 vendors of PLC hardware, and not one of them has abandoned ladder for another programming solution. Remember fuzzy logic?
Today, it’s nowhere to be found. So here we are. Companies are delivering solutions with PCs, dedicated hardware, fancy screens and a full array of options. You can even connect to your nuclear controller over the Internet! Comforting. But the one consistent theme in all of this is ladder logic. It’s simple, robust and extendable and, once you have learned it, it provides a career that may be more sustainable than most, or at least some. What’s old is new again. This article originally appeared in the June 2013 issue of Manufacturing AUTOMATION.
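The seal-in (latching) rung is the classic illustration of why electricians took to ladder logic so readily. As a minimal sketch, the same rung can be expressed outside a PLC; here is one in Python, with all names my own and purely illustrative: a motor output latches on through its own contact until the stop contact opens, evaluated once per “scan.”

```python
def seal_in_rung(start_pb, stop_pb, motor):
    """One scan of the classic start/stop seal-in rung.

    The motor latches on through its own contact (the 'seal-in')
    until the stop pushbutton opens the rung.
    """
    return (start_pb or motor) and not stop_pb

# Simulate a few PLC scans.
motor = False
motor = seal_in_rung(start_pb=True, stop_pb=False, motor=motor)   # Start pressed
motor = seal_in_rung(start_pb=False, stop_pb=False, motor=motor)  # Start released
print(motor)  # True: the rung has sealed in
motor = seal_in_rung(start_pb=False, stop_pb=True, motor=motor)   # Stop pressed
print(motor)  # False: the latch drops out
```

One boolean expression per rung, evaluated top to bottom every scan: that simplicity is a big part of why ladder has outlived its would-be successors.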
Raspberry Pi Single Board Computer
Cost: from $35.00
Vendor: Element14.com

While my columns deal with software exclusively, I have often paired hardware and software where I believe they can provide benefit to you, the reader. I first ran into the Raspberry Pi single board computer (SBC) six months ago. I had been looking for a Windows Embedded SBC for some work I was doing. I didn’t want to spend an arm and a leg, but it seems you have to if you want Windows.

The Pi is a Linux-based, ARM-powered processor board with HDMI video, network connectivity, sound and USB ports. It also has headers for special-purpose interfacing. Installing the OS on the SD card was fairly simple, even for me! Using a Windows-based computer, the OS image file is written to the SD card. Insert it into the Pi and fire it up. The Pi boots to a command prompt after you log in. It has a built-in X Window interface that can be invoked by typing ‘startx.’ You can set up the user as an admin, which allows for file manipulation, or as a basic user.

The really cool thing about this environment is that it can take you back to the days of being able to really control the device at a register level. This may allow you to do various things that you may not have been able to do before. The amount of community support for Linux and the Pi is astronomical. I believe the Pi was originally intended as a STEM (science, technology, engineering, math) educational device and for hobbyists, but it can do much more than control a crane. How about 3-D printing? Or a low-cost device for kids at home to run a virtual session on a VMware server? Or a remote terminal?

The Linux OS command set is vast, and the OS image you install determines what is available. ‘vi’ is the editor for text files, for instance, and it really isn’t intuitive. While the Pi may be child’s play, the STEM community can use it as a springboard for young ’uns to learn a bit more about engineering applications.
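The image-writing step described above is a one-liner on a Linux or Mac machine (on Windows, a tool such as Win32DiskImager does the same job). This is a sketch only: `raspbian.img` and `/dev/sdX` are placeholders for your downloaded image and your SD card’s actual device name, and writing to the wrong device will destroy its contents.

```shell
# List block devices first to identify the SD card (e.g. /dev/sdb).
lsblk
# Write the OS image to the card; /dev/sdX is a placeholder.
sudo dd if=raspbian.img of=/dev/sdX bs=4M
sync   # flush all writes before removing the card
```

Once the card boots in the Pi and you log in at the command prompt, typing ‘startx’ brings up the X Window desktop.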
Want ice cream with that PI? This article originally appeared in the May 2013 issue of Manufacturing AUTOMATION.
Well, I left you high and dry in my last column on VMware Hypervisor, for which I apologize. I hope I don’t disappoint you with this next installment on the implementation of the Hypervisor server. I left you with the Hypervisor installed and the virtual machine (VM) converter loaded and used, along with a third product, vSphere, which is used to monitor and manage the installed VMs on the Hypervisor. The concept of a data store was introduced: the place where VMs are either created or copied to. While Hypervisor supports RAID configurations, the data store should be local due to bandwidth and network issues.

To give you a better idea of the possible importance of this technology, I have more than 15 VMs created and stored on a local 2TB hard drive. I have had seven running at the same time without any degradation on the local Windows machine that runs the vSphere interface. All of the horsepower is on the Hypervisor server and, since it isn’t really running Windows itself, it’s pretty quick! The VMs range from my compiled development environment to SCADA and HMI development, graphics and multimedia support, and customers’ machines that have been virtualized so that I can test all of my software as if I were at customer sites. I have Windows NT, 95, 98, 2000 and 7 as VMs so I can test anything with any operating system. (Windows 8 is on its way.)

While this is a boon for developers, any plant, OEM or integrator can make use of this technology right now. Each VM can be configured with all the standard hardware components that the operating system on the VM can support. It uses the available hardware from the host, such as a CD-ROM drive and USB ports. To support my customers, I develop in one VM, copy the final code to a USB stick, connect the stick to the VM that connects to the customer and transfer the code directly. Previously, all applications had to be on the same physical machine.
Now various applications can reside on a server in a nice environment, and you no longer need industrially hardened hardware out on the floor. A simple interface device like a thin client is all that’s needed. This improves security and maintainability, as well as reducing costs. Not bad so far, eh?

So imagine five vSphere clients on the floor. The five VMs are on a Hypervisor server. Data logs and databases need to be common, so you can create a sixth VM that acts as the server for all the data. The real issues are access to the outside world and backing up that data. The outside world can access the data just as if it were on a physical machine. The ‘server VM’ has its own IP address on the network and can respond to requests as normal.

Once you have your server set up with the VMs you want and need, the new part of the challenge is to back up these VMs so that, if a VM gets corrupted due to registry issues or common Windows driver issues, you can recover it from the backup file created. There are two ways of creating backups for your VMs. The snapshot manager is a manual option that allows you to take and store a snapshot of the VM that you can restore to if needed. Best practice suggests a scheduled backup. I have used Acronis True Image on local machines for years, and now use Acronis vmProtect for VMware Hypervisor. vmProtect installs as a web service and thus uses your default browser as its interface. Once the license and the connection to the Hypervisor are set up, the only thing left to do is create the backup task(s). You can back up the server itself and all of its configurations, or any combination of VMs. It really is as easy as pie. The backup images can be located anywhere on the network, so this is where a RAID NAS (network-attached storage) can come into play. Once you set up the task, you are done.
Because Acronis vmProtect interacts with the Hypervisor directly, the VM doesn’t have to be running; if it is, vmProtect will still back up the VM along with any dynamic data on it. You can also extract individual files from the backup image, replicate the VM or even run the VM directly from the backup image. You could use Acronis True Image locally on each VM instead, but with the multi-backup configuration of vmProtect, you can’t beat the convenience and reliability. Any deployment of Hypervisor must include recovery, and this approach is easy and reliable. Look for applications that can benefit from virtualization, and leave the backup to Acronis. In my experience, it is the only way to go. Welcome to 2013! This article originally appeared in the March/April 2013 issue of Manufacturing AUTOMATION.
As I walked into the classroom at Durham College for the first time in 1986, I was terrified but oddly confident; kinda like a push/pull feeling. I was a “professor.” I had worked for Allen-Bradley for just less than 10 years, just short of vesting my pension, so, yes, you guessed it, I am no spring chicken! But because of the training and guidance I received from the experienced people at the plant in Cambridge, Ont., I have forged an enviable career.

Funny how it all started. Back when I was in high school, grade 13 was for the brainers. I fooled myself into thinking that I was one of them, and took three math credits, chemistry, physics, history and English. Believe it or not, I achieved very high marks in the first five subjects. The other two… well, not so much. After high school, I went to Ryerson to study controls engineering, before Ryerson became a university. They were the best three years of learning I could ever imagine. I met some lifelong friends, was awarded ‘best all-round technologist’ in fourth semester, and I wondered why and how. I figured it was because I played in a band from Friday night ’til Sunday… so my effective study time was short.

Back then, Allen-Bradley, as a company, really did consider its people very important. They hardly ever fired anyone. They trained their way through the impasse. Novel idea. When I was going to leave the company, the VP of sales met with me and my immediate manager to change my mind. I still wonder what they saw in my skills and/or personality, but I have to tell you, it felt pretty good to be “courted.” I feel the only reason this meeting happened was because of the training and experiences I had from high school to Ryerson, to the AB training program and, of course, my own insatiable thirst to learn. I got involved in everything. I have only missed one ISA technical conference in 23 years. I have presented at most. At one point I was receiving 26 publications, and I read them all.
(Note: there aren’t that many around anymore!) I produced the first ‘learning’ newsletter called The Software User. Steve Rubin of Intellution fame asked me where I made my money. My dumbfounded look said it all. It really isn’t about the money – it is about helping and guiding. The feeling of sharing and guiding is priceless. Our college system is failing us. Our apprentice system is failing us. Our employers are failing us. Hopefully we are not failing ourselves. This article originally appeared in the January/February 2013 issue of Manufacturing AUTOMATION.