By Eric Starkloff, National Instruments
Engineers and scientists around the globe are building their own measurement applications using a PC, plug-in data acquisition hardware, and application software rather than stand-alone box instruments. Since the early ’90s, when stand-alone instruments were the norm, a few factors have emerged to make virtual instrumentation (VI) the preferred method for taking measurements.
The first is the compounding performance improvement of processors and hardware. VI has benefited greatly from Moore’s law. With processing power doubling approximately every 18 months, engineers and scientists have rightly come to expect future performance gains. Think back 10 years, when the 486 DX2 started shipping to customers in volume. It ran at 75MHz and crunched 50 million instructions per second (MIPS). A typical system at the time came with 8MB of RAM and a 160MB hard disk and sold for between $2,200 and $2,500. Today, processors run at 3.6GHz, crunch at greater than 1000 MIPS, and are available on a system with 512MB of RAM and 80GB of hard disk space for about $1,000. Your investment today in creating your own computer-based instrument will pay off in the future as hardware performance continues to improve.
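The Moore's-law claim above is easy to sanity-check against the article's own figures. This short Python sketch (illustrative only; the 18-month doubling period is the rule of thumb the article cites) computes the predicted decade-long speedup and compares it with the observed clock and MIPS multiples:

```python
# Illustrative only: compare the article's observed 10-year gains with what
# Moore's law (performance doubling roughly every 18 months) would predict.

def moores_law_factor(years, doubling_period_years=1.5):
    """Predicted performance multiple after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

predicted = moores_law_factor(10)   # roughly a 100x gain over a decade
observed_clock = 3600 / 75          # 75MHz -> 3.6GHz: 48x
observed_mips = 1000 / 50           # 50 MIPS -> 1000+ MIPS: 20x

print(f"predicted ~{predicted:.0f}x, "
      f"observed {observed_clock:.0f}x clock, {observed_mips:.0f}x MIPS")
```

The observed multiples land below the idealized curve, which is typical: clock rate and MIPS capture only part of the transistor-count growth Moore's law actually describes.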
Another telling reason for VIs taking the upper hand is that more engineers and scientists are developing practical computer skills. Technological literacy has become fundamental to a person’s ability to navigate through society, and in the manufacturing and process industries, that importance is compounded by the professional demands on each individual. We have all become more technologically savvy over the last 10 years; just visit your local elementary school for proof.
Two years ago, 77 Vanderbilt University engineering students were asked to perform standard experiments with both conventional benchtop and computer-based instruments. The results showed that students agreed with the statement: "Computer-based instruments are user friendly and easier to use." Clearly, perceptions have changed, and overall, greater technical literacy and programming skills are contributing to the adoption of computer-based VI using application software such as NI’s LabVIEW or Microsoft Visual Basic.
On top of this increasing technical savvy is the wide variety of easily accessible information. In 2004, International Data Corp. forecast that there were more than 500 million Internet users worldwide (less than 10 percent of the world population). That’s a dramatic climb since 1998, when some 142 million people used the Internet. It’s likely that the vast majority—possibly well over 85 percent—of engineers and scientists use the Internet. And, on a daily basis, more and more technical content of the "do it yourself" variety appears on the Web, essentially accelerating the use of computer-based measurements. From low-cost optical power measurements to building your own DNA microarray system, a Google search for a specialized measurement, especially in combination with terms such as "LabVIEW" or "virtual instrumentation," turns up thousands of examples and documented computer-based measurement techniques.
Ultimately, today’s computers and software work the way we expect them to. Just 10 years ago, it was relatively difficult to install a mouse and its software driver and count on it all working. Today, developing a measurement and automation tool with a PC, plug-in data acquisition hardware, and software is low in cost, flexible, and reliable. Computer-based measurements have become the standard for developing design verification, manufacturing test, and process control systems. Virtual instrumentation is now virtually mainstream.
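The pattern the article describes—acquire with plug-in hardware, then analyze and present in software—can be sketched in a few lines. This Python sketch is a minimal illustration, not NI's implementation: the `acquire` function is a stand-in that generates simulated voltage samples, where a real system would call a vendor DAQ driver, and the text "front panel" stands in for the graphical indicators a tool such as LabVIEW would provide.

```python
import math
import random
import statistics

def acquire(num_samples=1000, sample_rate=10_000.0):
    """Stand-in for a plug-in DAQ driver call: returns simulated voltage
    samples of a 50 Hz sine wave with added measurement noise."""
    return [math.sin(2 * math.pi * 50.0 * n / sample_rate)
            + random.gauss(0.0, 0.05)
            for n in range(num_samples)]

def analyze(samples):
    """Software replaces dedicated front-panel electronics: compute the
    basic figures a benchtop meter would report."""
    return {
        "mean": statistics.fmean(samples),
        "rms": math.sqrt(statistics.fmean(s * s for s in samples)),
        "peak": max(abs(s) for s in samples),
    }

def present(results):
    """A text 'front panel'; a graphical tool would draw indicators."""
    for name, value in results.items():
        print(f"{name:>4}: {value:+.3f} V")

present(analyze(acquire()))
```

Because the acquisition, analysis, and display stages are ordinary software, swapping the simulated source for real hardware, or the text display for a chart, changes one function rather than the whole instrument; that flexibility is the point of the virtual-instrument approach.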
Eric Starkloff has worked at NI for seven years. He has a B.S. in Electrical Engineering from the University of Virginia. You can contact him about this article c/o Desktop Engineering Feedback.