By Peter Varhol
Engineers have always had to learn about computers, networks, and technical system configuration in order to apply those computers in their jobs. Years ago, in an era of analysis through batch computer codes (usually in Fortran), you simply submitted your deck to the computer operator and came back the next day to get a printout of the results.
You did have to know something about Fortran programming, but if you were using a commercial deck, or even a deck that was written by someone else in the company, that knowledge didn’t have to extend beyond how to enter and format the data via the keypunch.
Today, with the power of personal workstations or even supercomputers, you can accomplish much more, and in ever-shorter periods of time. These systems are far more interactive and visual, letting engineers use their creative abilities along with their analytical abilities to design better products more quickly.
The more you know about the computers you use, the more productive you'll be.
However, as is always the case in computing, there is a tradeoff. Because so much is under the control of the engineer, it's necessary to understand that capability in order to apply it correctly. So it becomes important to know what multiple cores are and how they can help you. IT concepts such as clusters, multigigabit Fibre Channel networks, and GPUs also help engineers in running applications, analyzing data, and executing simulations.
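To make the multicore point concrete, here is a minimal sketch of how independent analysis steps can be spread across cores. The function names are hypothetical, and `stress_point` is only a stand-in for a real per-element computation; the point is the pattern, not the arithmetic.

```python
import math
from concurrent.futures import ProcessPoolExecutor

def stress_point(n):
    # Hypothetical stand-in for one independent analysis step;
    # a real solver would do far more work per element.
    return math.sqrt(n) * math.sin(n)

def analyze_parallel(values, workers=None):
    # One worker process per core by default. Each worker picks up
    # chunks of independent computations, so wall-clock time shrinks
    # roughly with the number of cores for CPU-bound work.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(stress_point, values, chunksize=256))

if __name__ == "__main__":
    results = analyze_parallel(range(2000), workers=2)
    print(len(results))
```

The approach only pays off when the per-element work dwarfs the cost of shipping data between processes, which is exactly the kind of judgment an engineer choosing a configuration needs to make.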
Of course, that doesn’t necessarily mean you’re the one who has to set it up and get it to work. Most companies have at least small IT teams who are responsible for setting up networks and systems, and installing software on those systems. For serious engineering work using large systems and sophisticated software, a skilled and dedicated IT team will support the design effort.
So what is different for the engineer? First, you have to know what kind of system to ask for to best do your job. Is it a proprietary Unix workstation, a multicore PC, or a multicore GPU system? Do you need a single workstation, a cluster, or networked computers with the ability to draw upon underutilized processor cycles on the network? You are often given the authority to recommend a specific configuration; do you know enough to do so? In many cases, your design software can be optimized to work in a specific configuration; you have to be able to get as close to the best configuration — processor, video card, and so on — as possible.
Once you have the computer, operating system, and network, your work isn’t done. Most personal systems have to be kept in tune, with periodic updates of antivirus software, disk defragmentation, and performance scans. And when things go wrong, you are likely to have to do some of your own troubleshooting, at least for the sake of expediency.
You also have to know how to use the computer to get the best out of it with the software you’re using. This doesn’t matter nearly as much to the typical user of business applications, who runs their PC at an average CPU utilization of around 10 percent. But when engineering design, analysis, and simulation software use up large and sustained chunks of multicore CPUs and churn the disk incessantly, you can’t afford to waste time with a poorly configured machine.
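If you want to see where your own machine sits relative to that 10 percent figure, a rough utilization check is easy to script. This is a hedged sketch that assumes a Linux-style /proc filesystem and returns None anywhere else; the sampling interval and the treatment of iowait as idle time are simplifying choices.

```python
import time

def cpu_utilization(interval=0.5):
    """Estimate system-wide CPU utilization over `interval` seconds by
    sampling the aggregate "cpu" line of /proc/stat (Linux only).
    Returns a fraction in [0, 1], or None if /proc/stat is absent."""
    def snapshot():
        with open("/proc/stat") as f:
            fields = [float(x) for x in f.readline().split()[1:]]
        idle = fields[3] + fields[4]   # idle + iowait jiffies
        return idle, sum(fields)
    try:
        idle_a, total_a = snapshot()
        time.sleep(interval)
        idle_b, total_b = snapshot()
    except OSError:
        return None  # no Linux-style /proc filesystem
    busy = (total_b - total_a) - (idle_b - idle_a)
    return busy / (total_b - total_a) if total_b > total_a else 0.0
```

Sampled while a simulation is running, a reading that stays pinned near 1.0 on every core suggests you are CPU-bound; a low reading with heavy disk activity suggests the bottleneck is elsewhere in the configuration.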
You might consider IT tasks such as keeping your computer in good running order to be a distraction from your real job. In reality, they are essential to the efficient use of your time. You may no longer have to understand Fortran codes, but you'll do yourself a favor if you take the time and effort to learn about computer architectures, operating systems, and computer maintenance.
Contributing Editor Peter Varhol has been involved with software development and systems management for many years. Send comments about this column to DE-Editors@deskeng.com.