By Peter Varhol
In IT, virtualization has been a hot technology for several years. It refers to the ability to capture an image of an operating system and its applications, and to load that image on demand onto a computer for use. The computer can be a server, supporting a server-based application such as business management software, or it can be a desktop system, supporting individual users. In the latter case, the computer image is stored on a server, and downloaded when a user turns on the computer.
Think of the computer as an empty shell. It has one or more fast processors, memory, and storage (or access to networked storage), but no operating system or software. The operating system and applications are stored in a single file, and can be loaded onto that empty shell on demand and run, just as though it were always installed and available.
On the server, virtualization takes advantage of slow periods of application use to consolidate applications on fewer servers, reducing cost, IT management effort, and electricity use. On the desktop, virtualization can save on both hardware- and desktop-support costs.
Mission-critical engineering applications are more likely to run on desktops than on servers, though they may store their data in a server-based relational database that can itself be virtualized. In desktop virtualization, the entire desktop image is stored on the server and downloaded over a fast Ethernet connection when you turn on your computer. With a gigabit-class network, loading and running the image may even take less time than launching a locally installed operating system. At the end of the day, the current version of the image is saved back to the server, and the desktop system once again becomes an empty shell.
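To put that load-time claim in rough perspective, here is a back-of-envelope calculation. The image size and effective throughput below are illustrative assumptions, not figures from the article; in practice, virtualization systems typically stream only the blocks needed at startup rather than the full image, which is what keeps boot times short.

```python
# Rough estimate of how long a full desktop image would take to transfer
# over a gigabit Ethernet link. All numbers are illustrative assumptions.

GIGABIT_BPS = 1_000_000_000   # nominal gigabit Ethernet line rate, bits/sec
EFFICIENCY = 0.6              # assume ~60% effective throughput after protocol overhead
IMAGE_SIZE_GB = 20            # assumed size of an OS + applications image

effective_bytes_per_sec = GIGABIT_BPS * EFFICIENCY / 8
transfer_seconds = IMAGE_SIZE_GB * 1_000_000_000 / effective_bytes_per_sec

print(f"Estimated full-image transfer time: {transfer_seconds:.0f} seconds")
```

Under these assumptions a full transfer takes several minutes, which is why on-demand streaming of the image matters for a responsive startup.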
Given this approach to systems management, there may be value to engineers in desktop virtualization. One of the biggest issues any computer user faces is keeping software versions correct, for both application software and device drivers. A virtualized desktop infrastructure enables IT to take direct responsibility for keeping applications and drivers up to date, relieving engineers of that task.
It also means that you won’t have to wait for desktop support when something goes wrong. Most problems can be fixed from a centralized IT location, often before the user is even aware of them. This kind of proactive support keeps engineers working even as problems arise on their systems.
Still, many engineers, myself included, are used to having all of our applications on our desktop, and to customizing it with additional applications, utilities, and appearance settings so that it is both uniquely ours and as productive an environment as possible. Your ability to do that under a virtualized environment depends on how much control IT is willing to give individual desktop users.
If you typically use a laptop as your engineering workstation, and take it with you to work at home or during travel, desktop virtualization doesn’t make any sense. You need your operating system and applications on the local system at all times, because there won’t be a fast Ethernet connection to your storage from outside the office.
Overall, if design engineers work in an office environment where their applications are standardized and everyone is doing similar work, desktop virtualization may make sense. But most engineers need a customized environment to be most productive, and desktop virtualization isn’t the best way to get there.
Contributing Editor Peter Varhol covers the HPC and IT beat for DE. His expertise is software development, math systems, and systems management. You can reach him at DE-Editors@deskeng.com.