Evolution of High-Performance Computing

We live in a Golden Age of design engineering on powerful desktop computers.

By Peter Varhol

 
Since its origin, the PC has incorporated an expansion bus that enables engineers to seamlessly combine graphics, networking, storage, and other peripherals.

Engineers always knew personal computers would become powerful enough to do serious design, modeling, analysis, and development work. When Intel introduced the 32-bit 80386 microprocessor in 1985, engineering software firms such as Autodesk raced to bring their products to PCs through extended DOS or one of the PC-compatible UNIX distributions.

Expense was a factor: Systems configured with the memory, graphics, processing power, and disk storage to support modest engineering applications cost between $10,000 and $20,000, not including the software license. But even that was far less than the cost of the only slightly more powerful workstations and minicomputers from the likes of Digital, Apollo, Sun, and HP. The era of desktop design engineering had begun.

The mid-1990s heralded the origins of inexpensive desktop engineering computing. With the emergence of 32-bit operating systems like Microsoft Windows 95, running on Pentium processors, software gained access to multitasking and virtual memory, which made it possible to run applications larger than the available physical memory.

The primary roadblock to faster software at this time became bus speeds: the ability to move data from peripherals to memory, from disk to memory, and from memory to the processor. To maintain PC compatibility, these buses ran at a sedate speed for far longer than was technologically necessary. Fortunately, advances in bus speeds eventually made it into the standard PC architecture, leading to memory bus speeds of more than 1GHz today.

Networking was another key innovation that brought desktop computing closer to engineers. Sun Microsystems (since acquired by Oracle) was founded on the premise that “the network is the computer,” meaning that network resources made the desktop workstation (or X terminal) more powerful than the resources in a single box. PCs followed slowly, first with file server networks based primarily on Novell NetWare, then with Windows networking.

A third innovation was cache memory. Cache is fast memory that holds recently used code and data in the expectation that they will be used again soon, and it appears at several stages of the execution pipeline: on the processor, on the disk controller, and on the graphics controller. That principle of locality isn’t universal, but it holds often enough that cache greatly speeds up most applications. Because the processor is much faster than main memory, and the bus between them slows things down further, keeping a small amount of fast memory adjacent to the processor makes it run far more efficiently.
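
To see the principle in action, consider a small C sketch (illustrative only; the array size, stride, and timings are assumptions that vary by machine). It reads the same array twice, once sequentially and once with a large stride; the sequential pass, which uses each fetched cache line fully, typically finishes several times faster even though both loops touch the same number of elements.

```c
/* Toy demonstration of cache locality (illustrative only). Both loops
 * perform the same number of reads, but the sequential pass reuses each
 * cache line while the strided pass misses the cache on nearly every access. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)   /* 16M ints (~64 MB): larger than on-chip caches */
#define STRIDE 4096   /* far apart enough to defeat the cache */

int main(void) {
    int *a = calloc(N, sizeof *a);
    if (!a) return 1;
    long sum = 0;

    clock_t t0 = clock();
    for (int i = 0; i < N; i++)
        sum += a[i];                      /* cache-friendly: sequential */
    double seq = (double)(clock() - t0) / CLOCKS_PER_SEC;

    t0 = clock();
    for (int s = 0; s < STRIDE; s++)
        for (int i = s; i < N; i += STRIDE)
            sum += a[i];                  /* cache-hostile: large stride */
    double strided = (double)(clock() - t0) / CLOCKS_PER_SEC;

    /* Printing sum keeps the compiler from optimizing the loops away. */
    printf("sequential: %.3f s, strided: %.3f s (sum=%ld)\n", seq, strided, sum);
    free(a);
    return 0;
}
```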

 
Sixty-four-bit processors, such as this six-core AMD Opteron, bring supercomputer-level performance to desktop engineering computations.

Last but not least, 64-bit processors have all but replaced 32-bit chips across the board, and have been in engineering workstations for some time. Sixty-four-bit processors have the potential to be faster because they can move more data per clock cycle. More importantly, they can address far more memory than their 32-bit counterparts, so they can work with much larger applications and data sets; 32-bit processors are generally limited to an application and data address space of between 2 and 3GB, an amount increasingly inadequate for engineering applications with large data sets or computations.
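
A short C sketch makes the difference concrete (the 6GB request is an illustrative figure, and actual limits depend on the operating system and build). On a 32-bit build the request cannot even be represented by the allocator’s size type, while a 64-bit build can normally satisfy it.

```c
/* Illustrative sketch of 32- vs. 64-bit address space limits.
 * Exact behavior depends on the OS and build; shown for the principle only. */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    /* On a 32-bit build this prints 32; on a 64-bit build, 64. */
    printf("pointer width: %zu bits\n", sizeof(void *) * 8);

    unsigned long long want = 6ULL << 30;   /* 6 GB, a modest large-model size */
    if (want > (unsigned long long)(size_t)-1) {
        /* A 32-bit size_t cannot even represent the request. */
        printf("6 GB exceeds this build's address space\n");
    } else {
        void *model = malloc((size_t)want);  /* usually succeeds on 64-bit */
        printf("6 GB allocation %s\n", model ? "succeeded" : "failed");
        free(model);
    }
    return 0;
}
```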

Where We Stand Today
These and other innovations have combined to produce engineering workstations under $5,000 that support full-featured design processes using a variety of software. These systems use complex CPUs that combine multiple processor cores with on-chip caches, making it possible to keep multiple process threads executing directly on the chip.

Today, engineers have as much power at their deskside as they had in the data center five years ago, and on a supercomputer 10 years ago, at a fraction of the cost. Because that power is spread among multiple processors, multiple cores, and multiple threads per core, high-end engineering analysis applications that are multi-threaded run very well.
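
The underlying pattern can be sketched in C with POSIX threads (a minimal illustration, not how any particular analysis package is written): split a large loop into slices, run each slice on its own thread, and combine the partial results.

```c
/* Minimal sketch of the fork-join pattern multi-threaded analysis codes
 * use on multicore workstations. POSIX threads; compile with -pthread. */
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>

#define N ((size_t)1 << 24)   /* 16M elements */
#define NTHREADS 4

static double *data;

struct slice { size_t begin, end; double partial; };

static void *sum_squares(void *arg) {
    struct slice *s = arg;
    s->partial = 0.0;
    for (size_t i = s->begin; i < s->end; i++)
        s->partial += data[i] * data[i];   /* stand-in for real analysis work */
    return NULL;
}

int main(void) {
    data = malloc(N * sizeof *data);
    if (!data) return 1;
    for (size_t i = 0; i < N; i++) data[i] = 1.0;

    pthread_t tid[NTHREADS];
    struct slice s[NTHREADS];
    size_t chunk = N / NTHREADS;

    for (int t = 0; t < NTHREADS; t++) {   /* fork: one slice per core */
        s[t].begin = (size_t)t * chunk;
        s[t].end = (t == NTHREADS - 1) ? N : (size_t)(t + 1) * chunk;
        pthread_create(&tid[t], NULL, sum_squares, &s[t]);
    }

    double total = 0.0;
    for (int t = 0; t < NTHREADS; t++) {   /* join: combine partial sums */
        pthread_join(tid[t], NULL);
        total += s[t].partial;
    }
    printf("sum of squares: %.0f\n", total);   /* expect 16777216 */
    free(data);
    return 0;
}
```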

For design engineers, there is additional capability in the form of special-purpose peripherals. Plotters and large-format printers make it possible to deliver output that accurately reflects the product being designed. Digitizers enable engineers to draw draftsman-style, or to scan an existing diagram and use it as the basis of a new design. Some rapid prototyping inkjet devices can even produce a physical prototype directly from an on-system design.

The combination of inexpensive computing horsepower, large amounts of memory and storage, lifelike graphics, powerful applications, and a growing array of sophisticated peripherals makes the personal computing workstation a staple for engineering work. Today’s tools are inexpensive enough that any engineer can do professional, sophisticated work with a relatively small investment.

But it will get even better.

On the Horizon
The most significant trend for future innovation is not increasing CPU horsepower, although that is certainly happening. Instead, it is the ability to tap remote computing power from an increasingly diverse collection of devices: netbooks, tablets, and even smartphones. Engineers won’t create a design on a phone interface, but they can check on simulation data, kick off new simulations, and show designs to colleagues, among other things.

The emergence of graphics processing units (GPUs) as general-purpose computational processors also promises to improve the performance of certain types of computations. For engineers requiring supercomputing-class computational power, GPUs such as the NVIDIA Tesla line have the potential to deliver far higher levels of computational throughput than conventional CPUs. Intel is also pursuing GPU technology, with the intent of building GPU-style computational performance into its industry-standard CPUs.

Cloud computing will offer engineers more alternatives for how they work. Design software can run in the cloud, making it possible for engineers to access their designs and work anywhere with fast Internet access. They may also be able to work from other devices, such as tablet computers and smartphones, gaining a mobility that hasn’t been available before.

More immediately, cloud computing will enable engineers to rent time on high-end applications whose licenses they can’t justify buying for their own systems. Because systems in the cloud are often high-end servers, they are likely to offer better performance than the same software running on a desktop.

Special-purpose peripherals are also emerging to make designs easier to create and more realistic. Tablet computers in particular may be a boon for engineers who do measured drawings. Instead of drawing on a digitizing tablet and having the results appear on the computer monitor, devices such as the Wacom Cintiq let engineers draw directly on the display, just as they would on a sheet of paper.

3D mice, made by 3Dconnexion and others, are also changing how engineers interact with their computers. They make it possible to manipulate a 3D drawing on screen through all three axes, letting engineers pan, zoom, and rotate the model or camera as if they were holding it in their hands. This approach is about as close as it is possible to get to actually touching the product being designed.

Only Time Will Tell
We cannot forget that Moore’s Law still holds, after a fashion. Strictly, it describes transistor counts doubling roughly every two years; the popular interpretation says that processors double in speed every 18 months, and for a long time that’s pretty much what happened. At that rate, performance grows by roughly a factor of 100 over a decade. The trend continues, and may do so in the future, providing engineers with ever-faster systems for ever more demanding work.

In time, engineers won’t have to offload work to supercomputers or server clusters for high-end analysis or dynamic simulation. The vast majority of work will be done directly on the computer in front of them. For the diminishing amount of work beyond the scope of the desktop, the engineering workstation will serve as a portal to cloud systems that have the software or added horsepower for occasional needs. In effect, most engineers will end up having the equivalent of a supercomputer on their desks.

Of course, nothing about the future is guaranteed. Engineering computing could be poised for advances in an entirely different direction. But at the very least, computers, peripherals and software will be the engines that drive a continuing revolution in engineering design.

More Info:
3Dconnexion
Intel Corp.
Microsoft
NVIDIA
Wacom


Contributing Editor Peter Varhol covers the HPC and IT beat for DE. His expertise is software development, math systems, and systems management. You can reach him at [email protected].
