Dear Desktop Engineering Reader:
Ten minutes into this on-demand webinar, Simplified HPC Clusters for ANSYS Users, co-produced by IBM and ANSYS, you'll see some interesting survey numbers on how compute capacity and/or turnaround time limit simulation fidelity for most users all the time (34%) or some of the time (57%). You'll also see that respondents cite issues such as IT hardware, expertise, and support as what keeps them from making the jump from workstations to HPC (high-performance computing) clusters. But the biggest barrier keeping engineering organizations from adopting HPC is not cost (16%), not by a long shot. Rather, 59% of respondents say they need evidence that HPC has technical benefits.
Sound like you? OK, well, today's Check it Out has lots of metrics on the benefits of HPC for you to consider. It also has lots of details that make fretting over hardware, expertise, and support far less daunting.
Originally aired live yesterday and now available on-demand, Simplified HPC Clusters for ANSYS Users poses and thoroughly answers questions like: What's in HPC for me? Why transition from workstations to HPC? And how do IBM and ISVs (independent software vendors) such as ANSYS make the switchover as painless as possible?
After the usual agenda setting, the webinar jumps into a roughly 10-minute overview of what HPC means for ANSYS Fluid and ANSYS Mechanical simulations. The presenter goes right to the heart of the matter with a couple of case studies that are a feast of metrics.
First up is developing turbochargers for diesel engines using ANSYS CFX fluid dynamics software. Here, HPC produced high-fidelity results 12 times faster than the old process. In hands-on terms, that translates to five full-stage compressor or turbine designs evaluated simultaneously in a few hours. Next up is solder joint failure analysis using ANSYS Mechanical running on a decent-sized, 128-core HPC cluster. Optimizations of complex models — 7.8 MDOF thermal stress and 5.5 MDOF creep strain — took one day rather than the traditional two weeks.
ANSYS then offers some benchmark numbers showing application speedups on smaller clusters (12 to 72 cores). With all the numbers you get from ANSYS, you pretty much have the technical benefits covered, as well as the business proposition.
This, of course, leaves you wondering about the IT expertise and time needed to manage an HPC cluster. That's where IBM Platform Computing comes in with its overview of IBM Application Ready Solutions. It's a simple concept, really: IBM works with ISVs like ANSYS and users like you to optimize a scalable cluster for your simulations so that you can get set up and running quickly.
Now, an HPC setup optimized for an application is not the same as a single workstation configured to do likewise, but that workstation simplicity seems to be IBM's goal for both system users and admins. To begin with, IBM brings clarity to the hardware needed to move to an HPC cluster environment. IBM Application Ready Solutions let you select from a variety of hardware platforms, including the new IBM NeXtScale System. Again, lots of metrics are provided.
The complement to the HPC hardware, and the key to your working relationship with it, is the new release of IBM Platform HPC 4.1. This is the software that runs and manages your workloads and cluster, and tunes it for applications like ANSYS. Here, you get an explanation of how this software integrates all HPC management capabilities in a single product, which simplifies deployment, administration, and end-user interaction.
The final third of the webinar features a testimonial from an HPC solutions provider and a panel of experts for a lively Q&A with attendees that ties together everything you've learned.
All in all, check out this webinar. If you've been doing the Hamlet-like "to HPC or not to HPC" thing, think of it this way: there are tides in the affairs of technology, and HPC is the next wave. Hit today's Check It Out link and watch Simplified HPC Clusters for ANSYS Users. Well worth it.
Thanks, Pal. – Lockwood
Anthony J. Lockwood
Editor at Large, Desktop Engineering