One consequence of multicore computing is the introduction of per-core licensing. This is especially true in the analysis market, where the number of cores you dedicate to a problem can dramatically shorten turnaround time or improve the quality of the results. Software makers judge — correctly, I might add — that the benefits are significant enough to entice users to pay more for the right to use additional cores. Hence, the birth of pay-per-core licensing. (This, of course, is just a summary of the complicated issue around core-based licensing, which encompasses many different variations.)
I’ve heard grumblings among FEA and CFD users. They feel the gain comes from the additional hardware they are paying for (by adding more processors to the server or by upgrading their dual-core workstations to quad-core ones). So they ask: Why should we shell out more for using the same software on the more powerful hardware we acquired? The counterargument from software developers is that it takes considerable time and effort to enable parallel processing in the software, so users who want that added boost should pay more.
The compromise, it seems, is to come up with alternative licensing schemes that both parties — users and software developers — find acceptable. Software makers are acutely aware of the resentment generated by the pay-per-core practice. Some are exploring pay-per-usage licensing; others are experimenting with token systems.
Jeff Brennan, Altair Engineering’s chief marketing officer, reflected, “[Per-core] licensing grew over time. It began in the 80s and early 90s when supercomputers came on. There was a need for software developers to create special versions for that exotic hardware … In the 90s, it wasn’t uncommon for hardware vendors to compensate software vendors for producing these [software] versions and for the associated quality assurance, testing, and maintenance.”
Brennan acknowledged some users’ dissatisfaction with this licensing model, but pointed out, “There is some effort required in software development to not just architect [the product] for scalability … but [doing so] without sacrificing accuracy.”
In 2007, Altair launched an on-demand licensing model for its PBS Professional suite. It was described by the company as “a single on-demand computing environment” where users “only pay for what they use, where customers can enable their entire infrastructure but only pay for concurrent usage of licenses” and they can “dynamically float [licenses] across enterprise computing resources, even geographically separate systems …”
Brennan said, “With our licensing model, the added cost for multiple cores is by no means linear [a licensing cost multiplied by the number of cores used]. It decays quite rapidly as you increase the number of cores.” In Altair’s licensing model, Brennan explained, “You draw licenses, or tokens, from a central pool only when you’re using the product. You deposit them back when you’re done.”
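The mechanics Brennan describes — draw tokens from a shared pool while a job runs, deposit them back on completion, with per-core cost decaying rather than scaling linearly — can be sketched in a few lines of code. This is purely illustrative: the `TokenPool` class, the harmonic decay schedule, and the token counts are my own hypothetical stand-ins, not Altair’s actual pricing or implementation.

```python
import math
import threading

def tokens_required(cores: int) -> int:
    """Hypothetical sublinear schedule: each additional core costs
    less than the last (harmonic decay: 1, 1/2, 1/3, ...)."""
    total = sum(1.0 / n for n in range(1, cores + 1))
    return math.ceil(total * 4)  # scale to whole tokens

class TokenPool:
    """Central pool: jobs draw tokens when they start and deposit
    them back when they finish, so idle licenses cost nothing."""
    def __init__(self, capacity: int):
        self._available = capacity
        self._lock = threading.Lock()

    def draw(self, cores: int):
        needed = tokens_required(cores)
        with self._lock:
            if self._available >= needed:
                self._available -= needed
                return needed
        return None  # not enough tokens free; the job must wait

    def deposit(self, tokens: int) -> None:
        with self._lock:
            self._available += tokens

pool = TokenPool(capacity=40)
held = pool.draw(cores=8)  # an 8-core job draws far fewer than 8x a 1-core job's tokens
# ... run the solver ...
if held is not None:
    pool.deposit(held)     # tokens return to the pool for other users
```

Under this made-up schedule, an 8-core run draws 11 tokens rather than the 32 a strictly linear scheme would charge, which is the “decays quite rapidly” behavior Brennan alludes to.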
In 2003, Altair Engineering acquired Veridian’s PBS (short for Portable Batch System) technology. The product lives on as Altair’s PBS Works, a job scheduling system for submitting, tracking, and monitoring computing tasks. The company plans to launch a service called HyperWorks On Demand, which lets users remotely borrow Altair’s hardware resources (in other words, additional computing cores, available via a cloud-hosted setup) for jobs that demand more than what’s available internally.
For more on Brennan’s thoughts and on Altair’s licensing model, listen to the complete podcast below: