If Autodesk’s proposition to engineers is digital prototyping, Microsoft’s latest pitch is technical computing. With the launch of its Modeling the World portal, Microsoft is pitching simulation and analysis driven by high-performance computing (HPC) as a way to better understand the complex biomechanical, electromechanical, financial, genomic, and climate systems governing the way we live, work, and play.
The cyclical computing horsepower increases in hardware predicted by Moore’s Law are coming to an end, said Bill Hilf, Microsoft’s general manager of technical computing. “The free lunch — where, every 18 months, we got faster machines because we added more transistors to the microprocessor — because of physics, that is no longer possible. That’s why we have multicore systems — more chips on a processor.”
But even multicore systems will have a hard time keeping up with what The Economist calls “The Data Deluge” (February 27–March 5, 2010). Hilf noted, “We’re outpacing our ability to store data. We’re actually creating more bytes than we have the capacity to store … Many analysts predict that over the next five years, we’ll produce more data than we have ever produced in the history of humankind” (“Pushing through the Inflection Point with Technical Computing,” keynote address at High Performance Computing Conference, September 20, 2010).
This, Microsoft believes, makes the rise of HPC inevitable. The increase in data makes it possible to incorporate more micro and macro factors into our simulation and analysis. By the same token, it will also demand greater computing power than any single machine can provide. So parallel processing, dividing the problem into smaller chunks and solving them on computing clusters, is the way of the future.
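The divide-and-combine pattern described above can be sketched in a few lines. This is a generic illustration, not Microsoft’s HPC stack: it uses Python’s standard `multiprocessing` pool as a stand-in for a cluster, and the workload (summing squares) is a hypothetical placeholder for a real simulation step.

```python
# A minimal sketch of parallel processing: split a large problem into
# chunks, solve them concurrently on a pool of workers, then combine
# the partial results. The function names here are illustrative.
from multiprocessing import Pool


def solve_chunk(chunk):
    """Solve one piece of the problem; here, sum the squares of the chunk."""
    return sum(x * x for x in chunk)


def parallel_solve(data, n_workers=4, chunk_size=1000):
    # Divide the problem into smaller chunks...
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # ...farm the chunks out to a pool of worker processes...
    with Pool(n_workers) as pool:
        partial_results = pool.map(solve_chunk, chunks)
    # ...and combine the partial results into the final answer.
    return sum(partial_results)


if __name__ == "__main__":
    data = list(range(100_000))
    print(parallel_solve(data))
```

On a real cluster the workers would be separate machines coordinated by a scheduler rather than local processes, but the structure — partition, distribute, recombine — is the same.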
For more, listen to my interview with Bill Hilf and read my upcoming article “Modeling the World in Parallelism” in the November issue of DE.