At this year’s NVIDIA GPU Technology Conference (NVIDIA GTC, May 8-11, Silicon Valley), a team of Autodesk software architects will discuss what some might view as a futuristic vision of CAD — engineering and design software that can mimic how you think.
AI — or software with AI-like behaviors — is the domain of Autodesk Research. In the quest to find ways to improve the company’s existing technologies, the research division routinely sails into uncharted territories like living organisms, biological systems, and machine intelligence.
Mike Haley, Autodesk’s senior director of machine intelligence, says, “In standard CAD programs like AutoCAD, Inventor [for mechanical design], and Revit [for architectural design], AI implementation might start with mundane, everyday tasks — proper catalog management, autosuggestion of components, helping you pick the right symbol sets or layer setups, for example.”
In other words, you might not even notice that the machine, or the software, is doing the “thinking” for you in the background.
Your Software is Learning to Think Like You
Haley and his team studied their customers’ workflows, habits, and procedures to understand their daily struggles. “Our customers spent an enormous amount of time on housekeeping, on recording and documenting the design,” he points out.
The first step, Haley suggests, is to apply AI to take over some of the repetitive chores that have nothing to do with design work. That’s most likely the first incarnation of AI in CAD. But going a step further, AI can also learn your personal preferences, from the interface layout you tend to deploy to the type of designs you tend to create, reject, or accept.
“The data is already there. It’s in the parametric design files, in the constraints you apply, in the mesh types, labels, and dimensions you choose, even the evolution of your designs over time,” says Haley. “But historically, the data was just too complex to comprehend. AI can ingest those signals and make sense of them.”
Training Session with GPUs
AI training, or deep learning, is usually a compute-intensive process. With its parallel processing horsepower, a GPU can dramatically cut the time needed to train an algorithm to capture decision-making logic.
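The reason parallelism cuts training time is that each training step boils down to many independent multiply-accumulate operations that can be computed at the same time and then combined. The sketch below is a hypothetical, stdlib-only illustration of that data-parallel pattern (all names and numbers are invented for the example): a toy linear model is trained by splitting the dataset into shards, computing a partial gradient per shard in parallel, and summing the partials. Here Python threads stand in for the thousands of GPU cores; an actual speedup requires real hardware parallelism.

```python
# Hypothetical sketch of data-parallel gradient descent, the pattern
# GPUs exploit during training. Pure standard library, toy-sized.
from concurrent.futures import ThreadPoolExecutor

# Toy dataset: y = 3x, with a single weight w to learn.
data = [(float(x), 3.0 * x) for x in range(1, 1001)]

def partial_gradient(shard, w):
    # Partial gradient of mean squared error over one shard of the data.
    # Shards partition `data`, so summing partials gives the full gradient.
    return sum(2 * (w * x - y) * x for x, y in shard) / len(data)

def train(w=0.0, lr=1e-7, steps=200, workers=4):
    shards = [data[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(steps):
            # Each worker computes its shard's gradient independently;
            # the results are combined into one weight update.
            grads = pool.map(lambda s: partial_gradient(s, w), shards)
            w -= lr * sum(grads)
    return w

print(round(train(), 2))  # converges toward the true weight, 3.0
```

Because the per-shard work never depends on another shard within a step, adding more workers (or cores) divides the wall-clock time of each step — which is exactly why GPU clusters shrink week-long training jobs.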
According to NVIDIA, “Early adopters of GPU accelerators for machine learning include many of the largest web and social media companies, along with top-tier research institutions in data science and machine learning. With thousands of computational cores and 10-100x application throughput compared to CPUs alone, GPUs have become the processor of choice for processing big data for data scientists.”
Haley says, “We had to build a pretty complicated deep-learning network to be able to dissect the 3D data, to analyze features and morphology. With NVIDIA’s NVLink technology, the GPUs in the network effectively work together, talk to one another, and share memory. In many cases, we use TensorFlow [an open-source software library for machine intelligence]. We tend to use Amazon-hosted GPU machines. We use Quad-GPU machines, with roughly 13,000 cores. Even with that kind of hardware, some of our large [mathematical models] take up to a week to train.”
Cloud service providers like Amazon Web Services (AWS) offer users the ability to specify a type of machine or cluster they need for a job, use it remotely, and pay for the transaction based on usage. For firms that aren’t willing to purchase the necessary hardware for AI development, on-demand vendors have proven to be a cost-effective alternative.
With more sophisticated technologies like Autodesk Dreamcatcher (currently a technology preview, not yet a product), the software can suggest the best shape or geometry based on the designer’s requirements (such as the ability to withstand a certain stress/load threshold or remain within a weight limit).
“In a way, that’s a much harder problem to solve for the software,” Haley reasons. “With machine learning, you need the GPUs to train the software, but once you’ve trained it, [the algorithm for decision making] can run on a local machine. But in generative design, every time you ask the software for an optimal solution, you need to compute heavily on the GPUs. It’s not possible without an enormous amount of GPUs backing it.”
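At its core, the generative loop Haley describes is: propose many candidate geometries, reject those that violate the engineering constraints, and keep the best performer. The toy sketch below (all quantities and names are invented for illustration, and random search stands in for the far more sophisticated solvers a real system would use) searches for the lightest rectangular beam cross-section that still keeps bending stress under an allowable limit:

```python
# Hypothetical toy of a generative-design loop: propose candidates,
# reject those that fail a stress constraint, keep the lightest.
import random

LOAD = 10_000.0      # applied bending moment, N*mm (assumed)
MAX_STRESS = 200.0   # allowable bending stress, MPa (assumed)

def bending_stress(width, height):
    # sigma = M*c/I for a rectangular section reduces to 6M / (w * h^2).
    return 6 * LOAD / (width * height ** 2)

def generate(candidates=10_000, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(candidates):
        w = rng.uniform(5, 50)   # candidate width, mm
        h = rng.uniform(5, 50)   # candidate height, mm
        if bending_stress(w, h) > MAX_STRESS:
            continue             # fails the constraint: reject
        area = w * h             # cross-section area as a weight proxy
        if best is None or area < best[0]:
            best = (area, w, h)
    return best

area, w, h = generate()
print(f"lightest feasible section: {w:.1f} x {h:.1f} mm")
```

Every query re-runs the whole evaluate-and-reject loop over thousands (in practice, millions) of candidates, which is why, unlike a trained model that can run locally, generative design needs heavy GPU compute on each request.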
In the long run, Haley believes AI could automatically validate a design or check it for manufacturability.
“AI gets smarter over time — the more you use it, the smarter it gets,” says Haley. “It watches what you do and is constantly learning from the way you work.”
Autodesk’s session on machine learning is titled S7465: Deep Learning for 3D Design and Making. For more, consult the session details on NVIDIA GTC’s website.