Around noon on Monday, August 13, SIGGRAPH 2018 attendees began streaming into the Vancouver Convention Center. Their first destination was the keynote hall, where Rob Bredow, senior VP and executive creative director of ILM, recounted the making of Solo: A Star Wars Story. An hour later, the crowd made their way across the waterfront promenade to queue up for another keynote, this one by NVIDIA CEO Jensen Huang.
To remind the audience how far CGI (computer-generated imagery) has come, Huang dug up archival materials from SIGGRAPH 1979.
That year, CGI pioneer Turner Whitted created a short 3D animation titled “The Compleat Angler” to demonstrate his recursive ray-tracing algorithm in a SIGGRAPH presentation. With transparent spheres rotating over a checkered floor, the reflective surfaces and recursive light bounces proved too computationally intensive to render quickly, even on a VAX-11/780, a minicomputer that cost $120,000 to $160,000 at the time.
“At lunch a few months later, a Bell Labs executive asked what it would take to run ray tracing in real time,” Whitted recalled in a blog post. “I proposed a vast array of Cray supercomputers, one per pixel, with a red, green, and blue light bulb on top of each one.”
The Compleat Angler sequence took Whitted 1.2 hours to render at a resolution of 512 x 512, Huang pointed out in his demo; that works out to 262,144 pixels in roughly 4,320 seconds. “Basically he [Whitted] was generating 60 pixels per second, instead of 60 frames per second,” said Huang.
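Whitted’s key idea — shading each hit point locally, then spawning a secondary reflected ray and recursing — can be sketched in a few dozen lines. The scene, shading model, and constants below are illustrative assumptions, not Whitted’s original code:

```python
# Minimal sketch of Whitted-style recursive ray tracing.
# Scene, materials, and light are simplified stand-ins.
import math

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def add(a, b): return (a[0]+b[0], a[1]+b[1], a[2]+b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def scale(a, s): return (a[0]*s, a[1]*s, a[2]*s)
def norm(a): return scale(a, 1.0 / math.sqrt(dot(a, a)))

# Each sphere: (center, radius, base_intensity, reflectivity)
SPHERES = [((0.0, 0.0, -3.0), 1.0, 0.6, 0.5),
           ((1.5, 0.5, -4.0), 0.8, 0.4, 0.8)]
LIGHT_DIR = norm((1.0, 1.0, 1.0))
MAX_DEPTH = 3  # recursion cutoff for secondary rays

def hit_sphere(origin, direction, sphere):
    """Return distance t to the nearer intersection in front of
    the origin, or None (direction must be normalized)."""
    center, radius, _, _ = sphere
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def trace(origin, direction, depth=0):
    """Shade the nearest hit, then recurse along the reflected ray."""
    nearest = None
    for s in SPHERES:
        t = hit_sphere(origin, direction, s)
        if t is not None and (nearest is None or t < nearest[0]):
            nearest = (t, s)
    if nearest is None:
        return 0.1  # background intensity
    t, (center, _, base, refl) = nearest
    point = add(origin, scale(direction, t))
    n = norm(sub(point, center))
    local = base * max(0.0, dot(n, LIGHT_DIR))  # simple diffuse term
    if depth >= MAX_DEPTH or refl == 0.0:
        return local
    # Reflect the ray about the surface normal and recurse.
    r = sub(direction, scale(n, 2.0 * dot(direction, n)))
    return local + refl * trace(point, norm(r), depth + 1)
```

Each pixel costs one primary ray plus a tree of secondary rays, which is why a full 512 x 512 frame overwhelmed the minicomputers of 1979.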
NVIDIA’s latest GeForce RTX GPUs can run games at 4K resolution at 60 frames per second. And Whitted now works at NVIDIA Research.
The Era of NVIDIA RTX
Combining NVIDIA’s Turing GPU architecture with the NVIDIA RTX platform gives you real-time ray tracing and AI (artificial intelligence), according to the GPU maker.
“The NVIDIA RTX is a platform consisting of architecture, software, developer kits, and libraries,” explained Huang. He argued that AI-assisted ray tracing, which uses pretrained rendering algorithms to accelerate pixel creation in the scene, is more efficient than “ray-tracing at brute force level.”
Epic Games’ Unreal game engine now supports NVIDIA RTX technology as well as Microsoft’s DirectX Raytracing (DXR), the company announced.
“The availability of these technologies is making real-time ray tracing a reality,” said Epic Games CTO Kim Libreri. “By making such powerful features available in Unreal Engine 4, we are shaping the next generation of game and movie graphics.”
The implications are not confined to gaming and entertainment. Since game engines are now a path to engineering simulation and training in augmented reality (AR) and virtual reality (VR), real-time ray tracing could affect the quality of design visualization and simulation possible in AR-VR applications.
From CAD to Game Engines
The two leading game engines — Unreal and Unity — stood out on the show floor in their overhead banners and large display booths. While courting game developers and filmmakers, they’re also reaching out to the professional engineering crowd. (For more on this topic, read “From Solid Geometry to Responsive AR-VR,” DE, August 2018.)
Unreal from Epic Games recently launched the open beta for Unreal Studio, described as “a comprehensive, real-time visualization solution that will save you hours, if not days, in bringing Unreal Engine projects to life.”
“Unreal Studio is designed for people who work with CAD data,” said Ken Pimentel, senior product manager at Epic Games. “It gives them a simple, frictionless workflow.”
At the time of SIGGRAPH, Unreal Studio had 90,000 registered users, according to Pimentel. “It helps that it’s free,” he quipped.
In the future, Unreal Studio will most likely become a subscription-based offering, said Pimentel.
By contrast, Unreal Engine itself is available as a free download, but commercial use requires paying Epic a royalty — a percentage of revenue — once the product ships.
Unreal Studio comes with a Substance material library (provided by Allegorithmic) and the Datasmith workflow toolkit for converting CAD data into game engine-ready visuals. The extensive list of formats supported by Datasmith includes CATIA, SolidWorks, Siemens NX, PTC Creo, Autodesk Inventor, and JT.
Support for Python scripting allows Unreal Studio users to, for example, automatically translate the original CAD materials into Unreal Engine’s Substance equivalents, or remove features smaller than a certain size to speed up visualization.
Python scripting is more common among those with visualization and animation backgrounds than among design engineers. Typical CAD users may not know Python; therefore, collaboration with design visualization specialists may be necessary. Another option is to use Blueprints, Unreal’s visual programming environment.
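As a sketch of the kind of batch task such scripting enables, here is plain Python with a made-up material-mapping table and hypothetical part names; inside the editor, the real work would go through Unreal’s built-in Python API rather than these stand-ins:

```python
# Illustrative sketch only: the mapping table, material names, and part
# data below are hypothetical examples, not Unreal Studio's actual API.

# Hypothetical CAD-material -> Substance-equivalent lookup table.
MATERIAL_MAP = {
    "Steel - Brushed": "substance_brushed_metal",
    "ABS Plastic": "substance_plastic_matte",
    "Glass - Clear": "substance_glass_clear",
}

def translate_materials(cad_materials, fallback="substance_default_grey"):
    """Map each imported CAD material name to a Substance equivalent,
    falling back to a neutral material when there is no match."""
    return {name: MATERIAL_MAP.get(name, fallback) for name in cad_materials}

def defeature(parts, min_size_mm=2.0):
    """Drop parts smaller than a size threshold to speed up visualization.
    `parts` is a list of (name, bounding_box_size_mm) tuples."""
    return [name for name, size in parts if size >= min_size_mm]
```

Run over an imported assembly, the first function handles the material translation Pimentel describes, and the second removes small features (screws, fillets) that add polygons without adding visual value.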
Also included in Unreal Studio are tutorials, set up as complete courses. “It’s very much like an online school,” explained Pimentel. “You register, you go through a course, and it keeps track of your progress.”
Unreal courses are designed for different disciplines, such as architecture, industrial design, and media and entertainment.
Chaos in the Cloud
The SIGGRAPH crowd also discovered that Chaos Group’s popular renderer V-Ray is heading to the cloud.
“We want to remove all the complexities involved in rendering something in the cloud,” said Lon Grohs, VP of strategic marketing. “We want to make it a push-button solution from whichever application you happen to be using to run V-Ray.”
V-Ray Cloud, as it’s called, comes with Live View, for you to monitor the rendering in progress, and Smart Sync, which updates the rendering when the source data changes. The actual hardware is from Google, Chaos’s partner in this project.
V-Ray is available inside Autodesk 3ds Max, Maya, form.Z, SketchUp, and Rhino, among other modelers.
At the show, Chaos Group also announced V-Ray for Unreal, a tool that brings V-Ray scenes into the Unreal game engine. “We can translate your V-Ray materials to the nearest equivalent in Unreal,” said Grohs.
Changing How You Sketch
The rise of tablets prompted many CAD developers to rethink how the sketching environment works inside their modeling programs, but a new challenge may be on the horizon — how to sketch inside AR-VR’s 3D environment.
One firm that believes it has the answer is Gravity Sketch, whose drawing program lets users sketch with AR-VR controllers inside virtual 3D spaces. Using AR, VR, and touch technologies, Gravity Sketch lets users drag and pull colored lines and curves to create sketches, surfaces, and objects in virtual space.
It offers IGES export to bring the design into CAD programs for further refinement and parametric design. At present, the program works on Vive and Oculus headsets.
Real-Time Ray Tracing for Professional Workloads
NVIDIA CEO Huang drew a long, enthusiastic round of applause midway into his keynote when he revealed the price for GeForce RTX — starting at $499.
“Preorder today,” he urged the crowd — “on shelves everywhere September 20th.”
The GeForce RTX ranges from $499 to $999. While the GeForce lineup serves the gaming and consumer market, the Quadro lineup serves the professional crowd.
As listed in NVIDIA’s blog, the Quadro RTX prices are:
Quadro RTX 8000 with 48GB memory: $10,000 estimated street price (ESP)
Quadro RTX 6000 with 24GB memory: $6,300 ESP
Quadro RTX 5000 with 16GB memory: $2,300 ESP
The top-of-the-line Quadro RTX 8000 Dual is priced at $20,000, reflecting its twin-GPU setup: two Quadro RTX 8000 GPUs connected via NVLink, NVIDIA’s high-speed GPU interconnect that allows multiple GPUs to divide up the workload.
“$20K [for Quadro RTX 8000 Dual] gives you twice the ray-tracing performance of a $68K DGX station,” Huang pointed out.
The first DGX — the DGX-1 — was unveiled at NVIDIA’s GTC conference in 2016. What seemed like good value then has been superseded by the far more economical RTX this year. That in itself is a sign of the relentless pace at which computing power is multiplying with every new generation of hardware and software.