UPDATED 09:00 EDT / JULY 30 2019


At SIGGRAPH, Nvidia showcases extremely realistic computer graphics with ray tracing

Nvidia Corp. is using the annual ACM SIGGRAPH computer graphics technology conference in Los Angeles this week to highlight some of the advances it has made in the field since the launch of its powerful RTX technology last year.

Nvidia RTX is a graphics rendering development platform that’s primarily aimed at enabling real-time ray tracing. The platform runs on the company’s Quadro RTX and Titan RTX chips, graphics processing units based on the Turing architecture that pair dedicated RT Cores for accelerating ray tracing with Tensor Cores for artificial intelligence tasks such as image denoising.

Ray tracing is an advanced technique for rendering computer graphics more realistically. By tracing the paths that rays of light take as they bounce around a scene, it can simulate how light and shadow interact with the virtual objects they fall upon in a computer-generated environment. The end result is far more accurate and lifelike imagery than traditional rendering techniques produce.
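
To make that concrete, here’s a deliberately tiny ray tracer written in Python: one sphere, one light, one ray per pixel, with Lambertian shading and a shadow ray. Every name and number in it is invented for illustration; it sketches the core algorithm only, not Nvidia’s implementation, which traces many bounced rays per pixel on dedicated hardware and cleans up the result with AI denoising.

```python
# A minimal sketch of the core ray-tracing loop: cast a ray through each
# pixel, find what it hits, then fire a shadow ray toward the light.
# The scene (one sphere, one point light) is invented for illustration.
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along the ray to the nearest hit, or None on a miss.
    Assumes `direction` is unit length."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None  # epsilon avoids self-intersection

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

CENTER, RADIUS, LIGHT = (0.0, 0.0, -3.0), 1.0, (2.0, 2.0, 0.0)

def shade(px, py, size=32):
    # Map the pixel to a ray through a pinhole camera at the origin.
    d = normalize([2 * px / size - 1, 1 - 2 * py / size, -1.0])
    t = ray_sphere((0.0, 0.0, 0.0), d, CENTER, RADIUS)
    if t is None:
        return 0.0  # ray missed everything: background
    hit = [t * di for di in d]
    n = normalize([h - c for h, c in zip(hit, CENTER)])
    to_light = normalize([l - h for l, h in zip(LIGHT, hit)])
    # Shadow ray: if geometry blocks the path to the light, the point is dark.
    if ray_sphere(hit, to_light, CENTER, RADIUS) is not None:
        return 0.0
    return max(0.0, sum(a * b for a, b in zip(n, to_light)))  # Lambert term

if __name__ == "__main__":
    for py in range(32):  # crude ASCII "render" of the lit sphere
        print("".join(" .:-=+*#%@"[int(shade(px, py) * 9)] for px in range(32)))
```

Production renderers repeat that intersection test millions of times per frame, which is the workload Turing’s RT Cores are designed to accelerate.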

Until Nvidia launched its RTX platform, ray tracing was so computationally expensive that only major film studios and similarly well-funded organizations could afford the massive processing power it requires. But Nvidia’s RTX platform has made the technology much more widely accessible.

Nvidia said that since the launch of its RTX platform, ray tracing has emerged as a standard across fields such as architecture, gaming, product design and scientific visualization. Major software makers have introduced more than 40 applications built on RTX technology, the company said, putting ray tracing and AI within reach of tens of millions of users.

Among those applications is Adobe Inc.’s updated Substance Painter tool, which adds a feature called “light baking” that precalculates how textures will look under a scene’s lighting, for use in games and other interactive visualizations. Running on RTX hardware, light baking performs 192 times faster than it does on standard central processing units, Nvidia said.
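
The idea behind light baking can be sketched in a few lines of Python: lighting for static geometry is computed once per texel and stored in a texture, so runtime shading is just a lookup. The plane, light and resolution below are invented for the example; Substance Painter’s GPU-accelerated baker is far more sophisticated.

```python
# A toy illustration of light baking: precompute diffuse lighting into a
# "lightmap" texture once, then read it back at runtime instead of
# recomputing the lighting per frame. All values here are illustrative.
import numpy as np

LIGHT_POS = np.array([0.5, 0.5, 1.0])  # a point light above a unit plane
RES = 64                                # lightmap resolution in texels

def bake_lightmap(res=RES):
    """Precompute diffuse lighting for a unit plane at z=0 facing +z."""
    u, v = np.meshgrid(np.linspace(0, 1, res), np.linspace(0, 1, res))
    texels = np.stack([u, v, np.zeros_like(u)], axis=-1)  # texel positions
    to_light = LIGHT_POS - texels
    dist = np.linalg.norm(to_light, axis=-1)
    cos_theta = to_light[..., 2] / dist           # n.l with normal (0, 0, 1)
    return np.clip(cos_theta, 0, None) / dist**2  # Lambert + distance falloff

LIGHTMAP = bake_lightmap()  # the expensive step, done once offline

def shade_at(u, v):
    """Runtime 'shading' is just a texture fetch, not a light calculation."""
    return LIGHTMAP[int(v * (RES - 1)), int(u * (RES - 1))]

print(f"brightness under the light: {shade_at(0.5, 0.5):.2f}")
print(f"brightness at the corner:   {shade_at(0.0, 0.0):.2f}")
```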

Meanwhile, Autodesk Inc., which makes design software for architects, engineers, media professionals and others, is launching a new version of its Flame visual effects application that can isolate, extract and modify objects in moving footage, so designers can add realistic visual effects to their content.
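
In schematic terms, a feature like that implies a segment-then-composite workflow along the lines of the Python sketch below, in which segment_object is merely a stand-in for Autodesk’s machine learning matte model, whose internals aren’t public.

```python
# Schematic of object isolation in footage: build a per-pixel matte for
# the object of interest, modify only the matted pixels, then composite.
# segment_object is a placeholder heuristic, not Autodesk's ML model.
import numpy as np

def segment_object(frame):
    """Stand-in matte: a brightness threshold. A real implementation would
    be a trained video-segmentation network returning values in [0, 1]."""
    return (frame.mean(axis=-1) > 128).astype(np.float64)

def grade_object(frame, tint=(1.2, 1.0, 0.9)):
    """Apply a color grade only to the isolated object."""
    mask = segment_object(frame)[..., None]           # H x W x 1 matte
    graded = np.clip(frame * np.array(tint), 0, 255)  # warmed-up copy
    # Composite: graded pixels inside the matte, original pixels outside.
    return (mask * graded + (1 - mask) * frame).astype(frame.dtype)

frame = np.random.randint(0, 256, (270, 480, 3), dtype=np.uint8)  # fake frame
print(grade_object(frame).shape)  # same footage, with the object re-graded
```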

To showcase the enormous real-time rendering potential of ray tracing, Nvidia created an interactive 3-D demo of the Apollo 11 moon landing in which a virtual astronaut mirrors a user’s exact movements.

The demo works using a single camera that captures a person’s movements, which Nvidia’s newly developed “pose interaction technology” then matches in the 3-D environment. Without wearing a special suit or using any depth sensors, people can walk around and explore the Apollo moon landing site almost as if they were actually there on the lunar surface. They can even do a funky dance if they want, as Nvidia demonstrated on video.
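
As a rough sketch of what a single-camera pipeline like that involves, the Python below reads frames from an ordinary webcam and drives a rigged character from 2-D body keypoints. Both estimate_keypoints and Avatar are hypothetical stand-ins, since Nvidia hasn’t published the internals of its pose interaction technology.

```python
# Sketch of a single-RGB-camera motion-capture loop: capture a frame,
# estimate body keypoints, retarget them onto a rigged 3-D character.
# estimate_keypoints and Avatar are hypothetical stand-ins.
import cv2  # OpenCV, used here only for camera capture

def estimate_keypoints(frame):
    """Stand-in for the deep pose-estimation network: returns a fixed
    {joint: (x, y)} pose in pixels. The real demo infers this per frame."""
    h, w = frame.shape[:2]
    return {"head": (w / 2, h / 4),
            "left_hand": (w / 4, h / 2),
            "right_hand": (3 * w / 4, h / 2)}

class Avatar:
    """Hypothetical rigged astronaut model."""
    def drive_joints(self, keypoints, frame_shape):
        h, w = frame_shape[:2]
        # Normalize pixel coordinates so the rig doesn't depend on camera
        # resolution; a renderer would solve joint rotations from these.
        targets = {j: (x / w, y / h) for j, (x, y) in keypoints.items()}
        print("driving rig toward", targets)

def run(max_frames=100):
    cam = cv2.VideoCapture(0)  # one ordinary RGB camera, no depth sensor
    avatar = Avatar()
    for _ in range(max_frames):
        ok, frame = cam.read()
        if not ok:
            break  # no camera available or stream ended
        avatar.drive_joints(estimate_keypoints(frame), frame.shape)
    cam.release()

if __name__ == "__main__":
    run()
```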

“From the moon rocks strewn about to the Apollo 11 lander, the realistic details in the scene are made possible through NVIDIA RTX technology,” Nvidia said. “It creates stunning levels of realism while simulating how the sun’s rays react to the moon’s surface in real time. The interplay of lights and shadows give incredible new perspectives on the moon landing.”

Nvidia also has some new hardware to show off, in the shape of a pair of headset prototypes. The first, dubbed “Prescription AR,” looks like little more than an ordinary pair of glasses. The device is said to be many times thinner and lighter than traditional augmented reality headsets, yet it offers a wider field of view than current-generation devices, so virtual objects appear more natural rather than clustered in the center of the view. The glasses can also be configured for people who wear corrective lenses, letting them see augmented reality scenes just as they would through their normal specs and making for AR displays that are more comfortable, practical and socially acceptable, the company said.

The second prototype, a “Foveated AR” headset, sports a more familiar bulky design; its main advance lies in how it adapts to the user’s gaze in real time, thanks to built-in deep learning technology. The headset adjusts image resolution and matches focal depth to wherever the user is looking, resulting in sharper images and a wider field of view.

It combines two displays per eye: a small-field-of-view, high-resolution display that presents images to the fovea, the portion of the retina where visual acuity is highest, and a lower-resolution display for peripheral vision, the company said.
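
The principle is easy to demonstrate in software: shade the whole scene at low resolution and spend full resolution only in a small window around the gaze point. The Python sketch below does this with a test pattern and tallies the pixel savings; the resolutions and the fixed gaze point are invented for the example, and the headset achieves the effect optically, with live gaze tracking rather than a fixed gaze.

```python
# Simplified foveated rendering: a coarse peripheral layer plus a
# full-resolution foveal inset at the gaze point. Values are illustrative.
import numpy as np

def render_scene(h, w, y0=0, x0=0, step=1):
    """Stand-in renderer: shades h*w samples of a fixed test pattern."""
    y, x = np.mgrid[y0:y0 + h * step:step, x0:x0 + w * step:step]
    return np.sin(x / 60.0) * np.cos(y / 60.0)

def foveated_frame(h=1080, w=1200, gaze=(540, 600), inset=200, scale=4):
    # Peripheral layer: sample the scene coarsely, then upscale to fit.
    periphery = render_scene(h // scale, w // scale, step=scale)
    frame = np.kron(periphery, np.ones((scale, scale)))[:h, :w]
    # Foveal layer: full resolution, but only in a window at the gaze point.
    gy, gx = gaze
    y0, y1 = max(0, gy - inset), min(h, gy + inset)
    x0, x1 = max(0, gx - inset), min(w, gx + inset)
    frame[y0:y1, x0:x1] = render_scene(y1 - y0, x1 - x0, y0, x0)
    shaded = (h // scale) * (w // scale) + (y1 - y0) * (x1 - x0)
    print(f"pixels shaded: {shaded:,} of {h * w:,} ({shaded / (h * w):.0%})")
    return frame

foveated_frame()  # with these numbers, under a fifth of the pixels are shaded
```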

The company also announced Monday the launch of 10 new Nvidia RTX Studio laptops and professional-grade mobile workstations from partners including Dell Technologies Inc. and HP Inc., powered by its latest GeForce RTX 2060 and Quadro RTX 5000 GPUs.

Image: Nvidia
