UPDATED 13:00 EDT / FEBRUARY 07 2020

AI

Visual effects powered by Nvidia set for Oscars recognition

Nvidia Corp.’s artificial intelligence technology is shaping up to be Hollywood’s next big star, with two of the biggest movies from last year nominated for Oscars in the 2020 Academy Awards’ visual effects category this weekend.

Netflix Inc.’s “The Irishman” and Marvel Studios LLC’s “Avengers: Endgame” relied extensively on AI technology to enhance the stunning visual effects seen on screen. And in both cases, the producers used AI platforms powered by Nvidia Quadro RTX graphics processing units, effectively shaving decades off veteran actors Robert De Niro, Al Pacino and Joe Pesci in the former and creating an ultrarealistic-looking fictional supervillain, Thanos, in the latter.

“The Irishman” director Martin Scorsese raised a few eyebrows when he announced that the aging De Niro, Pacino and Pesci would star as the lead actors in his tale of mob hitman Frank Sheeran’s life and crimes. After all, in their 70s they aren’t exactly sprightly anymore, and the characters they portrayed were notably a lot younger during the 1960s and 1970s, when most of the movie is set.

Scorsese knew he needed to make the trio look much younger than they actually are, but he was less than pleased with the efforts of his makeup department to transform them back into their early 30s. He was also against using the typical motion capture markers or other intrusive equipment that gets in the way of raw performances during filming.

Instead he turned to the visual effects studio Industrial Light & Magic and its computer-generated imagery chops to turn back the clock.

ILM rejuvenated the aging actors with a new three-camera rig that was used to capture their performances on the set. One was a regular camera, while the others used infrared to record 3D geometry and textures. Then, ILM’s technicians used special AI-based software called FaceFinder to sift through thousands of images of the actors from their previous films.

The tool, Nvidia explained, located frames that matched the camera angle, framing, lighting and expression of the scene being rendered, so ILM artists got a relevant reference to compare against every frame in the shot. Those visual references were used to refine digital doubles created for each actor, so they could be transformed into the right age for each scene of the film.
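At its core, the retrieval step described above is a nearest-neighbor search: score each archive frame against the features of the shot being rendered and keep the best match. The sketch below illustrates that idea only; the feature layout, scoring function and frame labels are hypothetical assumptions, not ILM’s actual FaceFinder implementation.

```python
# Hypothetical sketch of reference-frame retrieval in the spirit of the
# FaceFinder workflow described above. Features and frame IDs are made up.

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def find_reference_frame(shot_features, archive):
    """Return the (frame_id, features) pair best matching the shot.

    archive: list of (frame_id, feature_vector) pairs drawn from the
    actor's earlier films -- a purely illustrative data layout.
    """
    return max(archive, key=lambda item: cosine_similarity(shot_features, item[1]))

# Tiny illustrative archive: each vector packs [head yaw, head pitch,
# key-light angle, expression intensity] -- invented features.
archive = [
    ("film_a_frame_0412", [0.1, 0.0, 0.8, 0.2]),
    ("film_b_frame_1077", [0.9, 0.3, 0.1, 0.7]),
    ("film_c_frame_0021", [0.2, 0.1, 0.7, 0.3]),
]
best = find_reference_frame([0.12, 0.02, 0.78, 0.22], archive)
```

In a production system the feature vectors would come from learned embeddings of pose, lighting and expression rather than hand-picked scalars, but the matching principle is the same.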

In an interview with SiliconANGLE, Richard Kerris, general manager of Nvidia’s Media and Entertainment Group, predicted that AI would become more commonplace in movie production as it solves two critical problems: making actors look younger, and creating and integrating lighting from different scenes. For “The Irishman,” he said, the team went through 40 years of the actors’ films and applied that material to the CG characters.

As for “Avengers: Endgame,” Marvel used AI technology in a slightly different way, capturing the performances of actor Josh Brolin and using the footage to animate a digital version of Thanos, the main villain in the movie.

Visual effects studio Digital Domain used a machine learning system it developed called Masquerade to take numerous low resolution scans of Brolin’s acting and especially his facial movements, before transferring those expressions to a high-resolution mesh of Thanos’ face.
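One common way to transfer an expression from a sparse capture to a dense mesh is to solve for blendshape weights that explain the tracked markers, then re-apply those weights to a high-resolution blendshape basis. The sketch below shows that generic technique under stated assumptions; Digital Domain’s Masquerade is a machine learning system and far more sophisticated, so none of these function names or data layouts come from it.

```python
# Illustrative blendshape-weight transfer, loosely analogous to mapping a
# low-resolution facial capture onto a high-resolution mesh. All bases,
# marker counts and the least-squares solver are assumptions for
# illustration -- not the actual Masquerade pipeline.
import numpy as np

def solve_weights(neutral_lo, basis_lo, captured_lo):
    """Least-squares blendshape weights explaining the low-res capture.

    Vertices are flattened to 1-D arrays; basis_lo is a list of target
    shapes, each the same length as the neutral pose.
    """
    delta = captured_lo - neutral_lo                          # observed offsets
    B = np.stack([b - neutral_lo for b in basis_lo], axis=1)  # shape deltas
    w, *_ = np.linalg.lstsq(B, delta, rcond=None)
    return w

def apply_weights(neutral_hi, basis_hi, w):
    """Re-apply the same weights to the dense high-res mesh."""
    B = np.stack([b - neutral_hi for b in basis_hi], axis=1)
    return neutral_hi + B @ w

# Toy example: 4 low-res marker coordinates, 6 high-res coordinates,
# two blendshapes ("smile", "brow raise" -- hypothetical names).
neutral_lo = np.zeros(4)
basis_lo = [np.array([1.0, 0, 0, 0]), np.array([0, 1.0, 0, 0])]
captured = np.array([0.5, 0.25, 0.0, 0.0])

w = solve_weights(neutral_lo, basis_lo, captured)

neutral_hi = np.zeros(6)
basis_hi = [np.array([2.0, 0, 0, 0, 0, 0]), np.array([0, 2.0, 0, 0, 0, 0])]
hi_res_pose = apply_weights(neutral_hi, basis_hi, w)
```

The appeal of this decoupling is that the capture rig only needs to observe enough markers to pin down the weights; all the fine detail lives in the high-resolution basis built once per character.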

“The technology saves time for VFX artists who would otherwise have to painstakingly animate the subtle facial movements manually to generate a realistic, emoting digital human,” Nvidia said.

“Key to this process were immediate realistic rendered previews of the characters’ emotional performances, which was made possible using Nvidia GPU technology,” said Darren Hendler, head of Digital Humans at Digital Domain. “We now use Nvidia RTX technology to drive all of our real-time ray-traced digital human projects.”

Photo: karangan kusip/Flickr
