UPDATED 12:12 EDT / NOVEMBER 27 2019


How Kubernetes Operators are simplifying life for data scientists

Artificial intelligence and machine learning are critical tools for today’s data scientists to master. But because these technologies add complexity to an already complicated workload, they can take up valuable time and divert focus from the main goal of extracting knowledge from data.

“We’re working towards reducing that complexity and making sure that people that are data scientists, AI, or machine-learning engineers are able to focus on their core business values,” said Renaud Gaubert (pictured, left), lead container and OSS engineer at Nvidia Corp.

Gaubert and Diane Mueller (pictured, right), director of community development at Red Hat Inc., spoke with Stu Miniman (@stu), host of theCUBE, SiliconANGLE Media’s mobile livestreaming studio, and guest host John Troyer (@jtroyer), chief reckoner at TechReckoning, during the KubeCon + CloudNativeCon event in San Diego, California. They discussed Red Hat product announcements designed to automate services and applications, simplifying life for data scientists and AI/ML engineers. (* Disclosure below.)

Data scientists don’t care what’s under the hood

“[Data scientists] would like to just open up a JupyterHub notebook, have everything there they need, train their models, have them run, and then after they’re done, they’re done,” Mueller said.

Helping bring that “automagic” to its product line, Red Hat announced an alpha release for OpenShift 4 during KubeCon.

“It’s an all Operators-based deployment of OpenShift,” Mueller stated. “I think the Operator framework really has been the big thing that we’ve been really getting a lot of uptake on. It’s been the new pattern for deploying applications, or services, and getting things beyond just a basic install of a service on OpenShift or any Kubernetes.”
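Under the hood, an Operator is a controller running a reconcile loop: watch a resource, compare what exists with what should exist, and act on the difference. The sketch below shows that loop in its simplest form, written against a recent version of the controller-runtime library that the Operator SDK’s Go operators build on; the type names and the choice of watching DaemonSets are illustrative, not Red Hat’s or Nvidia’s actual code.

```go
// Minimal sketch of the reconcile loop behind the Operator pattern, using
// sigs.k8s.io/controller-runtime. Names and the watched resource are illustrative.
package main

import (
	"context"
	"log"

	appsv1 "k8s.io/api/apps/v1"
	ctrl "sigs.k8s.io/controller-runtime"
	"sigs.k8s.io/controller-runtime/pkg/client"
)

// DaemonSetReconciler drives observed state toward desired state each time
// a watched resource changes.
type DaemonSetReconciler struct {
	client.Client
}

func (r *DaemonSetReconciler) Reconcile(ctx context.Context, req ctrl.Request) (ctrl.Result, error) {
	var ds appsv1.DaemonSet
	// Observe: fetch the current state of the resource named in the request.
	if err := r.Get(ctx, req.NamespacedName, &ds); err != nil {
		// The resource may have been deleted; nothing to do in that case.
		return ctrl.Result{}, client.IgnoreNotFound(err)
	}
	// Compare and act: a real operator would diff the spec against the desired
	// state and create, update or delete objects accordingly.
	log.Printf("reconciled %s: %d/%d pods ready",
		req.NamespacedName, ds.Status.NumberReady, ds.Status.DesiredNumberScheduled)
	return ctrl.Result{}, nil
}

func main() {
	mgr, err := ctrl.NewManager(ctrl.GetConfigOrDie(), ctrl.Options{})
	if err != nil {
		log.Fatal(err)
	}
	// Watch DaemonSets and call Reconcile whenever one changes.
	if err := ctrl.NewControllerManagedBy(mgr).
		For(&appsv1.DaemonSet{}).
		Complete(&DaemonSetReconciler{Client: mgr.GetClient()}); err != nil {
		log.Fatal(err)
	}
	if err := mgr.Start(ctrl.SetupSignalHandler()); err != nil {
		log.Fatal(err)
	}
}
```

Installing, upgrading, recovering and reporting are all expressed as iterations of this same loop.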

Alongside this release, Red Hat has been populating OperatorHub.io with Kubernetes Operators. This is where upstream projects that have Operators — such as the Nvidia GPU Operator — are being hosted, Mueller explained. Free access to these Operators means that anyone can deploy them, on OpenShift or any Kubernetes solution.

“OperatorHub.io, everything in there runs on any Kubernetes. And … the goal is to be able to take stuff in a hybrid-cloud model [and] be able to run it anywhere you want. So we want people being able to do it, anywhere,” Mueller said.

Operator SDK simplifies AI/ML deployment

The Nvidia GPU Operator is based on Red Hat’s Operator software development kit, and it simplifies AI/ML deployments, according to Gaubert, who explained that the operator works across four distinct phases. The first phase is to “install all the components that a data scientist, or generally, a GPU cluster might want to, or need, whether it’s the Nvidia driver, the [container] runtime, the Kubernetes device plugin, [or] the mounting points,” he said.
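As a rough illustration of that first phase, the sketch below constructs the kind of per-node DaemonSet an operator might create for the Kubernetes device plugin. The package, names and image reference are placeholders for illustration, not the GPU Operator’s actual manifests.

```go
// Illustrative sketch, not the actual GPU Operator code: phase one amounts to
// creating per-node workloads (driver containers, the device plugin, and so on),
// usually as DaemonSets so that every GPU node receives them.
package controllers

import (
	appsv1 "k8s.io/api/apps/v1"
	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func desiredDevicePluginDaemonSet(namespace string) *appsv1.DaemonSet {
	labels := map[string]string{"app": "nvidia-device-plugin"}
	return &appsv1.DaemonSet{
		ObjectMeta: metav1.ObjectMeta{
			Name:      "nvidia-device-plugin",
			Namespace: namespace,
			Labels:    labels,
		},
		Spec: appsv1.DaemonSetSpec{
			Selector: &metav1.LabelSelector{MatchLabels: labels},
			Template: corev1.PodTemplateSpec{
				ObjectMeta: metav1.ObjectMeta{Labels: labels},
				Spec: corev1.PodSpec{
					Containers: []corev1.Container{{
						Name: "nvidia-device-plugin",
						// Placeholder image reference; a real operator pins exact versions.
						Image: "nvcr.io/nvidia/k8s-device-plugin:latest",
						// The device plugin registers GPUs with the kubelet through this directory.
						VolumeMounts: []corev1.VolumeMount{{
							Name:      "device-plugins",
							MountPath: "/var/lib/kubelet/device-plugins",
						}},
					}},
					Volumes: []corev1.Volume{{
						Name: "device-plugins",
						VolumeSource: corev1.VolumeSource{
							HostPath: &corev1.HostPathVolumeSource{
								Path: "/var/lib/kubelet/device-plugins",
							},
						},
					}},
				},
			},
		},
	}
}
```

A reconcile step would then create or update this object so that every GPU node ends up running the plugin.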

Phase two automates the infrastructure build and the updating of the different components.
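An update, in this model, is just another reconcile: the loop notices that the deployed version differs from the desired one and patches it. A minimal sketch, assuming the hypothetical device-plugin DaemonSet from the previous example:

```go
package controllers

import (
	"context"

	appsv1 "k8s.io/api/apps/v1"
	"sigs.k8s.io/controller-runtime/pkg/client"
)

// updateDevicePluginImage is an illustrative phase-two step: if the running
// component's image differs from the desired version, patch the DaemonSet and
// let Kubernetes roll the new pods out node by node.
func updateDevicePluginImage(ctx context.Context, c client.Client, ds *appsv1.DaemonSet, desiredImage string) error {
	containers := ds.Spec.Template.Spec.Containers
	if len(containers) == 0 || containers[0].Image == desiredImage {
		return nil // nothing to update
	}
	containers[0].Image = desiredImage
	return c.Update(ctx, ds)
}
```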

“Phase three is generally being able to have a life cycle,” Gaubert stated. Managing multiple machines means that some will drift into different states and some will fail; the Nvidia GPU Operator recovers them from those bad states back to good ones.
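What recovering from a bad state looks like in practice is easiest to see in code. The hedged sketch below, which reuses the hypothetical DaemonSet constructor from the earlier example, recreates the component if it has been deleted and requeues the reconcile if some nodes are unhealthy; the names and the 30-second retry are illustrative choices, not the GPU Operator’s actual logic.

```go
package controllers

import (
	"context"
	"time"

	appsv1 "k8s.io/api/apps/v1"
	"k8s.io/apimachinery/pkg/api/errors"
	"k8s.io/apimachinery/pkg/types"
	ctrl "sigs.k8s.io/controller-runtime"
	"sigs.k8s.io/controller-runtime/pkg/client"
)

// GPUComponentsReconciler is a hypothetical reconciler for illustration.
type GPUComponentsReconciler struct {
	client.Client
}

func (r *GPUComponentsReconciler) ensureDevicePlugin(ctx context.Context, namespace string) (ctrl.Result, error) {
	var ds appsv1.DaemonSet
	key := types.NamespacedName{Namespace: namespace, Name: "nvidia-device-plugin"}

	err := r.Get(ctx, key, &ds)
	if errors.IsNotFound(err) {
		// Bad state: the DaemonSet was deleted out from under us. Recreate it.
		return ctrl.Result{}, r.Create(ctx, desiredDevicePluginDaemonSet(namespace))
	}
	if err != nil {
		return ctrl.Result{}, err
	}
	if ds.Status.NumberUnavailable > 0 {
		// Some nodes are not running the plugin yet, or a pod crashed.
		// Requeue so the loop checks again rather than assuming success.
		return ctrl.Result{RequeueAfter: 30 * time.Second}, nil
	}
	return ctrl.Result{}, nil
}
```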

The fourth and final phase is monitoring, “which is being able to actually give insights to our users,” Gaubert said. “So, the Operator SDK has helped us a lot here, just laying out these different steps, and, in a way, it’s done the same thing as what we’re trying to do for our customers, the different data scientists. Which is, basically, get out of our way and allow us to focus on core business values.”
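One way an operator surfaces those insights is by registering custom metrics with the Prometheus registry that controller-runtime already serves on its metrics endpoint. The sketch below shows that hook; the metric is made up for illustration, and detailed GPU telemetry in practice typically comes from a dedicated exporter such as Nvidia’s DCGM exporter rather than from the operator process itself.

```go
// Sketch of the monitoring hook: register a custom metric with the
// controller-runtime registry so Prometheus can scrape it alongside the
// operator's built-in metrics. The metric name is illustrative.
package controllers

import (
	"github.com/prometheus/client_golang/prometheus"
	"sigs.k8s.io/controller-runtime/pkg/metrics"
)

var gpuNodesReady = prometheus.NewGauge(prometheus.GaugeOpts{
	Name: "gpu_operator_nodes_ready",
	Help: "Number of nodes where all GPU components are reporting ready.",
})

func init() {
	// controller-runtime exposes this registry on the manager's /metrics endpoint.
	metrics.Registry.MustRegister(gpuNodesReady)
}

// A reconcile step would update the gauge after checking component status:
//   gpuNodesReady.Set(float64(readyNodes))
```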

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of the KubeCon + CloudNativeCon event. (* Disclosure: Red Hat Inc. sponsored this segment of theCUBE. Neither Red Hat nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
