Democratizing AI: How Red Hat and Dell are making AI accessible to all
The Red Hat Inc. and Dell Technologies Inc. partnership continues to thrive after a quarter-century of collaboration.
The two companies took center stage at KubeCon NA in Chicago to share their journey through the ever-shifting landscapes of Linux, Kubernetes and OpenShift.
“Linux has made its place in the marketplace, and Kubernetes and OpenShift are really powering the next generation of application deployments that we’re seeing out there,” said Ian Pilcher (pictured, right), senior principal product manager at Red Hat. “It made sense for Red Hat to partner with someone who can handle the hardware, the infrastructure layer for sure. Who better than Dell, with whom we have that existing relationship.”
Pilcher and Michael Wells Jr. (left), engineering technologist at Dell, spoke with theCUBE industry analysts Rob Strechay and Savannah Peterson at KubeCon + CloudNativeCon NA, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed how their partnership is revolutionizing cloud-native computing, democratizing artificial intelligence and reshaping the future of computing itself. (* Disclosure below.)
A quarter century of collaboration
The enduring partnership between Red Hat and Dell has evolved to address the changing landscape of the tech industry. As Linux established itself in the market, Kubernetes and OpenShift emerged as the driving forces behind the next generation of application deployments. Red Hat and Dell recognized the need to integrate hardware and infrastructure seamlessly into this evolving ecosystem, providing customers with a comprehensive and efficient solution, according to Wells.
“One of the things that we just announced was our new Dell-validated design for the OpenShift AI on top of the APEX-style platform for Red Hat OpenShift,” he said. “We have a sample application, which is actually doing generative AI on top of this platform and showing customers how they can get up and running with these types of capabilities without having to go through the process of building and training their own model and all of the work that’s involved in doing that.”
The recent launch of the Dell APEX Cloud Platform for Red Hat OpenShift marked a significant step in simplifying Kubernetes deployment on bare metal. This innovation allows users to start building containers within seconds, eliminating the complexity often associated with Kubernetes adoption.
Kubernetes, with its myriad moving parts, can be challenging for newcomers. However, the Red Hat-Dell collaboration automates many aspects of Kubernetes deployment, making it accessible to organizations of all sizes, according to Wells.
“It’s a lot of moving parts. It’s getting all of that together. It takes time,” he said. “What we’ve been able to do is, working with Red Hat, to design a process where we can automate so much of that for you so that we can get everything up and running, get you into an environment where you can start deploying applications … the same tool that you’re using to manage the cluster and the applications on the cluster, you’re now using that same tooling and processes to manage the infrastructure underneath.”
Democratizing AI and cloud services
The significance of the Dell-validated design for OpenShift lies in its ability to democratize AI. By enabling organizations to leverage AI on-premises, they can work with any proprietary data without moving it elsewhere. This flexibility allows organizations to run models on-prem, at the edge or in public clouds, providing unparalleled freedom and control, according to Pilcher.
“I’m truly excited about the potential that the APEX Cloud Platform has to accelerate and kind of democratize the access to Kubernetes/OpenShift for smaller organizations,” he said. “If we can deploy that now in a matter of hours rather than days or weeks, that’s a tremendous accelerator for them.”
Platform engineering also came up, with Pilcher likening it to a dinosaur in danger of extinction. By fostering collaboration between development and operations teams and enabling them to work with the same tools, however, platform engineering becomes a vital bridge between the hardware and software layers.
It is also important to bring AI to where the data resides, since a significant portion of data remains on-prem, according to Pilcher. The ability to run AI workloads on-prem and at the edge ensures that an organization can make the most of its data without being tied to specific cloud providers.
In a world where powerful computers are becoming more accessible, on-prem and local computing may see a resurgence. This trend aligns with the growing focus on sustainability and the desire to reduce data movement and latency.
“You see a lot of organizations or a lot of countries that have started implementing laws about where data about their citizens has to reside,” Wells said. “The hyperscalers do a great job of building massive data centers, but they can’t just stand up a new region in every sovereign country. You have to be able to make things accessible to your users.”
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of KubeCon + CloudNativeCon NA:
(* Disclosure: TheCUBE is a paid media partner for KubeCon + CloudNativeCon NA. Neither Red Hat Inc. and CNCF, the main sponsors of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)