Cloud-native’s AI conundrum: Can Kubernetes meet the AI challenge?
While Kubernetes boasts widespread adoption, its alignment with the demands of artificial intelligence workloads remains a hot topic.
This week’s KubeCon + CloudNativeCon NA event, a flagship conference of the Cloud Native Computing Foundation, has been rife with debates and predictions, setting the stage for what’s to come at the intersection of AI and cloud-native ecosystems.
“[CNCF] is trying to jump on the bandwagon of AI, but they’re a little late to the game,” said Andy Thurai (pictured, second from right), vice president and principal analyst at Constellation Research, Inc.
While the CNCF has successfully harnessed a vast developer base to propel projects such as Kubernetes to the top, its response to the burgeoning AI wave has been delayed, potentially costing it the first-mover advantage in this technological shift, according to Thurai. Despite that success, the CNCF must catch up, especially as other open-source projects from Google LLC, Microsoft Corp., Amazon Web Services Inc. and Databricks Inc. surge ahead.
Thurai spoke with industry analysts John Furrier (left), Savannah Peterson (second from left) and Dustin Kirkland (right) at KubeCon + CloudNativeCon NA, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed the CNCF’s engagement with AI integration into cloud-native systems and the need for simplifying Kubernetes to better support AI and machine learning workloads. (* Disclosure below.)
Kubernetes: AI integration and the complexity challenge
Skepticism among practitioners about AI’s readiness for production is holding back its integration into cloud-native infrastructure, according to Furrier. The hesitation isn’t without reason.
“Generative AI is not yet baked,” he said. “Skeptical practitioners are more AI aware, less AI in production, mainly because they don’t see a path.”
Because Kubernetes was not initially built for AI, it’s at a disadvantage for pivoting or expanding to accommodate the massive hardware, network and storage requirements of AI and ML workloads, according to Kirkland. Reducing complexity in Kubernetes to make it more accessible for app developers is critical.
“Can Kubernetes pivot into that direction? Or maybe just simply not pivot, but expand into that direction?” Kirkland asked. “Yes, absolutely. Kubernetes is a great abstraction of massive amounts of hardware, network compute and storage … all of which the biggest AI/ML workloads need.”
A keynote address at KubeCon by Tim Hockin, distinguished software engineer at Google Cloud, shed light on this issue, suggesting that the next trillion core hours of Kubernetes usage will likely be driven by AI/ML, although it requires a significant reduction in complexity to be truly effective for app development on a global scale.
The path forward: Observations and predictions
The future of AI and its integration into cloud-native ecosystems is top of mind at KubeCon. While there is massive potential for AI to improve productivity and drive innovation, there are significant challenges integrating complex AI workloads, such as large language models, into current cloud-native structures, according to the analysts.
It’s critical to leverage the cloud-native community’s strength — its developer base — to engage more directly with AI through initiatives such as MLOps and projects like Kubeflow, according to Thurai.
“You have the massive developer community who can write anything,” he said, encouraging the CNCF to embrace this potential. “You have problems to solve. Engage them, bring them in, bring them on board.”
The analyst panel also discussed data management. Not all data is valuable, and some should be processed and discarded to avoid excessive hoarding, according to Kirkland, a view he admits is controversial.
“The amount of money and time and energy that goes into data storage,” he said. “There’s a lot of data out there that is stored because we’ve been trained to … just store everything. We might have a use for it sometime in the future. I don’t know about that, especially at the edge.”
This viewpoint sparked a broader discussion on the balance between storing data for potential use and the practicalities of data management in the age of AI.
It’s clear that while the CNCF may be playing catch-up in the AI domain, there is a recognition of the need to adapt and evolve. As companies continue to integrate AI into their operations and offerings, the cloud-native community must consider how it can support this shift and what role Kubernetes and related technologies will play in the future AI landscape.
The AI train may be moving fast, but there are actionable steps that can be taken to ensure the CNCF and the broader cloud-native community do not get left behind, according to the analysts.
“Cloud-native computing is about the infrastructure underneath whatever is going to sit on top. And maybe historically that’s been SaaS software and web apps,” Kirkland said. “Next, it’s AI/ML or Web3 or whatever it might be. I think we’re looking for just a general purpose compute infrastructure. If AI/ML drives that adoption, great. If it’s something else … so be it.”
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of KubeCon + CloudNativeCon NA:
(* Disclosure: TheCUBE is a paid media partner for KubeCon + CloudNativeCon NA. Neither Red Hat Inc. nor CNCF, the main sponsors of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)
Photo: SiliconANGLE