

KubeCon has long been the heart of cloud-native innovation, and this year’s gathering is no exception. This week’s event focuses on the intersection of Kubernetes and artificial intelligence.
KubeCon North America continues to be a seminal event, especially for the developer community and open-source innovation, according to John Furrier (pictured, left), co-host of theCUBE, SiliconANGLE Media’s livestreaming studio.
“It’s a generational shift … developers love AI,” he said. “You bring that open AI vibe into open source, where all the action is on these models, and it’s going to be very interesting to see how this community translates into AI. What does AI actually do for infrastructure? And then what are the apps going to be running on? It’s a huge conversation.”
As AI takes center stage, there is a significant debate about whether to keep AI open source or go proprietary. Furrier strongly believes in the power of open source, emphasizing that it “continues to power the innovation.”
Furrier spoke with his co-analysts Rob Strechay (right), Savannah Peterson (second from left) and Dustin Kirkland (second from right) during an exclusive broadcast on theCUBE at KubeCon + CloudNativeCon NA. They discussed AI’s central role in open-source innovation, the sustainability challenges of AI technology, Kubernetes’ evolution to support AI workloads, the vibrant community’s growth and the ongoing conversation about open-source business models. (* Disclosure below.)
The event’s kickoff keynote emphasized AI and sustainability. A critical aspect of the AI conversation is the environmental impact, according to Strechay.
“AI is not exactly the greenest of technologies,” he noted, referencing the considerable resources needed to run large AI models, such as ChatGPT. “And I think they’re leaning in and trying to get more developers into some of the incubation. They talked about Kepler going into incubation. And I think those were the two big themes I heard this morning as well.”
AI came up within the first three minutes of the keynote, underscoring its centrality to the Kubernetes ecosystem, according to Kirkland, who emphasized that balancing AI’s potential against power consumption, security and developer experience is crucial.
Furrier acknowledged a generational shift, with developers drawn to AI for its ability to enhance productivity. The integration of AI with open-source communities such as Kubernetes is expected to generate intriguing discussions and innovations.
One particularly insightful observation came from Tim Hockin, distinguished engineer at Google, during a keynote panel discussion. Hockin stressed that Kubernetes was not initially designed with AI in mind but is evolving to accommodate AI workloads, a sentiment echoed by Kirkland, who noted Kubernetes’ adaptability to various domains.
“It has been and is being retooled to run large language models, machine learning workloads and AI in general,” Kirkland said. “I think that’ll be an interesting outcome of this conference. Do we emerge with a Kubernetes that’s even better positioned to run the future of AI?”
Another important theme raised by theCUBE analysts centered on platform engineering’s role in AI and how infrastructure will evolve to incorporate AI more seamlessly. The need for Kubernetes to make effective use of hardware resources such as GPUs for AI workloads was discussed as part of this shift.
“How do you take advantage of Intel, Nvidia, AMD, all of the hardware there, which people who came out of it into platform engineering, they kind of understand that, but they looked at it as the OS taking care of that,” Strechay said. “Now they’re looking at how do we actually go straight from microservice into a GPU, for instance.”
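That path from microservice to GPU already has a concrete shape in Kubernetes: device plugins expose accelerators as schedulable resources that a workload can request. As a rough illustration (not something shown on the panel), the minimal sketch below uses the official Kubernetes Python client to launch an inference pod that asks the scheduler for a single Nvidia GPU; the container image name is a placeholder, and the cluster is assumed to have the Nvidia device plugin installed so "nvidia.com/gpu" is an allocatable resource.

```python
# Minimal sketch: request a GPU for an inference pod via the Kubernetes Python client.
# Assumes the Nvidia device plugin is running in the cluster; the image is hypothetical.
from kubernetes import client, config

config.load_kube_config()  # authenticate using the local kubeconfig

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="llm-inference-demo"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="inference",
                image="example.registry/llm-server:latest",  # placeholder image
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "1"}  # ask the scheduler for one GPU
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```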
As the conversation turned to the broader implications of Kubernetes and AI, Peterson asked whether the enterprise, and society in general, are on the cusp of an “insane AI universe.” One possibility is that Kubernetes may be experiencing its “Linux moment,” becoming as universally applicable as Linux itself, according to Kirkland.
“Even though Kubernetes was built to run one class of workloads, it’s showing quite a bit of malleability into other domains. I think that’s quite powerful,” he said.
Throughout the discussions, a theme of reclassification emerged, with Furrier mentioning that the “rich are going to get richer,” referring to major cloud and hardware providers. However, there is room for smaller players to carve out their own space in this evolving landscape, he added.
The focus of the keynote wasn’t just on the technology, but also on the community driving it. As Peterson aptly put it, “Open source is nothing without the community of contributors,” highlighting the crucial role of communal engagement.
As both a developer and an entrepreneur, Kirkland reflected on the successful business models that have emerged around open source. He highlighted the growing willingness of businesses and venture capitalists to invest in open-source software as they acknowledge the need to fund these initiatives sustainably.
“We’ve seen a number of successful business models enable open source to succeed,” he said.
As KubeCon continues, the discussions are set to delve deeper into how Kubernetes will adapt to the increasing demands of AI workloads. The anticipation for how this will shape the cloud-native landscape is high, and as Peterson hinted, the outcome of this week might just signal the beginning of a new chapter for Kubernetes, AI and the open-source community at large.
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of KubeCon + CloudNativeCon NA:
(* Disclosure: TheCUBE is a paid media partner for KubeCon + CloudNativeCon NA. Neither Red Hat Inc. and CNCF, the main sponsors of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)