![KubeCon + CloudNativeCon NA 2024 GKE feature image](https://d15shllkswkct0.cloudfront.net/wp-content/blogs.dir/1/files/2024/12/KubeCon-CloudNativeCon-NA-2024-feature-image-1.png)
Over the past decade, Kubernetes has evolved from a niche container orchestration tool into a foundational technology, powering cloud-native applications across industries. As Kubernetes enters its next era, its role in enabling scalable, artificial intelligence-driven innovation appears poised to continue, including through Google Kubernetes Engine, also known as GKE.
Since its inception, Kubernetes has grown to include more than 88,000 contributors from more than 8,000 companies across the cloud-native ecosystem. That has led some to wonder whether Kubernetes is having its Linux moment. As the originator of Kubernetes, Google LLC holds a unique position in shaping its development and ecosystem. The technology is now embedded deeply into Google Cloud’s services, but it wasn’t always a clear path for the company, according to Craig McLuckie, co-founder of Kubernetes and senior product manager at Google.
“What started as an internal summer conversation has evolved into a global movement,” McLuckie wrote in a blog post. “Kubernetes is now deployed in thousands of organizations and is supported by over 830 contributors that have collectively put in 237 person-years of coding effort to date — velocity that even our wildest goals didn’t anticipate.”
Though there’s been more than a decade of growth, much more work remains to be done. As the community marks a major milestone, it’s worth exploring the evolution of GKE, its enterprise appeal and how Google seeks to redefine Kubernetes for the future with AI-first, developer-centric enhancements.
This feature is part of SiliconANGLE Media’s exploration of the evolution of cloud-native computing, open-source software and the future of Kubernetes. Be sure to watch theCUBE’s analyst-led coverage of KubeCon + CloudNativeCon NA. (* Disclosure below.)
With 11 years in the rearview mirror, it’s now a new landscape. Most organizations are using Kubernetes in some way, shape or form, according to theCUBE Research Managing Director Rob Strechay.
“The original creator, Google, is looking to reinvent how they provide Kubernetes to organizations with the improvements in Google Kubernetes Engine and the addition of Google Cloud Run,” he said. “They’re aiming to further simplify what is still a very complicated set of open-source packages.”
Central to Google Cloud’s developer appeal is GKE. Launched in 2015, GKE has evolved into a cornerstone of Google’s cloud strategy, enabling enterprises to easily deploy, scale and manage containerized applications. In November, Google Cloud announced that GKE now supports clusters of up to 65,000 nodes, up from 15,000. The growth of generative AI demands such upgrades, according to Gari Singh, product manager at Google Cloud.
“People need these massive clusters. Our internal stuff needs it and our top customers need it. So yeah, the team did a lot of work,” he said during KubeCon + CloudNativeCon NA.
The jump is possible due to Spanner and other technologies. The underlying goal is to solve problems before they emerge and to provide room for innovation, according to Bobby Allen, cloud therapist at Google.
“If you had an environment where we took the barriers off, we took the limits off, what could you do? That’s what we’re trying to do,” he said. “Before you get there, we want to go ahead and break through that barrier, so you can just be unleashed.”
Unlike its rivals, Google Cloud benefits from a unique advantage: It created Kubernetes. That has allowed Google to integrate Kubernetes into its platform through services such as GKE and Google Cloud Run, which is a managed service that allows developers to run containerized applications without managing the underlying infrastructure. It scales applications based on demand, which enables developers to deploy code quickly while focusing on building features rather than handling server maintenance.
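Cloud Run’s contract with an application is minimal: supply a container that listens for HTTP requests on the port the platform injects through the PORT environment variable, and the service handles routing and scaling, including down to zero, as demand changes. The following Python sketch shows what such a container’s entry point might look like; the handler and response text are hypothetical illustrations, not a Google reference sample.

```python
# Minimal sketch of a containerized HTTP service suitable for a managed
# platform like Cloud Run, which injects the listening port via the PORT
# environment variable and scales instances with traffic. Illustrative only.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve a plain-text response; no server or cluster management logic lives here.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello from a managed container\n")


if __name__ == "__main__":
    # Cloud Run sets PORT at runtime; default to 8080 for local testing.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("0.0.0.0", port), Handler).serve_forever()
```

Packaged into a container image, an entry point like this can be deployed and scaled without the application carrying any infrastructure code, which is the appeal described above.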
For developers and IT professionals, Google Cloud’s appeal lies in its focus on scalable cloud-native tools designed to streamline application deployment and management. Services such as GKE and Cloud Run allow teams to build and scale applications with minimal operational overhead.
Looking toward the future, there’s still much to do. Heading into the next KubeCon + CloudNativeCon, the goal will be to provide on-ramps for people who haven’t already been part of the community for years, according to Allen.
“I believe everybody wants the blessing, nobody wants the assignment,” he said. “We all want the benefits of Kubernetes. What Google’s trying to do is we’re giving the builders the building blocks before they need them so we can take off the limits, so you can do something amazing.”
Support for 65,000-node clusters, of course, is intended to provide the scale and compute power needed to handle the most complex and resource-hungry AI workloads. Large language models from companies around the world keep getting bigger and require very large clusters to operate efficiently, Drew Bradstock, senior product director for Kubernetes and serverless at Google Cloud, told SiliconANGLE in an exclusive interview in November.
“It’s not just that they require large clusters. They require clusters that are reliable, scalable and can handle the challenges these large LLM training workloads actually encounter,” he explained.
Google customers have been taking advantage of GKE’s cluster capabilities. That includes AI model developers such as Anthropic PBC.
“GKE’s new support for larger clusters provides the scale we need to accelerate our pace of AI innovation,” said James Bradbury, head of compute at Anthropic, in November in an exclusive interview with SiliconANGLE.
There has been a 900% increase in the use of tensor processing units and graphics processing units on Google Kubernetes Engine, on top of an already substantial base, according to Bradstock. That growth is being driven by the rapid rise of AI, “with AI accounting for the majority of Kubernetes Engine’s usage going forward,” he said.
Over the past 11 years, Kubernetes has grown from a Google-led experiment into an industry-defining platform. It’s driven enterprise adoption through services such as GKE and Cloud Run. As businesses increasingly rely on scalable, AI-powered applications, Kubernetes’ future hinges on its ability to evolve further. As Kubernetes enters its second decade, many see an expanded role in AI and cloud-native applications.
“I really think if we take AI and security and put those two together and really say that, we’ve got secured toolchains, supply chains, models, the whole thing, we can deploy that at the edge in the backend infrastructure and know that our precious data that’s being used to train those models is secured,” said industry analyst Dustin Kirkland at KubeCon + CloudNativeCon Europe. “I really hope that’s mostly a solved problem.”
For Google, the challenge is clear. As LLMs increase in size, the need for computational power will only intensify, and training large models on modern accelerators already requires clusters that exceed 10,000 nodes.
“In addition, thanks to a major overhaul of the GKE infrastructure that manages the Kubernetes control plane, GKE now scales significantly faster, meeting the demands of your deployments with fewer delays,” Google officials wrote in a recent blog. “This enhanced cluster control plane delivers multiple benefits, including the ability to run high-volume operations with exceptional consistency.”
The control plane now automatically adjusts to these operations, according to the company. It is also expected to maintain predictable operational latencies.
“This is particularly important for large and dynamic applications such as SaaS, disaster recovery and fallback, batch deployments and testing environments, especially during periods of high churn,” the company said.
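To make “high churn” concrete, the sketch below uses the official Kubernetes Python client to create and then delete a batch of short-lived Deployments against whatever cluster the current kubectl context points at, such as a GKE cluster. The names, namespace, image and batch size are illustrative assumptions meant only to suggest the kind of high-volume control-plane operations Google describes, not a benchmark of them.

```python
# Illustrative sketch of a high-churn workload: rapidly creating and deleting
# Deployments, the kind of control-plane traffic a large batch or testing
# environment generates. Assumes kubectl is configured for a target cluster.
from kubernetes import client, config


def make_deployment(name: str) -> client.V1Deployment:
    # One single-replica Deployment with a placeholder container image.
    container = client.V1Container(name=name, image="nginx:1.27")
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": name}),
        spec=client.V1PodSpec(containers=[container]),
    )
    spec = client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": name}),
        template=template,
    )
    return client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name=name),
        spec=spec,
    )


if __name__ == "__main__":
    config.load_kube_config()  # use the current kubectl context (e.g., a GKE cluster)
    apps = client.AppsV1Api()

    # Create, then immediately delete, a batch of Deployments to simulate churn.
    names = [f"churn-test-{i}" for i in range(50)]
    for name in names:
        apps.create_namespaced_deployment(namespace="default", body=make_deployment(name))
    for name in names:
        apps.delete_namespaced_deployment(name=name, namespace="default")
```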
Kubernetes’ evolution from a Google side project to a critical enterprise technology highlights its growing influence on cloud-based computing. With Google Cloud expanding GKE’s capabilities to support AI-driven workloads and large-scale applications, Kubernetes appears positioned to remain foundational to modern IT infrastructure.
(* Disclosure: TheCUBE is a paid media partner for KubeCon + CloudNativeCon NA. Neither Red Hat Inc., the headline sponsor of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)