UPDATED 14:22 EDT / APRIL 24 2025

Sudeep Goswami, chief executive officer of Traefik Labs, talks with SiliconANGLE and theCUBE’s Paul Nashawaty about AI gateways on the AppDevANGLE podcast.

Why AI gateways are emerging as cloud-native’s next battleground

There’s a new front opening in the race to build intelligent applications: the intersection of artificial intelligence, application programming interfaces and infrastructure. At KubeCon + CloudNativeCon Europe 2025 in London, the surge of interest in AI wasn’t just about model training or graphics processing unit availability. It was about what happens after you’ve integrated AI, after you’ve deployed the model, and after inferencing becomes your new bottleneck, sparking demand for a new control point: AI gateways.

In the latest episode of theCUBE Research’s AppDevANGLE podcast, Sudeep Goswami, chief executive officer of Traefik Labs, joins SiliconANGLE and theCUBE’s Paul Nashawaty to talk through this evolution, not just as a vendor in the space, but as someone watching patterns emerge across dozens of partners and customers. What he describes is less about the next-gen model and more about the invisible plumbing that connects those models to real-world use.

“If you don’t put guardrails in place,” Goswami said, “bad things can happen very fast.”

From bursty growth to bounded control

Organizations are now building AI applications at an unprecedented pace. TheCUBE Research data shows that AI has moved from pilot to production in record time: Just 18% of production apps used AI a year ago. Now, that number has jumped to over 54%.

But with velocity comes chaos.

Developers are deploying more APIs to connect to more models, often across a hybrid landscape of SaaS platforms, edge environments and multicloud clusters. This explosion is fueling a quiet but critical problem: How do we manage, authenticate and observe these API-fueled, AI-driven workloads?

The answer is emerging in the form of AI gateways and next-generation infrastructure layers that serve as the connective tissue between microservices, models and the people who run them, according to Goswami.

The rise of AI gateways

API gateways have long been a staple of cloud-native architecture, sitting in the background to help route, manage and secure service-to-service communication. But in an AI-first world, the responsibilities are evolving.

Today’s AI gateways must:

  • Handle semantic caching to optimize inference workloads.
  • Enforce data privacy and compliance by governing model inputs and outputs.
  • Integrate with API management tools to prevent shadow AI.
  • Support inferencing across central processing units, GPUs, and edge devices powered by neural processing units or language processing units.
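To make the responsibilities above concrete, here is a minimal sketch of what an AI gateway’s request path might look like: guardrails on inputs plus a semantic cache in front of a downstream model. This is an illustrative toy, not Traefik’s implementation; the `AIGateway` class, the bag-of-words “embedding” and the blocked-terms check are all simplifying assumptions (a production gateway would use a real embedding model and policy engine).

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: bag-of-words counts (a real gateway would call an embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class AIGateway:
    """Illustrative gateway: input guardrails plus a semantic cache in front of a model."""

    def __init__(self, model, blocked_terms=(), threshold=0.8):
        self.model = model                  # downstream inference callable
        self.blocked = set(blocked_terms)   # crude input-governance rule
        self.threshold = threshold          # similarity needed for a cache hit
        self.cache = []                     # list of (embedding, response) pairs

    def handle(self, prompt):
        # Guardrail: refuse prompts containing blocked terms before any inference.
        if self.blocked & set(prompt.lower().split()):
            return {"status": "blocked", "response": None}
        query = embed(prompt)
        # Semantic cache: reuse the answer for a near-duplicate prompt.
        for cached_emb, cached_resp in self.cache:
            if cosine(query, cached_emb) >= self.threshold:
                return {"status": "cache_hit", "response": cached_resp}
        # Only pay for inference on a cache miss, then remember the result.
        response = self.model(prompt)
        self.cache.append((query, response))
        return {"status": "model_call", "response": response}
```

In this sketch, a repeated or near-identical prompt never reaches the model a second time, which is the cost-and-latency argument for semantic caching at the gateway layer rather than inside each application.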

“The more AI you deploy, the more APIs you create,” Goswami said. “And that creates an API management problem.”

Better together: Why partnerships matter

One of the most striking elements of Goswami’s approach is his emphasis on partnerships. Traefik Labs recently announced new reference architectures and integrations with companies such as Nutanix Inc., Oracle Corp., Microsoft Azure and Akamai Technologies Inc. Each is designed to simplify a different piece of the AI delivery puzzle, from secure edge deployment to container-native API workflows.

These aren’t theoretical architectures. They’re prescriptive playbooks built to help developers and architects confidently move from test to production. Goswami called this a “better together” mindset, acknowledging that no vendor operates in a vacuum.

AI takes the wheel, but developers still drive

The conversation wasn’t all infrastructure. It also touched on the cultural shift AI is forcing within development teams. Some companies already use AI to generate code, but Goswami and Nashawaty pushed back on the notion that AI replaces developers.

“Each of us has to elevate,” Goswami said. “If AI does three of my ten tasks, what can I now do with the rest?”

In other words, AI isn’t the end of development, but rather an accelerant. Accountability still matters, arguably more than ever. Goswami shared stories of teams blaming AI-generated bugs on the model, forgetting that humans still have responsibility for quality, governance and reliability.

Looking ahead

As models move from centralized clouds to distributed edge locations, the topology of intelligent applications is shifting. Inference doesn’t need to run in the same data center where it was trained. In fact, for performance, compliance and cost reasons, it probably shouldn’t.

That’s why Traefik Labs is leaning into open-source tooling and small-model inferencing at the edge. It’s also why Goswami is bullish on technologies such as WebAssembly, which offer portable, lightweight runtimes that don’t require heavy infrastructure to operate.

“WebAssembly and AI are becoming the power couple of modern app delivery,” he said.

The bottom line

There’s a reason infrastructure is all the buzz this year: AI is here, and it’s not slowing down. However, to harness it, organizations need more than models. They need maturity in their pipelines, control in their gateways and visibility in their APIs.

In that world, AI gateways aren’t just a feature; they’re a foundation.

Here’s the complete conversation with theCUBE Research’s Paul Nashawaty and Sudeep Goswami, part of theCUBE Research’s AppDevANGLE podcast series:

Photo: SiliconANGLE
