Kong’s API management scales AI capabilities across enterprise infrastructure
API management is becoming a cornerstone of digital transformation as artificial intelligence and APIs increasingly work together to reshape enterprise infrastructure.
With businesses relying more on AI to deliver personalized user experiences, the need for advanced systems to manage and optimize API traffic has never been greater. By integrating AI models with traditional API frameworks, organizations can now process data faster, enhance security measures and streamline governance processes — all while providing developers with the tools to build cutting-edge applications that meet the demands of a rapidly evolving digital landscape.
“It is very important to actually understand what the AI traffic does,” said Marco Palladino, co-founder and chief technology officer of Kong Inc. “If we do understand what is being requested through AI semantically, so [that] we understand the meaning of the prompts, we understand how similar different prompts are with each other. Now, we can implement intelligent capabilities for accelerating and securing and routing, and more, of [this] AI traffic.”
Palladino spoke with John Furrier, executive analyst at theCUBE Research, during a CUBE Conversation. They discussed how Kong is advancing API management by integrating AI, using semantic intelligence and tools such as semantic caching to enhance data processing, security and infrastructure management, enabling businesses to efficiently scale AI capabilities. (* Disclosure below.)
API management and developer tools
Kong is pushing the boundaries of API management by integrating advanced AI infrastructure into its platforms. A key innovation is semantic intelligence, which enhances the ability to understand AI traffic and user intent. With features such as semantic caching, Kong enables AI systems to process data up to 20 times faster while maintaining strong security and compliance protocols, ensuring businesses can rely on AI-powered processes without compromising safety, according to Palladino.
“This year at API Summit, we have announced the introduction of semantic intelligence into modern AI infrastructure,” he said. “That really helps us deliver some advanced capabilities when it comes to AI that ultimately will enable our customers to build faster, better AI experiences to create better products for their customers and their users.”
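Semantic caching, in broad strokes, means keying cached responses on the meaning of a prompt rather than its exact wording, so near-duplicate questions can be answered without another round trip to the model. The snippet below is a minimal Python sketch of that idea only, not Kong’s implementation: it uses a toy bag-of-words embedding and an assumed 0.85 similarity threshold where a production gateway would use a real embedding model and a vector store.

```python
# Minimal sketch of semantic caching: reuse a cached response when a new prompt
# is semantically close enough to one seen before. Illustrative only -- the
# embedding and threshold below are stand-ins, not Kong's implementation.
import math
import re
from collections import Counter


def toy_embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' so the example runs standalone;
    a real gateway would call an embedding model here."""
    return Counter(re.findall(r"[a-z0-9']+", text.lower()))


def cosine_similarity(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


class SemanticCache:
    """Cache responses keyed by prompt meaning: a lookup hits when a stored
    prompt is similar enough, even if the wording differs."""

    def __init__(self, threshold: float = 0.85):
        self.threshold = threshold   # assumed cutoff; tuned per workload in practice
        self.entries = []            # list of (embedding, cached_response)

    def store(self, prompt: str, response: str) -> None:
        self.entries.append((toy_embed(prompt), response))

    def lookup(self, prompt: str):
        query = toy_embed(prompt)
        scored = [(cosine_similarity(query, emb), resp) for emb, resp in self.entries]
        if scored:
            best_score, best_resp = max(scored, key=lambda s: s[0])
            if best_score >= self.threshold:
                return best_resp     # cache hit: skip the model call entirely
        return None                  # cache miss: call the model, then store() the answer


cache = SemanticCache()
cache.store("What is Kong's AI gateway?", "It routes and governs AI traffic.")
print(cache.lookup("what is Kong's AI gateway"))    # similar wording -> cache hit
print(cache.lookup("How do I reset my password?"))  # unrelated -> None (miss)
```

In practice, the similarity threshold is the key tuning knob: set it too low and unrelated prompts share answers; set it too high and only verbatim repeats ever hit the cache.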
As AI becomes central to data access and services, Kong has established itself as a leader in helping organizations adapt their infrastructure. Kong’s evolving API management platforms not only handle traditional API traffic but also manage the complexities of AI workloads. This allows companies to scale AI capabilities while maintaining high levels of security, governance and operational efficiency.
“We need to give our developers the ability to be successful, the chance to be successful so they can really focus on the end-user, AI-powered experiences they’re building for the users and not on all of these common cross-cutting requirements that are needed when running AI at scale,” Palladino explained.
Kong’s Konnect platform, a unified control plane, is a critical piece of this transformation. It offers a comprehensive view of APIs and AI models, empowering development and data teams to manage gateways, service meshes and AI infrastructures more effectively. The introduction of a service catalog further enhances API visibility, giving businesses the tools to streamline their API portfolios and create new products and experiences with greater ease, according to Palladino.
“Konnect provides [us] with that unified end-to-end control plane that allows us to manage the gateways, the meshes, the ingress controllers, the AI infrastructure that we’re deploying from a single pane of glass,” he said. “From that single pane of glass, we’re also providing added capabilities for analytics, for service cataloging, for being able to productize our APIs and AI models as if there were products themselves.”
Kong’s developer platform, Insomnia, is also evolving with expanded capabilities including the collection runner, enabling developers to test APIs at scale. Integrated AI features, including semantic caching, are designed to simplify the development process, allowing developers to focus on innovation rather than managing the complexities of infrastructure, Palladino explained.
“We made some advancements when it comes to Insomnia, which is our developer platform for consuming APIs, building, designing, mocking APIs,” he said. “Insomnia is expanding into developer infrastructure. It’s a new category. It’s a new horizon for Insomnia and we’re essentially making available a subset of the capabilities we ship for AI gateway, like the semantic caching and the semantic security.”
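For a rough sense of what “testing APIs at scale” involves, independent of any particular tool, the sketch below batch-runs a small collection of requests and reports status and latency for each. It is a generic illustration, not Insomnia’s collection runner, and the endpoint and paths are hypothetical placeholders.

```python
# Generic illustration of running a collection of API requests and summarizing
# the results -- the kind of batch testing a collection runner automates.
# The base URL and paths are placeholders, not a real Kong or Insomnia API.
import json
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

BASE_URL = "https://api.example.com"                       # placeholder service under test
COLLECTION = ["/health", "/users/42", "/orders?limit=10"]  # placeholder request paths


def run_request(path: str) -> dict:
    """Send one request and record its status code and latency in milliseconds."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(BASE_URL + path, timeout=5) as resp:
            status = resp.status
    except Exception as exc:        # record failures instead of aborting the whole run
        status = f"error: {exc}"
    elapsed_ms = round((time.perf_counter() - start) * 1000, 1)
    return {"path": path, "status": status, "ms": elapsed_ms}


if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=5) as pool:
        results = list(pool.map(run_request, COLLECTION))
    print(json.dumps(results, indent=2))
```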
Here’s the complete interview with Marco Palladino:
(* Disclosure: Kong Inc. sponsored this segment of theCUBE. Neither Kong nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)