

From consumer products to industrial and enterprise use cases, artificial intelligence is asserting itself as the next frontier of technology innovation. In the same vein, Kong Inc. has made a significant stride by open-sourcing its AI Gateway, which aims to streamline AI application development and foster responsible AI adoption across organizations.
“In the past few months, we worked with developers and organizations that started building AI to ship in their products to build new experiences,” said Marco Palladino (pictured), chief technology officer of Kong. “We noticed that developers keep doing the same things over and over again, so we thought that there could be an opportunity for us to provide modern AI infrastructure to accelerate their productivity as they’re building these new applications.”
Palladino spoke with theCUBE Research executive analyst John Furrier during a CUBE Conversation from SiliconANGLE Media’s livestreaming studio in Palo Alto. They discussed how Kong plans to address the pressing challenge organizations face in setting up AI infrastructure and turning AI solutions into products. (* Disclosure below.)
With the proliferation of AI technologies and the growing need to integrate multiple large language models, developers are encountering hurdles in streamlining their AI workflows. Kong’s AI Gateway emerges as a solution to enhance productivity and provide visibility into AI traffic, crucial for organizations venturing into AI development.
The just-announced plugins for Kong Gateway 3.6 pave the way for developers who want to integrate multiple LLMs into their products, letting them ship those AI capabilities faster while giving architects and platform teams a secure solution with overarching visibility, control and compliance on every AI request.
“With the AI Gateway, what we want to provide is a set of capabilities out of the box for multi-LLM consumption, for security, for credentials, for prompt engineering,” Palladino said. “[But] at the same time also give visibility to the architects and the platform teams on what is the AI traffic that’s being generated to incentivize responsible usage of AI from the organization.”
In addition to multi-LLM integration, Gateway 3.6 brings a slew of new, user-focused capabilities, including central AI credential management, no-code AI integrations, AI prompt decoration, an AI prompt firewall and comprehensive AI egress controls.
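To give a sense of how these capabilities are typically wired up, the sketch below shows a decK-style declarative configuration enabling the AI proxy plugin on a route. The plugin name and config fields reflect Kong’s Gateway 3.6 documentation as best recalled here, but exact field names should be verified against the current plugin docs; the service URL, route path and API key are placeholders.

```yaml
# Illustrative declarative config (verify field names against the
# Kong Gateway 3.6 ai-proxy plugin docs; all values are placeholders).
_format_version: "3.0"
services:
  - name: ai-chat-service
    url: http://localhost:32000  # placeholder upstream
    routes:
      - name: openai-chat
        paths:
          - /openai/chat
        plugins:
          - name: ai-proxy
            config:
              route_type: llm/v1/chat
              auth:
                header_name: Authorization
                header_value: Bearer <OPENAI_API_KEY>  # credential held centrally, not in client apps
              model:
                provider: openai
                name: gpt-4
```

Keeping the credential in the gateway config rather than in each application is what enables the central credential management and per-request visibility described above.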
A notable aspect of the Gateway is the introduction of no-code AI plugins, which enable developers to augment existing API traffic with AI capabilities without writing any code. This innovation simplifies AI integration, making it accessible to a broader range of users, including those with limited coding experience. These plugins support myriad use cases, from language translation to real-time analytics, empowering organizations to leverage AI effortlessly.
“There are two plugins that I’m quite excited about, and these are the AI Requests and Response Transformer plugins,” Palladino said. “These plugins allow us to integrate AI in our existing API traffic without having to write any code. Up until now, all the other plugins imply that, as a developer, I’m building an integration for AI and I am writing code to make that happen. But with these no-code AI plugins, we can get the benefit of AI on top of existing API traffic without having to write any line of code.”
The emergence of a multi-LLM world, where organizations utilize both cloud-based and self-hosted LLMs, is inevitable, according to Palladino. Seeing this, Kong is abstracting the complexity of LLM integration and empowering organizations to harness the full potential of AI while minimizing operational overhead.
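The abstraction Palladino describes can be pictured in a few lines of code. The following is a generic illustration of a single chat interface routing across interchangeable cloud-hosted and self-hosted backends; it is not Kong’s implementation, and the provider functions here are hypothetical stubs standing in for real LLM endpoints.

```python
# Illustrative sketch of the multi-LLM idea: one chat interface, many
# interchangeable backends. Generic example, not Kong's implementation.
from dataclasses import dataclass
from typing import Callable, Dict, Optional

# A "provider" is just a function from prompt -> completion. Real backends
# would call a cloud API or a self-hosted model server.
Provider = Callable[[str], str]

@dataclass
class LLMGateway:
    providers: Dict[str, Provider]
    default: str

    def chat(self, prompt: str, provider: Optional[str] = None) -> str:
        """Route the prompt to the named provider, falling back to the default."""
        backend = self.providers.get(provider or self.default)
        if backend is None:
            raise KeyError(f"unknown provider: {provider}")
        return backend(prompt)

# Stub backends standing in for real LLM endpoints.
gateway = LLMGateway(
    providers={
        "cloud": lambda p: f"[cloud] {p}",
        "self-hosted": lambda p: f"[local] {p}",
    },
    default="cloud",
)

print(gateway.chat("hello"))                          # routed to the default backend
print(gateway.chat("hello", provider="self-hosted"))  # explicitly routed
```

Because callers only see the `chat` interface, swapping a cloud model for a self-hosted one becomes a configuration change rather than an application rewrite, which is the operational overhead the article says Kong is abstracting away.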
“First and foremost, the AI Gateway is fully open source, so it’s free to use,” he said. “Anybody can go and download or run it in the cloud for free. It’s not a commercial product offering. It’s an open-source offering. Kong itself has an ecosystem of plugins that allow it to expand what the product does. We have built six plugins that allow us to perform Layer 7 AI operations on top of any type of AI traffic the developers are generating.”
As AI continues to reshape industries, there’s an industry-wide need to lay out the future of AI integration and its implications for developers and organizations. Organizations must adopt a playbook approach to adoption, where they define clear processes and guidelines for deploying AI solutions, according to Palladino. By focusing on end-user needs and outcomes, organizations can leverage AI to create transformative experiences while minimizing risks and maximizing productivity.
Here’s theCUBE’s complete video interview with Palladino:
(* Disclosure: Kong sponsored this segment of theCUBE. Neither Kong nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)