Kong’s API management platform adds hybrid deployment options
Application programming interface management company Kong Inc. said today it's updating its flagship Kong Enterprise platform, adding new multicloud and multiregion deployment options.
It's also offering new native service mesh support that it says will help customers streamline their information technology infrastructure and reduce overhead.
Kong Enterprise is an API management platform that supports developers building new applications and services. The platform enables end-to-end service lifecycle management for applications and services, from preproduction to postproduction, the company says.
It works by exposing those services and legacy applications as APIs. It also helps scale up and secure APIs as developers rebuild apps on a microservices-based architecture. The platform incorporates artificial intelligence and machine learning technologies to automate many of its processes.
APIs are the preferred method of exposing data and services when building microservices-based apps in software containers that can be deployed across different computing environments. APIs are used to connect different services, making it possible, for example, to book a flight using an app and have that reservation appear automatically in Google Calendar.
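As a purely illustrative sketch of that kind of glue (the endpoint and payload below are hypothetical, not a real airline or calendar API), one service pushes a JSON event to another over HTTP:

```go
// Illustrative only: posts a JSON "event" to a hypothetical calendar-style API
// after a booking, showing how two services are wired together through an API.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

type calendarEvent struct {
	Title string `json:"title"`
	Start string `json:"start"`
	End   string `json:"end"`
}

func main() {
	event := calendarEvent{
		Title: "Flight UA 123",
		Start: "2020-09-01T08:00:00Z",
		End:   "2020-09-01T11:00:00Z",
	}
	body, _ := json.Marshal(event)

	// The URL is a placeholder for whatever calendar service the app integrates with.
	resp, err := http.Post("https://calendar.example.com/v1/events", "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("calendar API responded with", resp.Status)
}
```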
Kong Enterprise 2.1 introduces a new “Hybrid Mode” deployment option that helps developers use declarative configurations to deploy cloud-native data planes, or software that processes data requests, across multiple clouds and data center environments. This enables those data planes to be managed via a central control plane, Kong said.
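To make that concrete, here is a minimal sketch, not taken from Kong's documentation, of what central management can look like: a Go program registers a service and a route against the control plane's Admin API (the standard /services and /routes endpoints), and the connected data planes pick up the new configuration. The hostnames, port and service names are placeholders.

```go
// Rough sketch: in a hybrid deployment the Admin API is served by the control plane,
// so registering a service and route there is enough for remote data planes to start
// proxying it. Host, port and names below are placeholders.
package main

import (
	"fmt"
	"net/http"
	"net/url"
	"strings"
)

const controlPlaneAdmin = "http://kong-cp.example.internal:8001" // hypothetical control-plane Admin API

func post(path string, form url.Values) error {
	resp, err := http.Post(controlPlaneAdmin+path, "application/x-www-form-urlencoded",
		strings.NewReader(form.Encode()))
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	fmt.Println(path, resp.Status)
	return nil
}

func main() {
	// Create a service pointing at an upstream application, then a route that exposes it.
	// Data planes connected to this control plane, in any cloud or region, receive the
	// change without being configured individually.
	if err := post("/services", url.Values{"name": {"flights"}, "url": {"http://flights.internal:8080"}}); err != nil {
		panic(err)
	}
	if err := post("/services/flights/routes", url.Values{"paths[]": {"/flights"}}); err != nil {
		panic(err)
	}
}
```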
Demetry Zilberg, a senior vice president at Kong customer FactSet Research Systems Inc., said the new Hybrid Mode will help his company to deploy API endpoints across several cloud providers. “This ability will provide the multicloud and multiregion flexibility we need to scale effectively and provide a more seamless experience for clients,” Zilberg said.
Kong said its platform now provides native support for Kong Mesh, a service mesh built atop the open-source Kuma and Envoy projects. Kong Mesh creates an abstraction layer across networking environments that makes it easier to deploy distributed applications within them. It works by automatically routing application traffic across multiple network underlays, doing away with the need to configure each application service for a specific network.
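As a conceptual sketch of what that means for application code (not an excerpt from Kong's documentation), a service simply calls a peer by a logical name and lets the sidecar proxies handle discovery, routing and encryption; the ".mesh" hostname follows Kuma's naming convention, but the exact name and endpoint here are assumptions.

```go
// Conceptual sketch: inside a service mesh the application addresses peers by a
// logical name and leaves routing, retries and mutual TLS to the sidecar proxies.
// The ".mesh" hostname mirrors Kuma's naming convention; treat it as illustrative.
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// No cloud-specific IPs, load balancers or TLS settings here; the Envoy sidecar
	// running next to this service resolves "inventory.mesh" and routes the call.
	resp, err := http.Get("http://inventory.mesh/stock/42")
	if err != nil {
		fmt.Println("call through the mesh failed:", err)
		return
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println("inventory service replied:", string(body))
}
```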
Other updates announced today include an integration with new Kong Studio plugins that brings more automation capabilities to the API lifecycle, the ability to build new Kong plugins in the Golang programming language, an improved app registration process and enhanced visibility and reporting via the Kong Manager dashboard.
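For a sense of what the new Go support enables, the sketch below shows the general shape of a Kong plugin written with the open-source go-pdk package; the configuration field, header name and logic are invented for illustration, and build and packaging details are omitted.

```go
// Minimal sketch of a Kong plugin written in Go with the go-pdk package.
// The config field and header name are illustrative only.
package main

import "github.com/Kong/go-pdk"

// Config holds the plugin's configuration as set in Kong.
type Config struct {
	Message string `json:"message"`
}

// New is called by Kong's Go plugin machinery to create a config instance.
func New() interface{} {
	return &Config{}
}

// Access runs in the access phase of each proxied request and adds a
// response header whose value comes from the plugin configuration.
func (conf Config) Access(kong *pdk.PDK) {
	_ = kong.Response.SetHeader("x-hello-from-go", conf.Message)
}
```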
“As technology teams build services and applications on Kubernetes and other platforms that span multiple availability zones, data centers and clouds, they need to ensure that service connectivity is reliable and performant,” Kong co-founder and Chief Technology Officer Marco Palladino said in a statement. “Kong Enterprise is the only enterprise-grade platform providing an API gateway, Kubernetes Ingress Controller and service mesh, with the ability to run data planes decoupled from their control planes.”
Image: Kong