About Kong
The API and AI Gateway
Kong is an open-source API and AI gateway designed for cloud-native architectures. It acts as a unified control plane for managing, securing, and observing both traditional microservice traffic and AI/LLM (Large Language Model) traffic. Extensible through Lua plugins, Kong can serve as a reverse proxy, a Kubernetes ingress controller, or a proxy in front of AI providers such as OpenAI, making it straightforward to add AI capabilities to existing applications. Its distinguishing value lies in bridging DevOps and LLMOps, offering rate limiting, authentication, and observability for generative AI endpoints. As a free, GitHub-hosted project, Kong lets developers build scalable, secure systems without vendor lock-in.
Common Use Cases
- Centralize and secure access to multiple internal or external AI model APIs (like OpenAI) through a single gateway.
- Implement rate limiting, authentication, and logging for LLM endpoints to control costs and ensure security.
- Manage and route traffic for microservices in Kubernetes environments as a robust ingress controller.
- Develop custom plugins with Lua to extend gateway functionality for specific business or AI workflow needs.
- Create a serverless AI gateway to proxy and optimize requests between applications and cloud AI services.
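Several of these use cases can be combined in one Kong declarative configuration. The sketch below is a minimal example, assuming Kong 3.x running in DB-less mode; the service name, route path, and per-minute quota are illustrative placeholders, not prescribed values:

```yaml
# kong.yml — declarative configuration (DB-less mode)
_format_version: "3.0"

services:
  # Proxy requests under /openai to the OpenAI API
  - name: openai-proxy           # assumed name; choose your own
    url: https://api.openai.com
    routes:
      - name: openai-route
        paths:
          - /openai
    plugins:
      # Cap LLM request volume to control costs
      - name: rate-limiting
        config:
          minute: 60             # illustrative quota
          policy: local
      # Require an API key on every request
      - name: key-auth
```

With a file like this loaded, clients call the gateway path (e.g. `/openai/v1/chat/completions`) instead of the provider directly, and Kong enforces authentication and rate limiting on the way through.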
Key Features
- Lua
- Open Source
- GitHub Hosted
How to Get Started
Usage Statistics
- Active Users: 43,094
- API Calls: 5,107,000
Additional Information
- Category: Generative AI
- Pricing: Free
- Last Updated: 4/3/2026