# APIPark

> Open-source LLM gateway that provides unified API compatibility, multi-LLM management, load balancing, and fine-grained traffic controls for production deployments.

APIPark is an open-source LLM gateway and API platform that enables unified management and invocation of multiple large language models in production. It provides an OpenAI-compatible API signature so existing integrations work without code changes, and offers tools for load balancing, quota management, and developer portals. APIPark focuses on stability, security, and operational controls for enterprise LLM use while remaining deployable via source code and a simple CLI.

- **Unified API signature** — Use the OpenAI-compatible API signature to connect multiple LLM providers without changing existing client code; get started by deploying the APIPark gateway and pointing your client at its endpoint (see the example client sketch at the end of this page).
- **Multi-LLM management & load balancing** — Route requests across many LLMs with configurable load balancing and priority rules to improve resilience and performance; configure backends and weights in the management dashboard or config files.
- **Fine-grained traffic control** — Define quotas and priorities per tenant or model to limit costs and enforce SLAs; set quotas through the dashboard or tenant configuration APIs.
- **Prompt management & API conversion** — Create and manage prompt templates and convert prompts plus models into shareable APIs for developer teams; edit templates in the portal and publish them as APIs.
- **Security & access control** — Built-in API authentication and access management protect internal APIs and LLM usage; enable access control and billing options during deployment.
- **Observability & billing** — Monitor LLM traffic and usage with dashboards and optional API billing to monetize or track API consumption.
- **Semantic caching (coming soon)** — Planned semantic caching features aim to reduce upstream LLM calls and lower latency for common queries.

To get started, deploy APIPark from source using the provided CLI command, register your LLM backends (or use the built-in OpenAI-compatible connector), and configure routing, quotas, and prompt templates via the management UI or configuration files.

## Features

- Open-source LLM gateway
- Unified OpenAI-compatible API signature
- Multi-LLM routing and load balancing
- Fine-grained quotas and traffic control
- Prompt template management and API conversion
- Built-in API authentication and access control
- Monitoring dashboards and API billing
- Semantic caching (planned)

## Integrations

200+ LLM providers (via connectors), OpenAI-compatible APIs, GitHub (repository and issues), Developer portals / API consumers

## Platforms

DEVELOPER_SDK

## Pricing

Open Source

## Version

1.8

## Links

- Website: https://apipark.com/
- Documentation: https://docs.apipark.com/docs/release
- Repository: https://github.com/APIParkLab/APIPark
- EveryDev.ai: https://www.everydev.ai/tools/apipark
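
## Example: calling the gateway with an OpenAI SDK client

Because APIPark exposes an OpenAI-compatible API signature, an existing OpenAI SDK client can usually be repointed at the gateway rather than at the upstream provider. The sketch below is illustrative only: the base URL, API key, and model name are placeholder assumptions, and the real values come from your own APIPark deployment and its access-control and routing configuration.

```python
from openai import OpenAI

# All values below are hypothetical placeholders; substitute the endpoint,
# key, and model name configured in your APIPark deployment.
client = OpenAI(
    base_url="http://localhost:8080/v1",  # APIPark gateway endpoint (assumed)
    api_key="APIPARK_TENANT_API_KEY",     # key issued via APIPark access control (assumed)
)

# The request shape is the standard OpenAI chat-completions call; the gateway
# routes it to a backend LLM according to its load-balancing and priority rules.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # model name as registered with the gateway (assumed)
    messages=[{"role": "user", "content": "Summarize what an LLM gateway does."}],
)
print(response.choices[0].message.content)
```

Because only the `base_url` and credentials change, quotas, routing, and observability can be layered in at the gateway without touching application code.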