Unified API to call 100+ LLM providers in OpenAI format
LiteLLM is an open-source Python SDK and proxy server that provides a unified interface for calling 100+ LLM APIs in an OpenAI-compatible format. It translates inputs and outputs across providers such as OpenAI, Anthropic, Azure, Bedrock, Vertex AI, and Cohere, so you can switch providers without rewriting code. The library includes spend tracking, per-key and per-user budget management, load balancing, fallback routing, and rate limiting. The proxy server handles API key management and logging, and integrates with observability tools such as Langfuse, LangSmith, and OpenTelemetry. Self-host for free, or contact sales for Enterprise features including SSO, guardrails, and audit logs. Used by Netflix, Lemonade, and RocketMoney; 30,000+ GitHub stars.
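The "unified interface" amounts to one call shape for every provider: OpenAI-style chat messages plus a model string, where the provider is selected by a `provider/model` prefix. A minimal sketch of that convention (the actual `litellm.completion` call is shown in comments so the snippet runs without API keys or network access; the specific model names are illustrative, not guaranteed current):

```python
# OpenAI-style chat messages work unchanged across providers.
messages = [{"role": "user", "content": "Summarize LiteLLM in one sentence."}]

# The provider is chosen by prefixing the model name; an unprefixed
# name defaults to OpenAI. Request/response shapes stay OpenAI-compatible.
models = [
    "gpt-4o",                                  # OpenAI (no prefix)
    "anthropic/claude-3-5-sonnet-20240620",    # Anthropic
    "bedrock/anthropic.claude-v2",             # AWS Bedrock
]

# With litellm installed and provider keys set in the environment,
# every call looks identical regardless of backend:
#
#   from litellm import completion
#   for model in models:
#       response = completion(model=model, messages=messages)
#       print(response.choices[0].message.content)

# Offline illustration of how the prefix routes to a provider:
for model in models:
    provider = model.split("/")[0] if "/" in model else "openai"
    print(f"{provider:10s} -> {model}")
```

Because only the model string changes between providers, fallback routing and load balancing (also listed above) reduce to retrying the same `messages` payload against the next model in a list.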