BerriAI/litellm
A Python SDK and AI Gateway unifying access to 100+ LLM APIs in an OpenAI-compatible format, offering cost tracking, load balancing, and guardrails.
Core Features
Quick Start
pip install 'litellm[proxy]' && litellm --model gpt-4o

Detailed Introduction
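The quick-start command above serves a single model; the proxy can also be driven by a config file, which is how multiple deployments and load balancing are set up. A minimal sketch, assuming API keys live in environment variables; the Azure deployment name is a placeholder:

```yaml
# config.yaml -- illustrative; deployment names are placeholders
model_list:
  - model_name: gpt-4o                 # public name clients request
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: gpt-4o                 # same public name -> requests are
    litellm_params:                    # load-balanced across both entries
      model: azure/my-gpt4o-deployment
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY
```

Start the gateway with `litellm --config config.yaml`; clients then call it as if it were the OpenAI API.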
LiteLLM is an open-source AI Gateway and Python SDK that simplifies integrating and managing 100+ LLM APIs. It exposes a single, OpenAI-compatible interface, so developers can switch between providers such as OpenAI, Anthropic, Azure, and Google Vertex AI without changing application code. Beyond basic API calls, LiteLLM adds enterprise features such as cost tracking, load balancing across deployments, and guardrails, supporting efficient, secure, and scalable LLM operations. It can be used as an in-process SDK or self-hosted as an enterprise-ready proxy server, making it a practical foundation for resilient AI applications.