AI Gateway & LLM Orchestration Platform
43.1k stars · Updated 2026-04-13

BerriAI/litellm

A Python SDK and AI Gateway unifying access to 100+ LLM APIs in an OpenAI-compatible format, offering cost tracking, load balancing, and guardrails.

Core Features

Unified API for 100+ LLMs (OpenAI format)
Cost tracking and management
Intelligent load balancing across providers
Guardrails for secure LLM usage
Self-hosted or enterprise-ready proxy server
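The cost-tracking feature above amounts to pricing each request from its token usage. A minimal conceptual sketch of that idea, using an illustrative price table (the prices and the `request_cost` helper are placeholders for this sketch, not LiteLLM's real pricing data or API):

```python
# Conceptual sketch of per-request cost tracking.
# Prices are illustrative placeholders, NOT real provider rates.
PRICES_PER_1K = {            # USD per 1K tokens: (input, output)
    "gpt-4o": (0.005, 0.015),
    "claude-3-5-sonnet": (0.003, 0.015),
}

def request_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Price one call from its reported token usage."""
    in_price, out_price = PRICES_PER_1K[model]
    return (prompt_tokens / 1000) * in_price + (completion_tokens / 1000) * out_price

# A call that used 1,000 prompt and 1,000 completion tokens:
cost = request_cost("gpt-4o", 1000, 1000)  # 0.005 + 0.015 = 0.02 USD
```

Summing `request_cost` over all calls, keyed by API key or team, is the essence of per-user spend tracking.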

Quick Start

pip install 'litellm[proxy]' && litellm --model gpt-4o
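Once the proxy from the command above is running, any OpenAI-compatible client can talk to it over HTTP. A stdlib-only sketch, assuming the proxy's default address of `localhost:4000` (adjust the URL and any auth header to your deployment):

```python
# Send an OpenAI-style chat completion request to a running
# LiteLLM proxy using only the Python standard library.
import json
import urllib.request

PROXY_URL = "http://localhost:4000/v1/chat/completions"  # default port, an assumption

body = json.dumps({
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}],
}).encode()

req = urllib.request.Request(
    PROXY_URL,
    data=body,
    headers={"Content-Type": "application/json"},
)

if __name__ == "__main__":
    # Requires the proxy from the Quick Start to be running locally.
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request shape is the OpenAI chat-completions schema, existing OpenAI client code can be pointed at the proxy by changing only the base URL.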

Detailed Introduction

LiteLLM is an open-source AI Gateway and Python SDK that simplifies the integration and management of 100+ large language models (LLMs). It exposes a unified, OpenAI-compatible API, letting developers switch seamlessly between providers such as OpenAI, Anthropic, Azure, and Google Vertex AI. Beyond simple API calls, LiteLLM adds enterprise features: cost tracking, intelligent load balancing, and robust guardrails for efficient, secure, and scalable LLM operations. It can be self-hosted or deployed as an enterprise-ready proxy server, making it a solid foundation for building resilient AI applications.
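The load-balancing idea mentioned above can be illustrated with a small conceptual sketch: route each call to whichever provider deployment currently has the fewest in-flight requests. This is an illustration of the technique, not LiteLLM's actual router implementation, and the deployment names are hypothetical:

```python
# Conceptual "least busy" load balancer across provider deployments.
# An illustration of the routing idea, NOT LiteLLM's internal code.
from dataclasses import dataclass

@dataclass
class Deployment:
    name: str          # hypothetical names, e.g. "openai/gpt-4o"
    in_flight: int = 0

class LeastBusyRouter:
    def __init__(self, deployments):
        self.deployments = list(deployments)

    def acquire(self) -> Deployment:
        # Pick the least-loaded deployment and mark it busy.
        chosen = min(self.deployments, key=lambda d: d.in_flight)
        chosen.in_flight += 1
        return chosen

    def release(self, d: Deployment) -> None:
        # Call when the request completes.
        d.in_flight -= 1

router = LeastBusyRouter([Deployment("openai/gpt-4o"), Deployment("azure/gpt-4o")])
first = router.acquire()   # both idle -> first in list
second = router.acquire()  # first is now busy -> other deployment
```

Real gateways layer retries, cooldowns for failing deployments, and per-key rate limits on top of a policy like this, but the core decision is the same: spread traffic by current load.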
