mudler/LocalAI
An open-source AI engine that allows running various AI models (LLMs, vision, voice, image, video) locally on any hardware, including CPU-only, with drop-in API compatibility for commercial services.
Core Features
Quick Start
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest
Detailed Introduction
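Once the container is up, LocalAI serves an OpenAI-compatible API on port 8080, so any OpenAI-style client can talk to it at `http://localhost:8080/v1/chat/completions`. A minimal sketch of building such a request follows; the model name is a placeholder, and the send step is commented out since it requires the server to be running:

```python
import json
import urllib.request

# LocalAI's OpenAI-compatible chat endpoint (assumes the docker command
# above is running and publishing port 8080 on localhost).
BASE_URL = "http://localhost:8080/v1/chat/completions"

payload = {
    "model": "your-model-name",  # placeholder: use any model you have installed
    "messages": [
        {"role": "user", "content": "Summarize what LocalAI does in one sentence."},
    ],
    "temperature": 0.7,
}

body = json.dumps(payload).encode("utf-8")
request = urllib.request.Request(
    BASE_URL,
    data=body,
    headers={"Content-Type": "application/json"},
)

# To actually send the request (requires a running LocalAI instance):
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])

print(json.dumps(payload, indent=2))
```

Because the request shape matches the OpenAI Chat Completions format, existing OpenAI SDKs can usually be pointed at LocalAI just by overriding their base URL.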
LocalAI is a powerful open-source AI engine designed to bring advanced AI capabilities directly to your local infrastructure. It supports a wide array of AI models, including large language models, vision, voice, image, and video models, and can run them efficiently on virtually any hardware, even without a dedicated GPU. Its key strength lies in offering drop-in API compatibility with popular commercial services like OpenAI, Anthropic, and ElevenLabs, enabling seamless integration for developers. With multi-user support, robust privacy features, and built-in AI agents, LocalAI empowers users to build and deploy private, versatile, and scalable AI applications.