Local AI Inference Platform
45.1k stars · 2026-04-09

mudler/LocalAI

An open-source AI engine that runs a wide range of AI models (LLMs, vision, voice, image, video) locally on any hardware, including CPU-only machines, with drop-in API compatibility for commercial services.

Core Features

OpenAI, Anthropic, and ElevenLabs API compatibility
Run diverse AI models (LLMs, vision, voice, image, video) on any hardware (CPU, NVIDIA, AMD, Intel, Apple Silicon, Vulkan)
Multi-user ready with API key authentication, quotas, and role-based access
Privacy-first design ensures data remains within your infrastructure
Built-in AI agents with tool use, RAG, and skills
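Because the API surface is OpenAI-compatible, API-key authentication works the way existing clients already expect: a standard Bearer token in the Authorization header. A minimal sketch, assuming key auth is enabled on your instance (the key value below is a placeholder):

```python
# Sketch: headers for an authenticated request against a LocalAI server
# that has API-key authentication enabled. The key is a placeholder —
# substitute whatever key you configured for the instance.

def auth_headers(api_key: str) -> dict:
    """Standard OpenAI-style Bearer-token request headers."""
    return {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }

print(auth_headers("sk-local-example"))
```

Existing OpenAI client libraries send exactly this header, which is what makes the drop-in swap possible: only the base URL and key change.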

Quick Start

docker run -ti --name local-ai -p 8080:8080 localai/localai:latest
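Once the container is up, the server listens on port 8080 and speaks the OpenAI wire format. A stdlib-only sketch of querying it — the model name is an assumption and must match a model actually configured in your LocalAI instance, and the API key is only needed if key authentication is enabled:

```python
import json
import urllib.request
from typing import Optional

def build_chat_request(prompt: str,
                       base_url: str = "http://localhost:8080",
                       model: str = "gpt-4",  # assumption: a model configured locally
                       api_key: Optional[str] = None) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request for a local LocalAI server."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    headers = {"Content-Type": "application/json"}
    if api_key:  # only required when API-key authentication is enabled
        headers["Authorization"] = f"Bearer {api_key}"
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions", data=payload, headers=headers
    )

req = build_chat_request("Say hello")
# Uncomment once the container from the Quick Start is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Pointing an existing OpenAI SDK at the same base URL (`http://localhost:8080/v1`) works the same way, since the endpoint paths and payloads match.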

Detailed Introduction

LocalAI is a powerful open-source AI engine designed to bring advanced AI capabilities directly to your local infrastructure. It supports a wide array of AI models, including large language models, vision, voice, image, and video models, and can run them efficiently on virtually any hardware, even without a dedicated GPU. Its key strength lies in offering drop-in API compatibility with popular commercial services like OpenAI, Anthropic, and ElevenLabs, enabling seamless integration for developers. With multi-user support, robust privacy features, and built-in AI agents, LocalAI empowers users to build and deploy private, versatile, and scalable AI applications.


© 2026 OSS Alternative. hotgithub.com - All rights reserved.