Self-hosted AI Development Environment
3.7k stars · 2026-04-18

coleam00/local-ai-packaged

A Docker Compose template for quickly bootstrapping a self-hosted local AI and low-code development environment with integrated tools like Ollama, Supabase, and n8n.

Core Features

Integrated Local AI Stack: Bundles Ollama for LLMs, Open WebUI for chat, and Supabase for database/vector store.
Low-Code AI Workflow Automation: Leverages n8n and Flowise for building and managing AI agents and workflows.
Comprehensive AI Tooling: Includes Qdrant (vector store), Neo4j (knowledge graph), Langfuse (observability), and SearXNG (metasearch).
Self-Hosted & Privacy-Focused: Runs entirely on your own hardware, with Caddy providing managed HTTPS, so data never leaves your infrastructure.
Rapid Deployment: Utilizes Docker Compose for quick setup of a complex multi-service environment.

Quick Start

git clone -b stable https://github.com/coleam00/local-ai-packaged.git
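After cloning, a typical bring-up for a Docker Compose template like this looks as follows. This is a hedged sketch: the environment-template filename and the service name used in the log command are assumptions, not taken from the project; consult the repository's own README for the authoritative steps (it may ship its own launcher script).

```shell
cd local-ai-packaged

# Copy the environment template and fill in required secrets
# (database passwords, encryption keys, etc.).
# NOTE: ".env.example" is an assumed name -- use whatever template the repo ships.
cp .env.example .env

# Start the whole stack in the background; the first run pulls many images.
docker compose up -d

# Check that a service is healthy, e.g. tail Ollama's logs.
# NOTE: the service name "ollama" is an assumption about the compose file.
docker compose logs -f ollama
```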

Detailed Introduction

This project provides a robust, self-hosted AI development environment packaged as a Docker Compose template. It integrates a suite of powerful open-source tools, including Ollama for running local large language models, Open WebUI for an intuitive chat interface, and Supabase for database, vector store, and authentication needs. Additionally, it incorporates n8n and Flowise for low-code AI agent development, Qdrant and Neo4j for advanced data handling, and Langfuse for LLM observability. Designed for privacy and rapid deployment, it empowers developers to build and manage sophisticated AI workflows locally without relying on external cloud services.
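To make the architecture concrete, the services described above are wired together in Compose roughly as sketched below. This is an illustrative fragment, not the project's actual compose file: service names, image tags, and ports are assumptions based on the tools listed, though `OLLAMA_BASE_URL` is Open WebUI's documented way to point at an Ollama endpoint.

```yaml
# Illustrative sketch only -- not the project's real docker-compose.yml.
services:
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama_data:/root/.ollama      # persist downloaded models across restarts

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Reach Ollama by service name over the Compose-internal network.
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"                    # host port is an arbitrary choice here
    depends_on:
      - ollama

  qdrant:
    image: qdrant/qdrant:latest
    volumes:
      - qdrant_data:/qdrant/storage    # persist vector collections

volumes:
  ollama_data:
  qdrant_data:
```

The key design point this illustrates is that containers address each other by service name on the Compose network, so only the user-facing UIs need published host ports.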


© 2026 OSS Alternative. hotgithub.com - All rights reserved.