CLI Tool / Local AI Inference Platform
2.8k 2026-04-18

janhq/cortex.cpp

A local AI API platform designed to run various AI models (vision, speech, language) on local hardware with an OpenAI-compatible API.

Core Features

Local AI model inference
Multi-engine support (e.g., llama.cpp)
Hardware optimization (GPU detection)
OpenAI-compatible API
Cross-platform installers (Windows, macOS, Linux)

Quick Start

curl -s https://raw.githubusercontent.com/menloresearch/cortex/main/engine/templates/linux/install.sh | sudo bash
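Once the installer finishes, a typical first session looks something like the sketch below. The subcommand names (`pull`, `run`) and the model name are assumptions based on the upstream docs, not confirmed by this page; run `cortex --help` to see the actual command list on your install.

```shell
# Hedged sketch: typical first steps after installing Cortex.
# Subcommand and model names are assumptions; verify with `cortex --help`.
if command -v cortex >/dev/null 2>&1; then
  cortex pull tinyllama   # download a small model from the registry
  cortex run tinyllama    # start an interactive chat session with it
else
  echo "cortex is not installed; run the install script above first"
fi
```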

Detailed Introduction

Cortex is an open-source local AI API platform for running a diverse range of AI models, including vision, speech, language, tabular, and action models, directly on local hardware. It offers multi-engine support, initially integrating `llama.cpp`, and automatic hardware optimization for NVIDIA, AMD, and Intel GPUs. By exposing an OpenAI-compatible API, Cortex aims to serve as a flexible "brain for robots" that operates independently of cloud services, making it well suited to local AI development, experimentation, and deployment where privacy and control matter.
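Because the API is OpenAI-compatible, any OpenAI client library or plain `curl` can talk to a local Cortex server. A minimal sketch of a chat-completions request follows; the port (39281) and model name are assumptions and may differ on your installation.

```shell
# Hedged sketch: calling Cortex's OpenAI-compatible chat endpoint.
# The base URL port and the model identifier below are assumptions;
# adjust them to match your local server configuration.
BASE_URL="http://localhost:39281/v1"
BODY='{"model":"llama3.1:8b-gguf","messages":[{"role":"user","content":"Hello"}]}'

# With a Cortex server running locally, send the request with curl:
# curl "$BASE_URL/chat/completions" -H "Content-Type: application/json" -d "$BODY"
echo "$BODY"
```

Since the request and response shapes follow the OpenAI spec, existing OpenAI SDKs can usually be pointed at the local server just by overriding the base URL.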
