Tags: #edge-ai
alibaba/MNN
A lightweight, high-performance inference engine from Alibaba, powering on-device LLMs and edge AI.
openvinotoolkit/openvino
OpenVINO is an open-source toolkit designed to optimize and deploy deep learning models for efficient AI inference across diverse hardware platforms, from edge to cloud.
autodistill/autodistill
Autodistill automates the process of training small, fast supervised models from unlabeled images by leveraging large foundation models, eliminating the need for manual data labeling.
vitoplantamura/OnnxStream
A lightweight C++ inference library for ONNX models, enabling low-memory execution of large AI models like Stable Diffusion XL and Mistral 7B on diverse hardware, from Raspberry Pi Zero 2 to servers.
nullclaw/nullclaw
NullClaw bills itself as the fastest and smallest fully autonomous AI assistant infrastructure, written in Zig and designed for resource-constrained environments.
memovai/mimiclaw
MimiClaw is an embedded AI assistant platform that runs OpenClaw on a $5 ESP32-S3 chip, offering a tiny, low-power, self-contained personal AI agent that operates without a traditional OS.