Tags: #inference-engine
alibaba/MNN — Deep Learning Inference Engine (14.9k stars)
A blazing-fast, lightweight inference engine from Alibaba, powering high-performance on-device LLMs and Edge AI.
mlc-ai/mlc-llm — Machine Learning Compiler and LLM Deployment Engine (22.3k stars)
A universal machine learning compiler and high-performance engine for deploying large language models efficiently across diverse hardware platforms.