Node.js Library for Local AI Inference
Node.js
2.0k
withcatai/node-llama-cpp
A Node.js library providing bindings for llama.cpp, enabling local AI model inference with advanced features like JSON schema enforcement and function calling.
Mobile AI Application
Android
2.2k
xororz/local-dream
Runs Stable Diffusion directly on Android devices, using the Snapdragon NPU for accelerated image generation with fallback to CPU/GPU.
C Library, Command-line Utility Suite
C
3.5k
libarchive/libarchive
A portable C library and suite of command-line tools for reading, writing, and manipulating a wide array of archive and compression formats.