containers/ramalama — Developer Tool for AI Model Serving
Ecosystem & Stack: Podman, Docker · 2.7k stars
RamaLama simplifies local serving and production inference of AI models from any source by applying familiar container patterns, eliminating complex host-system configuration.
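To illustrate the container-style workflow, a minimal command sketch (subcommand names are as recalled from RamaLama's documentation; the model reference `ollama://tinyllama` is only an example — verify both against the project README):

```shell
# Pull a model from a registry-style source (here, the Ollama registry)
ramalama pull ollama://tinyllama

# Chat with it interactively; the runtime is pulled and run in a container
ramalama run ollama://tinyllama

# Or expose it as a local HTTP inference endpoint
ramalama serve ollama://tinyllama
```

The point of the pattern is that the host needs only a container engine (Podman or Docker); runtimes and GPU libraries ship inside the image.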
mostlygeek/llama-swap — Local AI Model Management Proxy
Ecosystem & Stack: Docker · 3.2k stars
A high-performance Go proxy that hot-swaps between multiple local generative AI models on demand, exposing them behind OpenAI- and Anthropic-compatible APIs.
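A sketch of what a llama-swap configuration might look like (field names such as `models` and `cmd`, and the `${PORT}` macro, are recalled from the project README and should be treated as assumptions to verify):

```yaml
# llama-swap config sketch: each entry tells the proxy how to launch a
# backend server; the proxy starts and stops backends on demand as
# requests for different model names arrive.
models:
  "qwen-small":
    cmd: llama-server --port ${PORT} -m /models/qwen2.5-1.5b.gguf
  "llama-8b":
    cmd: llama-server --port ${PORT} -m /models/llama-3.1-8b.gguf
```

Clients then point any OpenAI-compatible SDK at the proxy and select which backend handles a request via the request's `model` field.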