Tags: #local-llm

Technical Tutorial
ollama
2.4k

datawhalechina/handy-ollama

A comprehensive guide to deploying large language models (LLMs) locally on CPU using Ollama, making advanced AI accessible without powerful GPUs.

Native macOS AI Assistant
llama.cpp
3.2k

johnbean393/Sidekick

A native macOS app that lets users chat with a local LLM that can draw on information from files, folders, and websites on their Mac, keeping everything private and fully offline.


© 2026 OSS Alternative. hotgithub.com - All rights reserved.