AI/ML Library · 3.6k stars · 2026-04-18

SciSharp/LLamaSharp

A cross-platform C#/.NET library for efficient local inference of large language models (LLMs) such as LLaMA and LLaVA.

Core Features

Efficient local LLM inference on CPU and GPU (CUDA, Vulkan)
Support for LLaMA and multimodal LLaVA models
Higher-level APIs for easy LLM deployment in .NET applications
Built-in Retrieval-Augmented Generation (RAG) support
Integrations with semantic-kernel, kernel-memory, BotSharp, and LangChain

Quick Start

dotnet add package LLamaSharp
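A minimal inference sketch of what a quick start might look like. This assumes a local GGUF model file (the path below is a placeholder) and that a native backend package such as LLamaSharp.Backend.Cpu has been installed alongside the main package; the type and member names follow LLamaSharp's executor-style API:

```csharp
using LLama;
using LLama.Common;

// Placeholder path: point this at a GGUF model you have downloaded locally.
string modelPath = "path/to/model.gguf";

var parameters = new ModelParams(modelPath)
{
    ContextSize = 1024,
    GpuLayerCount = 0 // CPU-only; raise this to offload layers to a GPU backend
};

// Load the weights once, then create an inference context from them.
using var model = LLamaWeights.LoadFromFile(parameters);
using var context = model.CreateContext(parameters);
var executor = new InteractiveExecutor(context);

var inferenceParams = new InferenceParams
{
    MaxTokens = 128,
    AntiPrompts = new List<string> { "User:" } // stop when the model starts a new user turn
};

// Stream generated tokens to the console as they are produced.
await foreach (var text in executor.InferAsync("User: Hello. Assistant:", inferenceParams))
{
    Console.Write(text);
}
```

Because inference runs entirely in-process against the local model file, no network access or API key is involved; swapping GpuLayerCount and the backend package is enough to move the same code from CPU to GPU.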

Detailed Introduction

LLamaSharp is a C#/.NET library that brings large language models (LLMs) such as LLaMA and the multimodal LLaVA directly to your local devices. Built on the highly optimized llama.cpp project, it delivers efficient inference on both CPU and a range of GPU architectures (CUDA, Vulkan). The library exposes developer-friendly, higher-level APIs that simplify integrating and deploying LLMs in .NET applications. With built-in Retrieval-Augmented Generation (RAG) support and integrations with popular AI frameworks such as semantic-kernel and kernel-memory, LLamaSharp lets developers build intelligent applications that run LLMs entirely locally, improving privacy and reducing reliance on cloud services.


© 2026 OSS Alternative. hotgithub.com - All rights reserved.