
adapter-hub/adapters

A unified library for parameter-efficient and modular transfer learning, extending HuggingFace Transformers with various adapter methods.

Core Features

Integrates 10+ adapter methods into 20+ state-of-the-art Transformer models.
Provides a unified interface for efficient fine-tuning and modular transfer learning.
Supports full-precision and quantized training methods (e.g., Q-LoRA).
Enables advanced adapter composition via merging and composition blocks (see the sketch after this list).
Seamlessly integrates with HuggingFace's Transformers library.
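
As an illustration of composition blocks, the sketch below stacks two adapters so that the output of one feeds into the other. This is a minimal sketch assuming the library's documented composition API; the adapter names "a" and "b" are illustrative:

import adapters.composition as ac
from adapters import AutoAdapterModel

# Load a supported model and register two adapters (default config)
model = AutoAdapterModel.from_pretrained("roberta-base")
model.add_adapter("a")
model.add_adapter("b")

# Activate both adapters in sequence: the output of "a" is fed into "b"
model.set_active_adapters(ac.Stack("a", "b"))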

Quick Start

pip install -U adapters
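
After installation, an adapter can be added to any supported pretrained model in a few lines. This is a minimal sketch using the library's documented AutoAdapterModel interface; the checkpoint, adapter name, and "seq_bn" (sequential bottleneck) config are illustrative choices:

from adapters import AutoAdapterModel

# Load a Transformers checkpoint with adapter support enabled
model = AutoAdapterModel.from_pretrained("roberta-base")

# Add a new bottleneck adapter plus a task head, then freeze the
# base model and activate the adapter for training
model.add_adapter("my_adapter", config="seq_bn")
model.add_classification_head("my_adapter", num_labels=2)
model.train_adapter("my_adapter")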

Detailed Introduction

Adapters is an add-on library for HuggingFace's Transformers that streamlines parameter-efficient and modular transfer learning in NLP. It integrates a wide array of adapter methods into numerous Transformer models and offers a unified interface for efficient fine-tuning and advanced adapter composition. By supporting both full-precision and quantized training, it lets researchers and developers apply current transfer-learning techniques with minimal coding overhead, making model adaptation more accessible and resource-efficient.
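
To illustrate the quantized training path, the sketch below attaches a LoRA adapter to a model loaded in 4-bit precision, in the style of Q-LoRA. The checkpoint name and quantization settings are illustrative assumptions; adapters.init() is the library's entry point for retrofitting an existing Transformers model with adapter support:

import torch
import adapters
from adapters import LoRAConfig
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Load the base model in 4-bit precision (requires bitsandbytes)
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",  # illustrative checkpoint
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_compute_dtype=torch.bfloat16,
    ),
)

# Retrofit adapter support, then add and activate a LoRA adapter;
# only the LoRA weights are trained while the quantized base stays frozen
adapters.init(model)
model.add_adapter("qlora", config=LoRAConfig(r=8, alpha=16))
model.train_adapter("qlora")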
