AI/NLP Model Framework
3.1k 2026-04-18

dbiir/UER-py

UER-py is an open-source PyTorch-based framework for pre-training and fine-tuning NLP models, offering modularity, extensibility, and a comprehensive model zoo.

Core Features

Reproduces existing pre-training models (BERT, GPT-2, ELMo, T5) with matching results.
Modular architecture: models are assembled from interchangeable embedding, encoder, and target components.
Supports CPU, single-GPU, and distributed multi-GPU training.
Provides a rich model zoo of pre-trained NLP models.
Includes utilities for pre-training, feature extraction, and text generation.

Detailed Introduction

UER-py (Universal Encoder Representations) is a robust open-source toolkit built on PyTorch, designed to streamline the pre-training of NLP models on general-domain corpora and their subsequent fine-tuning on downstream tasks. It emphasizes modularity and extensibility: researchers and developers can freely combine embedding, encoder, and target components to build custom pre-training models, or draw on its extensive model zoo of ready-made weights. The framework reproduces the results of leading models such as BERT and T5, supports CPU, single-GPU, and distributed training, and provides the essential building blocks for pre-training, feature extraction, and text generation, making it a valuable resource for NLP research and applications.
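The embedding/encoder/target decomposition described above can be sketched in plain PyTorch. This is a minimal illustration of the design idea only; the class names, constructor signatures, and the `PretrainModel` composition below are hypothetical and do not reflect UER-py's actual API.

```python
import torch
import torch.nn as nn

# Illustrative only: names and signatures are hypothetical, not UER-py's API.

class WordEmbedding(nn.Module):
    """Embedding component: maps token ids to hidden vectors."""
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, hidden_size)

    def forward(self, tokens):
        return self.embedding(tokens)

class TransformerEncoder(nn.Module):
    """Encoder component: contextualizes the embedded sequence."""
    def __init__(self, hidden_size, num_layers=2, num_heads=4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=hidden_size, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, hidden):
        return self.encoder(hidden)

class MlmTarget(nn.Module):
    """Target component: a masked-language-model head over the vocabulary."""
    def __init__(self, hidden_size, vocab_size):
        super().__init__()
        self.output = nn.Linear(hidden_size, vocab_size)

    def forward(self, hidden):
        return self.output(hidden)

class PretrainModel(nn.Module):
    """Composes any embedding, encoder, and target into one model."""
    def __init__(self, embedding, encoder, target):
        super().__init__()
        self.embedding = embedding
        self.encoder = encoder
        self.target = target

    def forward(self, tokens):
        return self.target(self.encoder(self.embedding(tokens)))

# Swap any component (e.g. an LSTM encoder, a language-model target)
# without touching the rest of the pipeline.
model = PretrainModel(WordEmbedding(1000, 64), TransformerEncoder(64), MlmTarget(64, 1000))
tokens = torch.randint(0, 1000, (2, 16))   # batch of 2 sequences, length 16
logits = model(tokens)
print(logits.shape)                        # torch.Size([2, 16, 1000])
```

Because each component exposes the same tensor-in/tensor-out interface, building a different pre-training model is a matter of passing different components to the composer, which is the extensibility the framework advertises.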


© 2026 OSS Alternative. hotgithub.com - All rights reserved.