ymcui/Chinese-LLaMA-Alpaca
An open-source project providing Chinese LLaMA and Alpaca large language models, enhanced with Chinese vocabulary and data for improved understanding and local deployment on CPU/GPU.
Detailed Introduction
Chinese-LLaMA-Alpaca is an open-source project that brings LLaMA-family large language models to the Chinese NLP community. It extends the original LLaMA tokenizer with additional Chinese tokens and performs secondary pre-training on Chinese corpora, improving both the base model's Chinese semantic understanding and its encoding efficiency on Chinese text. On top of these Chinese LLaMA base models, the project releases instruction-tuned Chinese Alpaca models that follow Chinese-language instructions. It also provides scripts for training and fine-tuning, along with support for quantization and local deployment on consumer CPU/GPU hardware, making the models usable for research and application without server-grade resources.
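To see why vocabulary expansion matters, consider how a tokenizer without Chinese entries handles Chinese text. The sketch below is a toy greedy tokenizer, not the project's actual SentencePiece merge pipeline: with no Chinese tokens in the vocabulary, each character falls back to one token per UTF-8 byte, so sequences become roughly three times longer than with whole-word Chinese tokens.

```python
# Toy illustration of vocabulary expansion (hypothetical example, not the
# project's real SentencePiece scripts): compare a base vocabulary with no
# Chinese entries against one expanded with Chinese tokens.

def tokenize(text, vocab):
    """Greedy longest-match tokenizer. Characters with no vocabulary
    match fall back to one token per UTF-8 byte, mimicking LLaMA's
    byte-level fallback for out-of-vocabulary characters."""
    tokens = []
    i = 0
    while i < len(text):
        match = None
        # Try the longest substring starting at i first.
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:
                match = text[i:j]
                break
        if match:
            tokens.append(match)
            i += len(match)
        else:
            # Byte fallback: one token per UTF-8 byte of this character.
            tokens.extend(f"<0x{b:02X}>" for b in text[i].encode("utf-8"))
            i += 1
    return tokens

base_vocab = {"hello", "world"}                # no Chinese entries
expanded_vocab = base_vocab | {"你好", "世界"}  # Chinese tokens added

text = "你好世界"
print(len(tokenize(text, base_vocab)))       # 12 tokens: 3 bytes per character
print(len(tokenize(text, expanded_vocab)))   # 2 tokens
```

Shorter sequences mean more effective context per prompt and faster generation, which is one reason the project pairs vocabulary expansion with secondary pre-training rather than relying on byte fallback alone.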