This project (Chinese-LLaMA-Alpaca-2) has open-sourced the Chinese LLaMA model and the instruction-fine-tuned Chinese Alpaca model to further promote open research on large models in the Chinese NLP community. Based on the original LLaMA, these models expand the Chinese vocabulary and undergo secondary pre-training on Chinese data, which further improves basic Chinese semantic understanding. The Chinese Alpaca model is additionally fine-tuned on Chinese instruction data, which significantly improves its ability to understand and follow instructions.

Features

  • Expands the Chinese vocabulary of the original LLaMA model, improving the efficiency of Chinese encoding and decoding
  • Open-sources the Chinese LLaMA model pre-trained on Chinese text data and the Chinese Alpaca model fine-tuned on Chinese instruction data
  • Open-sources the pre-training and instruction fine-tuning scripts, so users can further train the models as needed
  • Supports quantizing and deploying the large models locally on a laptop (personal PC) CPU/GPU
  • Supports transformers, llama.cpp, text-generation-webui, LlamaChat, LangChain, privateGPT, and other ecosystem tools
  • Currently open-sourced model versions: 7B (Basic, Plus, Pro), 13B (Basic, Plus, Pro), 33B (Basic, Plus, Pro)
  • Provides a generation performance evaluation
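
The vocabulary expansion in the first feature is what drives the encoding-efficiency gain: the original LLaMA tokenizer covers few Chinese characters, so most of them fall back to raw UTF-8 bytes (three tokens per character), while an expanded vocabulary can encode each character as a single token. A minimal sketch of the difference, using byte counts as a stand-in for tokens (the numbers are illustrative, not measured from the actual tokenizers):

```python
# Sketch: why expanding the vocabulary helps Chinese encoding efficiency.
# A byte-fallback tokenizer spends roughly one token per UTF-8 byte for
# uncovered characters; an expanded vocabulary spends ~one token per
# character. Counts here are illustrative, not from the real tokenizers.

text = "你好，世界"  # "Hello, world" — 5 characters

# Byte-fallback cost: each CJK character is 3 UTF-8 bytes.
byte_fallback_tokens = len(text.encode("utf-8"))

# Expanded-vocabulary cost: roughly one token per character.
expanded_vocab_tokens = len(text)

print(byte_fallback_tokens, expanded_vocab_tokens)  # prints "15 5"
```

In practice the real gain also comes from multi-character Chinese words being merged into single tokens, so measured compression ratios differ from this character-level sketch.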

License

Apache License V2.0




Additional Project Details

Programming Language

Python

Related Categories

Python Large Language Models (LLM), Python AI Models

Registered

2023-08-21