# Vicuna-7B

LLaMA weights (llama-7b-hf) + Vicuna delta weights (vicuna-7b-delta-v1.1) = Vicuna-7B
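If you start from the base LLaMA weights and the delta rather than a pre-merged checkpoint, the merge can be done with FastChat's `fastchat.model.apply_delta` tool. A sketch, assuming FastChat is installed and the local paths below are replaced with your own:

```shell
# Merge the base LLaMA-7B weights with the Vicuna v1.1 delta.
# /path/to/llama-7b-hf and /path/to/Vicuna-7B are placeholders.
python3 -m fastchat.model.apply_delta \
    --base-model-path /path/to/llama-7b-hf \
    --target-model-path /path/to/Vicuna-7B \
    --delta-path lmsys/vicuna-7b-delta-v1.1
```

The merged weights written to `--target-model-path` can then be served directly.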

## How to

```shell
# Install FastChat and the latest transformers
pip3 install fschat
pip3 install git+https://github.com/huggingface/transformers

# Download the merged Vicuna-7B weights (requires Git LFS)
sudo apt install git git-lfs
git lfs install
git clone https://huggingface.co/myaniu/Vicuna-7B

# Chat with the model from the command line
python3 -m fastchat.serve.cli --model-path /path/to/Vicuna-7B
```