## Introduction
This repo mainly provides sample models for each version of LLamaSharp. The models can also be used with llama.cpp or other engines.
Since llama.cpp frequently introduces breaking changes, users (of LLamaSharp and other engines) often spend a lot of time finding a model that their version can run. This repo aims to make that easier.
## Models
- [x] LLaMa 7B / 13B
- [ ] Alpaca
- [ ] GPT4All
- [ ] Chinese LLaMA / Alpaca
- [ ] Vigogne (French)
- [ ] Vicuna
- [ ] Koala
- [ ] OpenBuddy 🐶 (Multilingual)
- [ ] Pygmalion 7B / Metharme 7B
- [x] WizardLM (refer to https://huggingface.co/TheBloke/wizardLM-7B-GGML)
We would appreciate it if you could provide some information about the incomplete models (such as links, model sources, etc.).
## Usage
First, choose the branch whose name matches your LLamaSharp backend version. For example, if you're using LLamaSharp.Backend.Cuda11 v0.3.0, use the v0.3.0 branch of this repo.
Then download a model you like and follow the LLamaSharp instructions to run it.
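The branch-naming convention above is simply the backend version prefixed with `v`. As a minimal illustration (this is not an official LLamaSharp utility, just a sketch of the convention), the mapping can be written as:

```python
def branch_for_backend_version(version: str) -> str:
    """Return the repo branch matching a LLamaSharp backend version.

    The branch name is the version string prefixed with "v";
    a version that already carries the prefix is returned unchanged.
    """
    version = version.strip()
    return version if version.startswith("v") else f"v{version}"


if __name__ == "__main__":
    # E.g. LLamaSharp.Backend.Cuda11 v0.3.0 -> branch "v0.3.0"
    print(branch_for_backend_version("0.3.0"))
```

So for LLamaSharp.Backend.Cpu v0.4.0 you would check out the `v0.4.0` branch before downloading a model.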
## Contributing
Any kind of contribution is welcome! It's not necessary to upload a model; providing some information can also help a lot. For example, if you know where to download the pth file of Vicuna, please tell us via the community and we'll add it to the list!