Original description

https://wandb.ai/open-assistant/supervised-finetuning/runs/i9gmn0dt

Trained with residual dropout of 0.1.

What is this?

This is https://huggingface.co/dvruette/llama-13b-pretrained-dropout quantized to int4 with a groupsize of 128.

Run it in text-generation-webui with --wbits 4 and --groupsize 128.
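For reference, a launch command along these lines should work, assuming the quantized weights have been placed in a folder under text-generation-webui's models/ directory; the folder name below is only a placeholder, not the actual repository name.

```
# Minimal sketch: launch text-generation-webui with 4-bit GPTQ settings
# (--wbits 4, --groupsize 128, as stated above). Replace the --model value
# with the directory name you used under models/.
python server.py --model llama-13b-pretrained-dropout-int4-128g --wbits 4 --groupsize 128
```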