Cabra: a Portuguese instruction-finetuned Open-LLaMA

A LoRA adapter created with the procedure detailed in the GitHub repository: https://github.com/gustrd/cabra .

Training was run for 2 epochs on two T4 GPUs at Kaggle.
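A finetune of this kind is typically driven by a PEFT LoRA configuration. The sketch below is illustrative only: the hyperparameter values (rank, alpha, dropout, target modules) are assumptions, not the ones actually used for Cabra; the repository linked above has the real procedure.

```python
# Hedged sketch of a PEFT LoRA config for causal-LM finetuning.
# All hyperparameter values here are illustrative placeholders,
# not the actual Cabra settings.
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,                                  # assumed LoRA rank
    lora_alpha=32,                         # assumed scaling factor
    lora_dropout=0.05,                     # assumed dropout on LoRA layers
    target_modules=["q_proj", "v_proj"],   # assumed attention projections
    task_type="CAUSAL_LM",
)
```

Such a config is passed to `peft.get_peft_model` together with the base Open-LLaMA model before training begins.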

Training procedure

A bitsandbytes quantization config was used during training (the exact values are listed in the linked repository).
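As an illustration, a typical 4-bit QLoRA-style bitsandbytes config built with the `transformers` API looks like the following; the specific values are assumptions, not necessarily those used for Cabra.

```python
# Hedged sketch of a 4-bit bitsandbytes quantization config.
# Values are illustrative, not the actual Cabra configuration.
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize base weights to 4 bits
    bnb_4bit_quant_type="nf4",              # assumed NF4 quantization
    bnb_4bit_compute_dtype=torch.float16,   # assumed compute dtype
    bnb_4bit_use_double_quant=True,         # assumed nested quantization
)
```

The config is passed as `quantization_config=bnb_config` to `AutoModelForCausalLM.from_pretrained` when loading the base model.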

Framework versions