Cabra: a Portuguese instruction-finetuned model for commercial use

LoRA adapter created with the procedures detailed in the GitHub repository: https://github.com/gustrd/cabra .

Training ran for 1 epoch on a P100 GPU at Kaggle, took around 11 hours, and used a random slice of the dataset.

This LoRA adapter was created following the procedure below.

Training procedure
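The full procedure lives in the repository linked above and is not reproduced here. As background, a LoRA adapter learns a low-rank update to otherwise frozen weights, W' = W + (alpha / r) * (B @ A). A minimal pure-Python sketch (illustrative only; the actual training uses the stack described in the repository, and all names and values below are hypothetical):

```python
# Illustrative sketch of merging a LoRA adapter into a frozen weight matrix.
# All matrices and values here are hypothetical toy data.

def matmul(X, Y):
    """Naive matrix multiply for small illustrative matrices."""
    return [[sum(X[i][t] * Y[t][j] for t in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def apply_lora(W, A, B, alpha, r):
    """Return W + (alpha / r) * (B @ A), the merged weight matrix."""
    scale = alpha / r
    delta = matmul(B, A)  # low-rank update, same shape as W
    return [[W[i][j] + scale * delta[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

# Tiny example: 2x2 weight, rank r = 1.
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[1.0], [2.0]]   # shape d x r
A = [[0.5, 0.5]]     # shape r x k
merged = apply_lora(W, A, B, alpha=1.0, r=1)
```

Only A and B are trained, so the adapter is far smaller than the base model, which is what makes single-GPU runs like the one above feasible.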

A bitsandbytes quantization config was used during training; see the repository for the exact values.
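As an illustration only, such a config typically records fields like the following. These are hypothetical placeholder values, not the ones used for this adapter:

```python
# Hypothetical example of the fields a bitsandbytes quantization config
# usually records in a model card; the real values for this adapter are
# in the linked repository, not here.
quantization_config = {
    "load_in_8bit": True,                       # quantize weights to int8
    "llm_int8_threshold": 6.0,                  # outlier threshold
    "llm_int8_skip_modules": None,              # modules kept in full precision
    "llm_int8_enable_fp32_cpu_offload": False,  # offload fp32 parts to CPU
}
```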

Framework versions