Cabra: A Portuguese instruction-tuned Open-LLaMA

This LoRA adapter was created with the procedure detailed in the GitHub repository: https://github.com/gustrd/cabra .
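
For context, a minimal sketch of loading the adapter with PEFT. The base-model id and adapter path below are illustrative assumptions, not values confirmed by this card; substitute the actual Open-LLaMA checkpoint and the Cabra adapter weights.

```python
# Minimal sketch of loading a LoRA adapter with PEFT.
# Both identifiers are assumptions for illustration only.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

base_id = "openlm-research/open_llama_7b"  # assumed base checkpoint
model = LlamaForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
tokenizer = LlamaTokenizer.from_pretrained(base_id)
model = PeftModel.from_pretrained(model, "./cabra-lora")  # placeholder adapter path

prompt = "Explique em poucas palavras o que é um adaptador LoRA."
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```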

Training ran for 2 epochs on a single A4000 GPU at Paperspace.

The GGML version was created with llama.cpp's "convert-lora-to-ggml.py" script.
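
As a sketch, that conversion is a single invocation of the script on the adapter directory; the path below is a placeholder, and the script's interface may differ across llama.cpp versions.

```sh
# Hypothetical adapter directory containing adapter_config.json and
# adapter_model.bin produced by PEFT; the script writes a GGML adapter file.
python convert-lora-to-ggml.py ./cabra-lora
```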

Training procedure

The adapter was trained with the PEFT library using a bitsandbytes quantization config; the exact settings are part of the procedure detailed in the GitHub repository above.
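
Purely as an illustration of what such a config looks like, here is a sketch with common QLoRA-style settings; these are not the recorded values for Cabra.

```python
# Hypothetical bitsandbytes quantization config of the kind used for
# QLoRA-style training; NOT necessarily the values used for Cabra.
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize base weights to 4-bit
    bnb_4bit_quant_type="nf4",              # NormalFloat4 quantization
    bnb_4bit_use_double_quant=True,         # quantize the quantization constants too
    bnb_4bit_compute_dtype=torch.bfloat16,  # dtype used for matmul compute
)
```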

Framework versions