
bart-base-instruct: dolly_hhrlhf

<a href="https://colab.research.google.com/gist/pszemraj/a0c0a8cc24abfbf609f75f9d5c56c348/bart-base-instruct-example.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a>

This model is a fine-tuned version of facebook/bart-base on the pszemraj/dolly_hhrlhf-text2text dataset.

Model description

A text2text model fine-tuned for instruction-following generation on a modified version of the relatively permissive mosaicml/dolly_hhrlhf dataset.
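
If you want to inspect the training data yourself, here is a minimal sketch using the datasets library. Only the dataset ID comes from this card; the split name in the second print is an assumption, so check the printed dataset structure for the actual splits and columns.

# pip install -q datasets
from datasets import load_dataset

# load the modified text2text version of dolly_hhrlhf used for fine-tuning
ds = load_dataset("pszemraj/dolly_hhrlhf-text2text")

print(ds)              # shows the available splits and column names
print(ds["train"][0])  # assumes a "train" split exists; adjust if the printout differs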

Basic usage in Python:

# pip install -q transformers accelerate
from transformers import pipeline, GenerationConfig

model_name = "pszemraj/bart-base-instruct-dolly_hhrlhf"
assistant = pipeline(
    "text2text-generation",
    model_name,
    device_map="auto"
)
cfg = GenerationConfig.from_pretrained(model_name)

# pass an 'instruction' as the prompt to the pipeline
prompt = "Write a guide on how to become a ninja while working a 9-5 job."
result = assistant(prompt, generation_config=cfg)[0]["generated_text"]
print(result)

Using the saved generation config is optional; you can substitute other generation parameters instead, as in the sketch below.
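
For example, you can drop the generation config and pass generate-style keyword arguments directly to the pipeline call. The specific values below are illustrative, not tuned settings from this model card.

# illustrative generation parameters (not tuned); adjust to taste
result = assistant(
    prompt,
    max_new_tokens=192,
    num_beams=4,
    no_repeat_ngram_size=3,
    early_stopping=True,
)[0]["generated_text"]
print(result)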

Intended uses & limitations

Training procedure

Training hyperparameters

The following hyperparameters were used during training: