
flan-t5-base-instruct: dolly_hhrlhf

<a href="https://colab.research.google.com/gist/pszemraj/6ca2b0adc89f6a001a9ba7bcd4300e85/flan-t5-base-instruct-example.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a>

This model is a fine-tuned version of google/flan-t5-base on the pszemraj/dolly_hhrlhf-text2text dataset.

Model description

This is a text2text model fine-tuned for instruction following on pszemraj/dolly_hhrlhf-text2text, a modified version of the relatively permissive mosaicml/dolly_hhrlhf dataset.

Basic usage in Python:

# pip install -q transformers accelerate
import torch
from transformers import pipeline, GenerationConfig

model_name = "pszemraj/flan-t5-base-instruct-dolly_hhrlhf"
assistant = pipeline(
    "text2text-generation",
    model_name,
    device=0 if torch.cuda.is_available() else -1,
)
cfg = GenerationConfig.from_pretrained(model_name)

# pass an 'instruction' as the prompt to the pipeline
prompt = "Write a guide on how to become a ninja while working a 9-5 job."
result = assistant(prompt, generation_config=cfg)[0]["generated_text"]
print(result)

* Using the generation config is optional; you can substitute other generation parameters instead.
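For instance, decoding parameters can be passed directly in the pipeline call rather than through a `GenerationConfig`. A minimal sketch (the parameter values below are illustrative, not the model's tuned settings):

```python
# pip install -q transformers accelerate
from transformers import pipeline

model_name = "pszemraj/flan-t5-base-instruct-dolly_hhrlhf"
assistant = pipeline("text2text-generation", model_name)

prompt = "Explain the difference between a list and a tuple in Python."
# decoding parameters passed inline instead of via GenerationConfig;
# these are generic beam-search settings, not values from this model's config
result = assistant(
    prompt,
    max_new_tokens=256,
    num_beams=4,
    no_repeat_ngram_size=3,
    early_stopping=True,
)[0]["generated_text"]
print(result)
```

Any argument accepted by `model.generate()` can be forwarded this way; inline parameters override those loaded from the model's saved generation config.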

Intended uses & limitations

Training procedure

Training hyperparameters

The following hyperparameters were used during training: