

# gpt2_finetuned_10000recipe_chicken

This model is a fine-tuned version of gpt2 on the RecipeNLG dataset (https://github.com/Glorf/recipenlg/tree/main), subset to recipes containing chicken. It achieves the following results on the evaluation set:

- Loss: 1.5802

## Model description

This model is a fine-tuned version of gpt2, trained on 10,000 chicken recipes extracted from the RecipeNLG dataset.
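A fine-tuned GPT-2 checkpoint like this one can be loaded with the `transformers` text-generation pipeline. The sketch below uses the base `gpt2` checkpoint as a stand-in, since the card does not state where the fine-tuned weights are published; substitute the actual checkpoint directory or Hub id to get recipe-style output.

```python
from transformers import pipeline, set_seed

set_seed(42)

# "gpt2" is a stand-in here; replace it with the path or Hub id of the
# fine-tuned checkpoint (not specified in this card) for recipe generation.
generator = pipeline("text-generation", model="gpt2")

result = generator(
    "Recipe: Chicken soup\nIngredients:",
    max_new_tokens=40,
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```

The pipeline returns the prompt followed by the model's continuation; a recipe-tuned checkpoint would continue with an ingredient list and directions.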

## Intended uses & limitations

This model is intended for personal and educational use.

## Training and evaluation data

The model was trained on 10,043 recipes and evaluated on 100 recipes.
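The exact filtering rule used to subset RecipeNLG into chicken recipes is not stated in this card. A minimal sketch of one plausible approach, assuming the RecipeNLG CSV's `title` and `ingredients` columns and a simple case-insensitive keyword match:

```python
import pandas as pd

def filter_chicken(df: pd.DataFrame) -> pd.DataFrame:
    """Keep rows whose title or ingredients mention 'chicken' (case-insensitive)."""
    mask = (
        df["title"].str.contains("chicken", case=False, na=False)
        | df["ingredients"].str.contains("chicken", case=False, na=False)
    )
    return df[mask]

# Toy demonstration; the real RecipeNLG full_dataset.csv has these columns
# among others. The actual subsetting rule for this model is an assumption.
toy = pd.DataFrame({
    "title": ["Chicken Soup", "Beef Stew", "Pasta"],
    "ingredients": ["['1 whole chicken']", "['beef']", "['chicken stock', 'pasta']"],
})
subset = filter_chicken(toy)
print(list(subset["title"]))  # → ['Chicken Soup', 'Pasta']
```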

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.866         | 1.0   | 2511 | 1.7299          |
| 1.5425        | 2.0   | 5022 | 1.6135          |
| 1.3647        | 3.0   | 7533 | 1.5802          |
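Since GPT-2 is trained with a cross-entropy language-modeling objective, the validation loss converts directly to perplexity via `exp(loss)`:

```python
import math

# Final-epoch validation loss from the table above.
val_loss = 1.5802

# Perplexity is the exponential of the cross-entropy loss.
perplexity = math.exp(val_loss)
print(round(perplexity, 2))  # → 4.86
```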

### Framework versions

## Reference

```bibtex
@inproceedings{bien-etal-2020-recipenlg,
    title = "{R}ecipe{NLG}: A Cooking Recipes Dataset for Semi-Structured Text Generation",
    author = "Bie{\'n}, Micha{\l} and Gilski, Micha{\l} and Maciejewska, Martyna and Taisner, Wojciech and Wisniewski, Dawid and Lawrynowicz, Agnieszka",
    booktitle = "Proceedings of the 13th International Conference on Natural Language Generation",
    month = dec,
    year = "2020",
    address = "Dublin, Ireland",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.inlg-1.4",
    pages = "22--28",
}
```