
# bart-large-finetuned-arxiv-co-ga-latest

## Model description

This model (v1.0) is a fine-tuned version of facebook/bart-large. It generates a paper title from a given abstract. It was trained on astronomy arXiv papers tagged 'CO' (Cosmology and Nongalactic Astrophysics) and 'GA' (Astrophysics of Galaxies).

Code for this project can be found on GitHub.

👉🏽 Feel free to interact with the model here and use it to generate a title given your abstract! 👈🏽
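For programmatic use, the sketch below shows one way to generate a title from an abstract with the 🤗 Transformers summarization pipeline. The model id string and the generation settings (beam search, length limits) are placeholders for illustration; substitute the full Hub id of this checkpoint.

```python
# Minimal sketch: title generation with the transformers summarization pipeline.
# The model id below is a placeholder -- replace it with the full Hub id
# (namespace/bart-large-finetuned-arxiv-co-ga-latest) of this checkpoint.
from transformers import pipeline

title_generator = pipeline(
    "summarization",
    model="bart-large-finetuned-arxiv-co-ga-latest",  # assumed/placeholder id
)

abstract = (
    "We investigate the star formation histories of low-mass galaxies in the "
    "local Universe using a large spectroscopic sample..."
)

# Generation settings are illustrative, not necessarily those used for evaluation.
result = title_generator(abstract, max_length=32, min_length=5, num_beams=4)
print(result[0]["summary_text"])
```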

<!-- ## Intended uses & limitations

More information needed -->

## Training and evaluation data

The dataset consists of abstract+title pairs from arXiv and was obtained from Kaggle. Training was performed on 79,727 abstract+title pairs and validation on 9,966 abstract+title pairs.
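As a rough illustration, filtering the Kaggle arXiv metadata snapshot down to CO/GA abstract+title pairs might look like the sketch below. The file name, field names, and category strings follow the public Kaggle arXiv dataset; the actual preprocessing for this model lives in the project's GitHub repository and may differ.

```python
# Hedged sketch: building abstract+title pairs from the Kaggle arXiv metadata
# snapshot. Field names and category codes follow the public dataset; the
# project's own preprocessing may differ.
import json

pairs = []
with open("arxiv-metadata-oai-snapshot.json") as f:
    for line in f:
        paper = json.loads(line)
        categories = paper.get("categories", "").split()
        # Keep Cosmology (astro-ph.CO) and Galaxies (astro-ph.GA) papers.
        if "astro-ph.CO" in categories or "astro-ph.GA" in categories:
            pairs.append(
                {"abstract": paper["abstract"].strip(),
                 "title": paper["title"].strip()}
            )

print(f"Collected {len(pairs)} abstract+title pairs")
```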

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:------:|:---------:|
| 1.7752        | 1.0   | 9966 | 1.7190          | 43.8916 | 23.6296 | 38.229 | 39.3519   |
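For reference, ROUGE scores like the ones in the table above can be computed with the 🤗 Evaluate library; the predictions and references in the sketch below are placeholders, not actual model outputs.

```python
# Minimal sketch: computing ROUGE-1/2/L/Lsum with the evaluate library.
# Predictions and references here are placeholders, not outputs of this model.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["Star formation histories of low-mass galaxies"]
references = ["The star formation histories of low-mass galaxies in the local Universe"]

scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # {'rouge1': ..., 'rouge2': ..., 'rougeL': ..., 'rougeLsum': ...}
```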

### Framework versions