---
tags:
- summarization
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# mt5-small-finetuned-13f-reports

This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on a dataset of quarterly 13F reports. It achieves the evaluation-set results shown in the Training results table below.

## Model description

More information needed

## Intended uses & limitations

The model was fine-tuned on a dataset of 1,000+ quarterly 13F reports (SEC filings that disclose institutional investment holdings). It is intended for automating the generation of article summaries before publication, so a TL;DR summary can be produced without writing one by hand.

**Note:** the Hugging Face hosted Inference API widget uses the default generation parameters and therefore outputs only about 20 words of text. To get a full summary, call the Inference API directly and pass a longer generation limit, e.g. `max_length=120`.
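A minimal sketch of such a direct call, assuming the model is hosted under the ID `mt5-small-finetuned-13f-reports` and that an API token is available in the `HF_API_TOKEN` environment variable (both are placeholders; adjust to your deployment):

```python
# Hypothetical model ID; replace with the actual repo path on the Hub.
MODEL_ID = "mt5-small-finetuned-13f-reports"
API_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"

def build_payload(text: str, max_length: int = 120) -> dict:
    """Build the JSON body for a summarization request, overriding the
    default generation length so the summary is not cut off at ~20 words."""
    return {
        "inputs": text,
        "parameters": {"max_length": max_length},
    }

# Example request (requires the `requests` package and an API token):
#
#   import os, requests
#   headers = {"Authorization": f"Bearer {os.environ['HF_API_TOKEN']}"}
#   resp = requests.post(API_URL, headers=headers,
#                        json=build_payload(report_text))
#   print(resp.json()[0]["summary_text"])
```

The key point is the `parameters` object: without `max_length`, the hosted endpoint falls back to its short default summary length.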

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|
| 11.4662       | 1.0   | 126  | 2.9329          | 0.2023 | 0.0998 | 0.1717 | 0.1792    |
| 3.4401        | 2.0   | 252  | 1.9914          | 0.3142 | 0.2573 | 0.3015 | 0.3036    |
| 2.5139        | 3.0   | 378  | 1.7493          | 0.3131 | 0.2576 | 0.3022 | 0.3039    |
| 2.152         | 4.0   | 504  | 1.6465          | 0.3114 | 0.2564 | 0.3009 | 0.3024    |
| 1.9624        | 5.0   | 630  | 1.5607          | 0.3202 | 0.2695 | 0.3114 | 0.3127    |
| 1.851         | 6.0   | 756  | 1.5163          | 0.3205 | 0.2704 | 0.3101 | 0.311     |
| 1.8002        | 7.0   | 882  | 1.4848          | 0.3225 | 0.2718 | 0.3148 | 0.3161    |
| 1.7864        | 8.0   | 1008 | 1.4818          | 0.3235 | 0.2725 | 0.3146 | 0.3161    |

### Framework versions