# LucaReggiani/t5-small-11nlpfinalproject11-xsum
This model is a fine-tuned version of t5-small on an unknown dataset. It achieves the following results after the final training epoch (a short usage sketch follows this list):
- Train Loss: 3.0366
- Validation Loss: 2.9572
- Train Rouge1: 23.0678
- Train Rouge2: 4.8820
- Train RougeL: 18.2146
- Train RougeLsum: 18.0961
- Train Gen Len: 18.73
- Epoch: 9
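A minimal usage sketch for loading the TensorFlow weights from this repository; the example text and generation settings are illustrative assumptions, not part of the original card:

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "LucaReggiani/t5-small-11nlpfinalproject11-xsum"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical input; replace with the document you want to summarize.
article = "The quick brown fox jumped over the lazy dog. " * 10

# T5 checkpoints conventionally expect a task prefix for summarization.
inputs = tokenizer(
    "summarize: " + article,
    return_tensors="tf",
    truncation=True,
    max_length=512,
)
summary_ids = model.generate(inputs["input_ids"], max_new_tokens=60, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```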
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reconstruction sketch follows the list):
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 5e-05, 'beta_1': 0.9, 'beta_2': 0.98, 'epsilon': 1e-06, 'amsgrad': False}
- training_precision: float32
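As a minimal sketch, the optimizer dictionary above maps onto a standard Keras Adam configuration; the `model.compile` call below is an assumption about how the model was set up, not the original training script:

```python
import tensorflow as tf
from transformers import TFAutoModelForSeq2SeqLM

# Adam settings taken from the hyperparameter dictionary above.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=5e-5,
    beta_1=0.9,
    beta_2=0.98,
    epsilon=1e-6,
    jit_compile=True,
)

model = TFAutoModelForSeq2SeqLM.from_pretrained("t5-small")
# transformers TF models can be compiled without an explicit loss;
# the model's internal loss is then used during fit().
model.compile(optimizer=optimizer)
```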
### Training results
| Train Loss | Validation Loss | Train Rouge1 | Train Rouge2 | Train RougeL | Train RougeLsum | Train Gen Len | Epoch |
|:----------|:----------------|:-------------|:-------------|:-------------|:----------------|:--------------|:------|
| 3.8944 | 3.2778 | 18.4040 | 2.9447 | 14.7588 | 14.8854 | 18.71 | 0 |
| 3.5118 | 3.1285 | 21.0571 | 4.0329 | 16.5313 | 16.5872 | 18.17 | 1 |
| 3.3821 | 3.0720 | 21.2823 | 4.1817 | 16.3643 | 16.3809 | 18.38 | 2 |
| 3.3099 | 3.0368 | 21.3656 | 4.0228 | 16.6094 | 16.5866 | 18.5 | 3 |
| 3.2464 | 3.0117 | 21.6946 | 4.2746 | 16.7999 | 16.7907 | 18.68 | 4 |
| 3.2081 | 2.9932 | 23.3785 | 5.3998 | 18.5529 | 18.5770 | 18.6 | 5 |
| 3.1603 | 2.9809 | 23.2570 | 5.4772 | 18.6532 | 18.6172 | 18.55 | 6 |
| 3.1169 | 2.9719 | 23.0897 | 4.7919 | 18.2567 | 18.1743 | 18.59 | 7 |
| 3.0696 | 2.9681 | 22.5213 | 4.9309 | 17.9595 | 17.8530 | 18.6 | 8 |
| 3.0366 | 2.9572 | 23.0678 | 4.8820 | 18.2146 | 18.0961 | 18.73 | 9 |
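For reference, ROUGE scores of this kind are typically computed with the Hugging Face `evaluate` library; that library is not listed in the framework versions below, so this is a generic sketch of the usual approach rather than the exact evaluation code used here:

```python
import evaluate

rouge = evaluate.load("rouge")

# Hypothetical predictions and references for illustration only.
predictions = ["the cat sat on the mat"]
references = ["a cat was sitting on the mat"]

scores = rouge.compute(predictions=predictions, references=references)
# Keys mirror the table columns: rouge1, rouge2, rougeL, rougeLsum.
# Values are returned in [0, 1]; the table reports them as percentages.
print({k: round(v * 100, 4) for k, v in scores.items()})
```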
### Framework versions
- Transformers 4.26.1
- TensorFlow 2.11.0
- Datasets 2.10.0
- Tokenizers 0.13.2