# nmt-ted-id-en-lr_1e-2-ep_30-seq_128-bs_64
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on a dataset that is not named in this card (the model name suggests an Indonesian-to-English TED Talks corpus). It achieves the following results on the evaluation set (a metric-computation sketch follows the list):
- Loss: 2.0751
- Bleu: 16.4354
- Gen Len: 16.3492
- Meteor: 0.3448
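
The Bleu, Meteor, and Gen Len values above can be recomputed with the Hugging Face `evaluate` library. The snippet below is a minimal sketch: the example sentences are placeholders (the actual evaluation split is not documented here), and the word-level length average is only a rough stand-in for the Trainer's token-level Gen Len.

```python
import evaluate

# Placeholder decoded outputs and reference translations; the real
# evaluation split is not documented in this model card.
predictions = ["i am happy to meet you"]
references = [["i am glad to meet you"]]

bleu = evaluate.load("sacrebleu")   # corpus-level BLEU
meteor = evaluate.load("meteor")    # METEOR

bleu_score = bleu.compute(predictions=predictions, references=references)["score"]
meteor_score = meteor.compute(
    predictions=predictions,
    references=[refs[0] for refs in references],  # one reference string per prediction
)["meteor"]

# Rough word-level proxy; the Trainer reports the mean generated token length.
gen_len = sum(len(p.split()) for p in predictions) / len(predictions)

print(f"BLEU: {bleu_score:.4f}  METEOR: {meteor_score:.4f}  Gen Len: {gen_len:.4f}")
```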

## Model description
More information needed

## Intended uses & limitations
More information needed
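
A minimal inference sketch is shown below. The repository id is a placeholder, and the T5-style "translate Indonesian to English:" task prefix is an assumption; the prompt format used during fine-tuning is not documented in this card.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder checkpoint path; replace with the actual repository id.
model_id = "your-username/nmt-ted-id-en-lr_1e-2-ep_30-seq_128-bs_64"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# The task prefix below is an assumption based on the usual T5 convention.
text = "translate Indonesian to English: Saya senang bertemu dengan Anda."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
outputs = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```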

## Training and evaluation data
More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows this list):
- learning_rate: 0.01
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
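
A minimal sketch of how these settings might be expressed with `Seq2SeqTrainingArguments`; the output directory, the per-epoch evaluation strategy, and `predict_with_generate` are assumptions (the per-epoch results table and generation metrics suggest them) rather than values reported above.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="nmt-ted-id-en-lr_1e-2-ep_30-seq_128-bs_64",  # hypothetical output path
    learning_rate=1e-2,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",  # assumption: the results table shows one eval per epoch
    predict_with_generate=True,   # assumption: needed to report Bleu/Meteor/Gen Len during eval
)
```

Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the optimizer default in this Transformers version, so it needs no explicit argument.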

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Bleu    | Gen Len | Meteor |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:------:|
| 3.2366        | 1.0   | 625   | 2.4442          | 7.3157  | 16.9063 | 0.2192 |
| 2.5208        | 2.0   | 1250  | 2.0785          | 11.3311 | 16.0768 | 0.2869 |
| 2.1936        | 3.0   | 1875  | 1.8995          | 12.4756 | 16.4486 | 0.2934 |
| 1.872         | 4.0   | 2500  | 1.8241          | 13.9295 | 16.3092 | 0.3163 |
| 1.7185        | 5.0   | 3125  | 1.7624          | 14.3797 | 16.3602 | 0.3213 |
| 1.6177        | 6.0   | 3750  | 1.7049          | 15.2549 | 16.3835 | 0.3304 |
| 1.5355        | 7.0   | 4375  | 1.7059          | 15.7225 | 16.3599 | 0.3346 |
| 1.388         | 8.0   | 5000  | 1.6864          | 15.4343 | 16.4646 | 0.3308 |
| 1.2741        | 9.0   | 5625  | 1.6899          | 16.2174 | 16.3215 | 0.3428 |
| 1.216         | 10.0  | 6250  | 1.6831          | 16.1891 | 16.2815 | 0.3451 |
| 1.1486        | 11.0  | 6875  | 1.7137          | 16.3811 | 16.3451 | 0.3435 |
| 1.0426        | 12.0  | 7500  | 1.7490          | 16.3482 | 16.3791 | 0.343  |
| 0.9509        | 13.0  | 8125  | 1.7674          | 16.3318 | 16.469  | 0.3436 |
| 0.9072        | 14.0  | 8750  | 1.8084          | 16.4721 | 16.3064 | 0.3452 |
| 0.857         | 15.0  | 9375  | 1.8414          | 16.4244 | 16.3718 | 0.3472 |
| 0.7696        | 16.0  | 10000 | 1.8829          | 16.3755 | 16.3816 | 0.3446 |
| 0.7066        | 17.0  | 10625 | 1.9325          | 16.4635 | 16.3957 | 0.3459 |
| 0.6718        | 18.0  | 11250 | 1.9980          | 16.3287 | 16.3124 | 0.3431 |
| 0.6364        | 19.0  | 11875 | 2.0211          | 16.5732 | 16.3558 | 0.3456 |
| 0.5835        | 20.0  | 12500 | 2.0751          | 16.4354 | 16.3492 | 0.3448 |

### Framework versions
- Transformers 4.20.1
- PyTorch 1.11.0
- Datasets 2.1.0
- Tokenizers 0.12.1