# t5-base-finetuned-it-to-en

This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on the ccmatrix Italian-English dataset. It achieves the following results on the evaluation set:

- Loss: 1.7418
- Bleu: 26.0557
- Gen Len: 25.6033
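
The Bleu figure is presumably a sacrebleu-style corpus BLEU, and Gen Len is the average length of the generated translations. Below is a minimal sketch of recomputing such metrics with the `evaluate` library; the example sentences are hypothetical, and the original script is assumed to measure Gen Len in generated tokens (a whitespace word count is used here as a proxy).

```python
# A minimal sketch of the evaluation metrics reported above.
# The sentences are hypothetical; sacrebleu-style BLEU is assumed.
import evaluate

sacrebleu = evaluate.load("sacrebleu")

predictions = ["The cat is on the table."]    # hypothetical model outputs
references = [["The cat is on the table."]]   # hypothetical gold translations

result = sacrebleu.compute(predictions=predictions, references=references)
gen_len = sum(len(p.split()) for p in predictions) / len(predictions)

print(f"Bleu: {result['score']:.4f}")
print(f"Gen Len: {gen_len:.4f}")
```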
## Model description

As the model name indicates, this is t5-base fine-tuned for Italian-to-English machine translation on parallel sentence pairs from the ccmatrix corpus. No further details were provided by the original authors.

## Intended uses & limitations

The model is intended for translating Italian text into English. Translation quality on input that differs from the web-mined ccmatrix training data (for example, highly technical or very informal text) may be noticeably lower. An inference sketch follows.

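The repository id below is hypothetical, and the `"translate Italian to English: "` task prefix is an assumption (T5 checkpoints are commonly fine-tuned with such a prefix, but the one used for this model is not documented in this card).

```python
# A minimal inference sketch; model_id and the task prefix are assumptions.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "t5-base-finetuned-it-to-en"  # hypothetical Hub repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("translate Italian to English: Il gatto è sul tavolo.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```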
## Training and evaluation data

The model was trained and evaluated on Italian-English sentence pairs from ccmatrix, a corpus of parallel text mined from the web. A loading sketch is shown below.

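The Hub repository id `yhavinga/ccmatrix` and the `"it-en"` config name are assumptions, since the exact dataset id is not recorded in this card.

```python
# A dataset-loading sketch; the repository id and config are assumptions.
from itertools import islice

from datasets import load_dataset

# Streaming avoids downloading the full web-mined corpus up front.
dataset = load_dataset("yhavinga/ccmatrix", "it-en", split="train", streaming=True)

for example in islice(dataset, 3):
    print(example["translation"])  # e.g. {"it": "...", "en": "..."}
```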
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 40
- mixed_precision_training: Native AMP
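
Below is a sketch of the corresponding `Seq2SeqTrainingArguments`, assuming the standard Transformers `Seq2SeqTrainer` was used. The Adam betas and epsilon listed above are the optimizer defaults, so they are not set explicitly; `output_dir` and the evaluation strategy are assumptions.

```python
# A sketch mirroring the hyperparameters above; output_dir and the
# evaluation settings are assumptions, the rest comes from the list.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-finetuned-it-to-en",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=40,
    fp16=True,                     # "Native AMP" mixed precision
    evaluation_strategy="epoch",   # assumption: the table below is per-epoch
    predict_with_generate=True,    # needed to compute Bleu / Gen Len at eval
)
```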
### Training results

Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
---|---|---|---|---|---|
No log | 1.0 | 282 | 2.0555 | 16.8117 | 26.9573 |
2.3228 | 2.0 | 564 | 1.9791 | 18.207 | 26.754 |
2.3228 | 3.0 | 846 | 1.9340 | 19.2206 | 26.6513 |
2.104 | 4.0 | 1128 | 1.8999 | 20.0802 | 26.5473 |
2.104 | 5.0 | 1410 | 1.8741 | 20.9222 | 26.4633 |
1.9952 | 6.0 | 1692 | 1.8511 | 21.3 | 26.4547 |
1.9952 | 7.0 | 1974 | 1.8361 | 21.9444 | 26.5227 |
1.9032 | 8.0 | 2256 | 1.8191 | 22.224 | 26.168 |
1.8342 | 9.0 | 2538 | 1.8074 | 22.7097 | 26.1573 |
1.8342 | 10.0 | 2820 | 1.7972 | 23.0299 | 26.2373 |
1.7718 | 11.0 | 3102 | 1.7898 | 23.5173 | 26.0447 |
1.7718 | 12.0 | 3384 | 1.7833 | 23.7157 | 26.0073 |
1.7268 | 13.0 | 3666 | 1.7785 | 23.8523 | 25.742 |
1.7268 | 14.0 | 3948 | 1.7725 | 23.979 | 25.88 |
1.6822 | 15.0 | 4230 | 1.7686 | 24.2126 | 25.8347 |
1.6386 | 16.0 | 4512 | 1.7639 | 24.4612 | 25.786 |
1.6386 | 17.0 | 4794 | 1.7605 | 24.6716 | 25.828 |
1.6047 | 18.0 | 5076 | 1.7549 | 24.9392 | 25.6493 |
1.6047 | 19.0 | 5358 | 1.7548 | 24.8965 | 25.6527 |
1.5778 | 20.0 | 5640 | 1.7537 | 24.9908 | 25.7827 |
1.5778 | 21.0 | 5922 | 1.7498 | 25.1397 | 25.6707 |
1.5413 | 22.0 | 6204 | 1.7472 | 25.2764 | 25.7373 |
1.5413 | 23.0 | 6486 | 1.7468 | 25.3103 | 25.6927 |
1.5249 | 24.0 | 6768 | 1.7471 | 25.3128 | 25.698 |
1.5052 | 25.0 | 7050 | 1.7449 | 25.4046 | 25.6813 |
1.5052 | 26.0 | 7332 | 1.7444 | 25.5513 | 25.7833 |
1.4825 | 27.0 | 7614 | 1.7448 | 25.4756 | 25.632 |
1.4825 | 28.0 | 7896 | 1.7432 | 25.6046 | 25.658 |
1.4665 | 29.0 | 8178 | 1.7422 | 25.6138 | 25.6907 |
1.4665 | 30.0 | 8460 | 1.7420 | 25.7196 | 25.7 |
1.4508 | 31.0 | 8742 | 1.7420 | 25.8684 | 25.618 |
1.4394 | 32.0 | 9024 | 1.7420 | 25.8188 | 25.6007 |
1.4394 | 33.0 | 9306 | 1.7417 | 25.9295 | 25.6113 |
1.4318 | 34.0 | 9588 | 1.7421 | 25.9842 | 25.614 |
1.4318 | 35.0 | 9870 | 1.7408 | 26.1045 | 25.5933 |
1.4244 | 36.0 | 10152 | 1.7409 | 26.0496 | 25.6327 |
1.4244 | 37.0 | 10434 | 1.7417 | 26.0595 | 25.6347 |
1.4139 | 38.0 | 10716 | 1.7420 | 26.0515 | 25.6047 |
1.4139 | 39.0 | 10998 | 1.7417 | 26.0727 | 25.616 |
1.4135 | 40.0 | 11280 | 1.7418 | 26.0557 | 25.6033 |
### Framework versions

- Transformers 4.22.1
- Pytorch 1.12.1
- Datasets 2.5.1
- Tokenizers 0.11.0