# opus-mt-id-en-ccmatrix-warmup
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-id-en](https://huggingface.co/Helsinki-NLP/opus-mt-id-en) on the CCMatrix dataset. It achieves the following results on the evaluation set:
- Loss: 0.9770
- Bleu: 56.6698
## Model description
A MarianMT encoder–decoder translation model for Indonesian-to-English, obtained by fine-tuning Helsinki-NLP/opus-mt-id-en on the CCMatrix parallel corpus.
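The snippet below is a minimal inference sketch. The repository id is a placeholder, not a confirmed location of this checkpoint; substitute the path where the model is actually hosted.

```python
from transformers import pipeline

# Placeholder repository id; replace with the actual location of this checkpoint.
translator = pipeline("translation", model="your-username/opus-mt-id-en-ccmatrix-warmup")

result = translator("Saya suka membaca buku.")
print(result[0]["translation_text"])  # prints the English translation
```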
## Intended uses & limitations
Intended for Indonesian-to-English machine translation. Domain-specific behaviour and other limitations have not been documented.
## Training and evaluation data
Indonesian–English sentence pairs from the CCMatrix corpus. With a batch size of 32 and 28,125 optimizer steps per epoch, the training split holds roughly 900,000 sentence pairs (assuming a single device and no gradient accumulation); the size of the evaluation split is not documented.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hypothetical `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 4000
- num_epochs: 25
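The values above map naturally onto `Seq2SeqTrainingArguments`. The snippet below is a hypothetical reconstruction, not the original training script; `output_dir`, `evaluation_strategy`, and `predict_with_generate` are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above; the original
# training script is not part of this card. Adam betas/epsilon are the library defaults.
training_args = Seq2SeqTrainingArguments(
    output_dir="opus-mt-id-en-ccmatrix-warmup",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=4000,
    num_train_epochs=25,
    evaluation_strategy="epoch",    # assumed: the results table reports per-epoch metrics
    predict_with_generate=True,     # assumed: needed to compute BLEU during evaluation
)
```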
### Training results
| Training Loss | Epoch | Step   | Validation Loss | Bleu    |
|:-------------:|:-----:|:------:|:---------------:|:-------:|
| 0.8166        | 1.0   | 28125  | 0.8166          | 50.952  |
| 0.705         | 2.0   | 56250  | 0.7827          | 52.0343 |
| 0.6442        | 3.0   | 84375  | 0.7708          | 52.6943 |
| 0.5991        | 4.0   | 112500 | 0.7636          | 52.9969 |
| 0.5613        | 5.0   | 140625 | 0.7580          | 53.5542 |
| 0.528         | 6.0   | 168750 | 0.7584          | 53.9303 |
| 0.4976        | 7.0   | 196875 | 0.7577          | 54.3562 |
| 0.469         | 8.0   | 225000 | 0.7588          | 54.599  |
| 0.4418        | 9.0   | 253125 | 0.7652          | 54.6941 |
| 0.4161        | 10.0  | 281250 | 0.7736          | 54.8075 |
| 0.3912        | 11.0  | 309375 | 0.7826          | 55.2852 |
| 0.3675        | 12.0  | 337500 | 0.7928          | 55.4362 |
| 0.3441        | 13.0  | 365625 | 0.8066          | 55.1959 |
| 0.3221        | 14.0  | 393750 | 0.8139          | 55.6136 |
| 0.3003        | 15.0  | 421875 | 0.8313          | 55.7404 |
| 0.2795        | 16.0  | 450000 | 0.8460          | 55.743  |
| 0.2596        | 17.0  | 478125 | 0.8559          | 56.0679 |
| 0.2404        | 18.0  | 506250 | 0.8757          | 55.8564 |
| 0.2223        | 19.0  | 534375 | 0.8887          | 56.2032 |
| 0.205         | 20.0  | 562500 | 0.9071          | 56.2576 |
| 0.1889        | 21.0  | 590625 | 0.9244          | 56.2583 |
| 0.174         | 22.0  | 618750 | 0.9420          | 56.4147 |
| 0.1606        | 23.0  | 646875 | 0.9577          | 56.4719 |
| 0.1488        | 24.0  | 675000 | 0.9692          | 56.5931 |
| 0.1391        | 25.0  | 703125 | 0.9770          | 56.6698 |
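The Bleu column is presumably computed with sacreBLEU on model generations. Below is a minimal sketch of recomputing the metric with the `evaluate` library (not listed among the framework versions, so treat it as an assumption); the exact evaluation split and post-processing are not documented here.

```python
import evaluate

# Minimal sketch: score a few generated translations against references with sacreBLEU.
# Real hypotheses would come from model.generate() over the held-out CCMatrix split.
sacrebleu = evaluate.load("sacrebleu")
hypotheses = ["I like to read books."]      # illustrative model outputs
references = [["I like reading books."]]    # one list of references per hypothesis
print(sacrebleu.compute(predictions=hypotheses, references=references)["score"])
```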
### Framework versions
- Transformers 4.26.1
- Pytorch 2.0.0
- Datasets 2.10.1
- Tokenizers 0.11.0