# nllb-200-distilled-600M_finetune_W2F_Epochs80_wo_to_fr
This model is a fine-tuned version of [facebook/nllb-200-distilled-600M](https://huggingface.co/facebook/nllb-200-distilled-600M) for Wolof-to-French translation (the direction is inferred from the `wo_to_fr` suffix in the model name; the training dataset itself is not documented). It achieves the following results on the evaluation set:
- Loss: 2.1287
- Bleu: 32.7968
- Gen Len: 35.7766
## Model description
More information needed
## Intended uses & limitations
More information needed
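Usage details were not filled in, but a minimal inference sketch for a Wolof-to-French NLLB-200 checkpoint is shown below. The repo path, the language codes (`wol_Latn`, `fra_Latn`), and the generation settings are assumptions inferred from the model name, not documented facts of this model.

```python
# A minimal inference sketch, assuming this checkpoint is used like the base
# NLLB-200 model. The repo path is hypothetical; the Wolof -> French direction
# is inferred from the "wo_to_fr" suffix in the model name.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "path/to/nllb-200-distilled-600M_finetune_W2F_Epochs80_wo_to_fr"  # hypothetical
tokenizer = AutoTokenizer.from_pretrained(model_id, src_lang="wol_Latn")  # assumed: Wolof source
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Nanga def?"  # example Wolof input ("How are you?")
inputs = tokenizer(text, return_tensors="pt")

# NLLB expects the target-language token forced as the first decoder token.
outputs = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("fra_Latn"),  # assumed: French target
    max_length=64,
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```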
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
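
These settings map onto the standard `Seq2SeqTrainingArguments` configuration. The sketch below is hedged: only the arguments listed above are grounded, while the output directory, evaluation strategy, and `predict_with_generate` flag are assumptions (the per-epoch rows in the results table suggest evaluation each epoch).

```python
# A sketch of how the listed hyperparameters map onto the HF Trainer API.
# Only the arguments documented above are grounded; the rest are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="nllb-200-distilled-600M_finetune_W2F_Epochs80_wo_to_fr",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 matches the Trainer's
    # default AdamW settings, so no explicit optimizer arguments are needed.
    evaluation_strategy="epoch",   # assumed from the per-epoch results table
    predict_with_generate=True,    # assumed: required to compute BLEU / Gen Len
)
```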
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:----:|:-------:|
1.7885 | 1.0 | 1216 | 1.5095 | 28.3741 | 34.9667 |
1.543 | 2.0 | 2432 | 1.4006 | 29.9683 | 34.5907 |
1.3809 | 3.0 | 3648 | 1.3494 | 30.7873 | 35.3996 |
1.2903 | 4.0 | 4864 | 1.3204 | 31.2395 | 34.7558 |
1.2145 | 5.0 | 6080 | 1.3011 | 31.8681 | 35.0264 |
1.1339 | 6.0 | 7296 | 1.2893 | 32.4121 | 35.0703 |
1.08 | 7.0 | 8512 | 1.2861 | 32.4471 | 35.019 |
1.0433 | 8.0 | 9728 | 1.2863 | 32.6646 | 34.9043 |
0.9839 | 9.0 | 10944 | 1.2800 | 32.6412 | 34.803 |
0.9404 | 10.0 | 12160 | 1.2857 | 32.9436 | 34.9949 |
0.8992 | 11.0 | 13376 | 1.2866 | 32.901 | 35.2669 |
0.8572 | 12.0 | 14592 | 1.3008 | 32.8254 | 34.9056 |
0.8202 | 13.0 | 15808 | 1.3091 | 33.0675 | 34.9676 |
0.7831 | 14.0 | 17024 | 1.3230 | 33.0677 | 35.3862 |
0.7484 | 15.0 | 18240 | 1.3233 | 33.2385 | 35.3168 |
0.7156 | 16.0 | 19456 | 1.3423 | 33.2877 | 35.1281 |
0.6889 | 17.0 | 20672 | 1.3562 | 33.006 | 35.1892 |
0.652 | 18.0 | 21888 | 1.3655 | 33.1259 | 35.1776 |
0.639 | 19.0 | 23104 | 1.3860 | 33.2795 | 35.4246 |
0.6128 | 20.0 | 24320 | 1.4037 | 33.1334 | 35.4912 |
0.5754 | 21.0 | 25536 | 1.4161 | 33.0285 | 35.093 |
0.5593 | 22.0 | 26752 | 1.4256 | 33.1544 | 35.3011 |
0.5263 | 23.0 | 27968 | 1.4445 | 32.7697 | 35.3922 |
0.5126 | 24.0 | 29184 | 1.4583 | 32.8335 | 35.4898 |
0.4869 | 25.0 | 30400 | 1.4716 | 32.8928 | 35.5486 |
0.4729 | 26.0 | 31616 | 1.4913 | 33.2121 | 35.5079 |
0.4449 | 27.0 | 32832 | 1.5008 | 33.2267 | 35.4431 |
0.4418 | 28.0 | 34048 | 1.5228 | 32.7818 | 35.5222 |
0.4209 | 29.0 | 35264 | 1.5380 | 32.7744 | 35.6364 |
0.3947 | 30.0 | 36480 | 1.5543 | 32.5445 | 35.5846 |
0.3811 | 31.0 | 37696 | 1.5741 | 32.8464 | 35.4894 |
0.3613 | 32.0 | 38912 | 1.5889 | 32.7933 | 35.8039 |
0.3661 | 33.0 | 40128 | 1.6032 | 32.7548 | 35.8043 |
0.3418 | 34.0 | 41344 | 1.6284 | 32.664 | 35.7095 |
0.3188 | 35.0 | 42560 | 1.6402 | 32.741 | 35.8571 |
0.3124 | 36.0 | 43776 | 1.6481 | 32.5202 | 35.7077 |
0.2954 | 37.0 | 44992 | 1.6757 | 32.5277 | 35.5153 |
0.2822 | 38.0 | 46208 | 1.6815 | 32.5784 | 35.6809 |
0.2764 | 39.0 | 47424 | 1.6996 | 32.7318 | 35.4223 |
0.2656 | 40.0 | 48640 | 1.7119 | 32.7212 | 35.5555 |
0.2486 | 41.0 | 49856 | 1.7288 | 32.6495 | 35.6984 |
0.2444 | 42.0 | 51072 | 1.7396 | 32.6403 | 35.7391 |
0.2365 | 43.0 | 52288 | 1.7621 | 32.4874 | 35.5555 |
0.2239 | 44.0 | 53504 | 1.7644 | 32.4727 | 35.4635 |
0.2132 | 45.0 | 54720 | 1.7850 | 32.4659 | 35.6767 |
0.2096 | 46.0 | 55936 | 1.7977 | 32.3921 | 35.5759 |
0.2004 | 47.0 | 57152 | 1.8120 | 32.4906 | 35.5939 |
0.1935 | 48.0 | 58368 | 1.8262 | 32.6267 | 35.6554 |
0.1821 | 49.0 | 59584 | 1.8447 | 32.8145 | 35.9695 |
0.1766 | 50.0 | 60800 | 1.8489 | 32.5806 | 35.7373 |
0.1722 | 51.0 | 62016 | 1.8604 | 32.4795 | 35.839 |
0.1661 | 52.0 | 63232 | 1.8711 | 32.4596 | 35.6545 |
0.1597 | 53.0 | 64448 | 1.8772 | 32.741 | 35.9167 |
0.1491 | 54.0 | 65664 | 1.9015 | 32.3714 | 35.6355 |
0.1475 | 55.0 | 66880 | 1.9070 | 32.2494 | 35.4029 |
0.142 | 56.0 | 68096 | 1.9174 | 32.5657 | 35.6526 |
0.1366 | 57.0 | 69312 | 1.9218 | 32.469 | 35.7123 |
0.1306 | 58.0 | 70528 | 1.9304 | 32.4912 | 35.6508 |
0.1283 | 59.0 | 71744 | 1.9425 | 32.6643 | 35.7414 |
0.1238 | 60.0 | 72960 | 1.9528 | 32.691 | 35.6406 |
0.1205 | 61.0 | 74176 | 1.9658 | 32.7217 | 35.8071 |
0.1153 | 62.0 | 75392 | 1.9725 | 32.6433 | 35.7095 |
0.1124 | 63.0 | 76608 | 1.9766 | 32.7148 | 35.7919 |
0.1105 | 64.0 | 77824 | 1.9925 | 32.6798 | 35.704 |
0.1081 | 65.0 | 79040 | 2.0005 | 32.5732 | 35.7206 |
0.1028 | 66.0 | 80256 | 2.0081 | 32.6728 | 35.728 |
0.101 | 67.0 | 81472 | 2.0147 | 32.5964 | 35.5874 |
0.0973 | 68.0 | 82688 | 2.0206 | 32.5034 | 35.9237 |
0.0951 | 69.0 | 83904 | 2.0317 | 32.6808 | 35.7336 |
0.0925 | 70.0 | 85120 | 2.0321 | 32.4508 | 35.7442 |
0.0889 | 71.0 | 86336 | 2.0367 | 32.6116 | 35.7165 |
0.0882 | 72.0 | 87552 | 2.0434 | 32.543 | 35.7391 |
0.0871 | 73.0 | 88768 | 2.0501 | 32.6749 | 35.7673 |
0.0847 | 74.0 | 89984 | 2.0576 | 32.6153 | 35.6272 |
0.0818 | 75.0 | 91200 | 2.0570 | 32.5598 | 35.9635 |
0.0795 | 76.0 | 92416 | 2.0687 | 32.762 | 35.6499 |
0.0789 | 77.0 | 93632 | 2.0713 | 32.7033 | 35.7396 |
0.0767 | 78.0 | 94848 | 2.0710 | 32.6167 | 35.8326 |
0.0757 | 79.0 | 96064 | 2.0825 | 32.5059 | 35.691 |
0.0735 | 80.0 | 97280 | 2.0839 | 32.7008 | 35.8497 |
0.0739 | 81.0 | 98496 | 2.0891 | 32.6755 | 36.0217 |
0.0725 | 82.0 | 99712 | 2.0937 | 32.6401 | 35.8719 |
0.0709 | 83.0 | 100928 | 2.1011 | 32.5576 | 35.7738 |
0.0688 | 84.0 | 102144 | 2.0981 | 32.5839 | 35.6799 |
0.0684 | 85.0 | 103360 | 2.1025 | 32.5796 | 35.7877 |
0.0679 | 86.0 | 104576 | 2.1062 | 32.5211 | 35.8085 |
0.0663 | 87.0 | 105792 | 2.1078 | 32.5702 | 35.7738 |
0.0652 | 88.0 | 107008 | 2.1157 | 32.585 | 35.8039 |
0.0646 | 89.0 | 108224 | 2.1149 | 32.5737 | 35.9056 |
0.0653 | 90.0 | 109440 | 2.1166 | 32.532 | 35.9644 |
0.0644 | 91.0 | 110656 | 2.1187 | 32.5873 | 35.7239 |
0.0631 | 92.0 | 111872 | 2.1183 | 32.6478 | 35.8871 |
0.0635 | 93.0 | 113088 | 2.1205 | 32.7319 | 35.7932 |
0.0639 | 94.0 | 114304 | 2.1228 | 32.8733 | 35.8622 |
0.0614 | 95.0 | 115520 | 2.1260 | 32.865 | 35.7794 |
0.0603 | 96.0 | 116736 | 2.1270 | 32.7012 | 35.7604 |
0.0623 | 97.0 | 117952 | 2.1264 | 32.7231 | 35.7826 |
0.0607 | 98.0 | 119168 | 2.1270 | 32.8368 | 35.7882 |
0.0607 | 99.0 | 120384 | 2.1283 | 32.7617 | 35.8025 |
0.06 | 100.0 | 121600 | 2.1287 | 32.7968 | 35.7766 |
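
The `Bleu` and `Gen Len` columns are generation-time metrics. The exact metric code for this run is not documented; the sketch below shows the common `compute_metrics` recipe for `Seq2SeqTrainer` (sacreBLEU via the `evaluate` library) that produces numbers of this shape.

```python
# Illustrative compute_metrics hook for Seq2SeqTrainer; not the documented
# code for this run. Assumes predict_with_generate=True and an NLLB tokenizer.
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/nllb-200-distilled-600M")
bleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    if isinstance(preds, tuple):
        preds = preds[0]
    # Labels use -100 for padding; swap it back before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    score = bleu.compute(
        predictions=[p.strip() for p in decoded_preds],
        references=[[l.strip()] for l in decoded_labels],
    )["score"]
    # "Gen Len" is the mean number of non-pad tokens in the generated ids.
    gen_len = np.mean([np.count_nonzero(p != tokenizer.pad_token_id) for p in preds])
    return {"bleu": score, "gen_len": gen_len}
```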
### Framework versions
- Transformers 4.33.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3