# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-08
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set (a usage sketch follows the metrics):
- Loss: 0.1455
- WER: 0.0896
- CER: 0.0251
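
The checkpoint can be used for Portuguese speech recognition with the standard Wav2Vec2 CTC pipeline. Below is a minimal, hedged inference sketch: the repository id is a placeholder (the exact Hub path of this checkpoint is not stated on this card), and the input audio is assumed to be 16 kHz mono speech.

```python
# Minimal inference sketch. The model id below is a placeholder — replace it with
# the actual Hub path of this checkpoint. Audio is assumed to be 16 kHz mono.
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "<user>/wav2vec2-large-xlsr-mecita-coraa-portuguese-all-08"  # hypothetical path
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

speech, _ = librosa.load("sample.wav", sr=16_000)  # resample to the model's 16 kHz rate
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```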
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows this list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
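
These settings map onto `transformers.TrainingArguments` roughly as sketched below. `output_dir`, `evaluation_strategy`, and anything not listed above are assumptions rather than values from the original run; the Adam betas and epsilon listed above are the optimizer defaults, so they need no explicit arguments.

```python
# Sketch only: maps the listed hyperparameters onto TrainingArguments.
# output_dir and unlisted options are assumptions, not taken from the original run.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-all-08",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                       # native AMP mixed-precision training
    evaluation_strategy="epoch",     # assumed: the table below reports one eval per epoch
)
```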
### Training results
Training Loss | Epoch | Step | Validation Loss | WER | CER |
---|---|---|---|---|---|
27.8108 | 1.0 | 86 | 3.2125 | 1.0 | 1.0 |
7.0217 | 2.0 | 172 | 2.9839 | 1.0 | 1.0 |
3.033 | 3.0 | 258 | 2.9245 | 1.0 | 1.0 |
2.9324 | 4.0 | 344 | 2.8842 | 1.0 | 1.0 |
2.8713 | 5.0 | 430 | 2.1173 | 1.0 | 0.7026 |
1.963 | 6.0 | 516 | 0.6336 | 0.3492 | 0.0867 |
0.9394 | 7.0 | 602 | 0.4107 | 0.2192 | 0.0562 |
0.9394 | 8.0 | 688 | 0.3376 | 0.1913 | 0.0507 |
0.6379 | 9.0 | 774 | 0.2888 | 0.1698 | 0.0451 |
0.5164 | 10.0 | 860 | 0.2622 | 0.1460 | 0.0416 |
0.4629 | 11.0 | 946 | 0.2398 | 0.1301 | 0.0375 |
0.4032 | 12.0 | 1032 | 0.2196 | 0.1170 | 0.0344 |
0.3776 | 13.0 | 1118 | 0.2118 | 0.1120 | 0.0332 |
0.3289 | 14.0 | 1204 | 0.1965 | 0.1139 | 0.0331 |
0.3289 | 15.0 | 1290 | 0.1912 | 0.1064 | 0.0314 |
0.3063 | 16.0 | 1376 | 0.1836 | 0.1074 | 0.0306 |
0.3058 | 17.0 | 1462 | 0.1839 | 0.1027 | 0.0309 |
0.3 | 18.0 | 1548 | 0.1702 | 0.0985 | 0.0296 |
0.2867 | 19.0 | 1634 | 0.1802 | 0.0994 | 0.0296 |
0.2717 | 20.0 | 1720 | 0.1669 | 0.1034 | 0.0296 |
0.2546 | 21.0 | 1806 | 0.1648 | 0.1018 | 0.0298 |
0.2546 | 22.0 | 1892 | 0.1658 | 0.0975 | 0.0290 |
0.2293 | 23.0 | 1978 | 0.1652 | 0.0973 | 0.0291 |
0.2404 | 24.0 | 2064 | 0.1651 | 0.0943 | 0.0278 |
0.2184 | 25.0 | 2150 | 0.1671 | 0.0933 | 0.0282 |
0.2105 | 26.0 | 2236 | 0.1626 | 0.0950 | 0.0285 |
0.2229 | 27.0 | 2322 | 0.1577 | 0.0957 | 0.0283 |
0.215 | 28.0 | 2408 | 0.1608 | 0.0943 | 0.0280 |
0.215 | 29.0 | 2494 | 0.1633 | 0.0922 | 0.0272 |
0.1959 | 30.0 | 2580 | 0.1567 | 0.0926 | 0.0270 |
0.2024 | 31.0 | 2666 | 0.1645 | 0.0980 | 0.0281 |
0.201 | 32.0 | 2752 | 0.1540 | 0.0917 | 0.0270 |
0.1932 | 33.0 | 2838 | 0.1563 | 0.0940 | 0.0274 |
0.1945 | 34.0 | 2924 | 0.1542 | 0.0954 | 0.0272 |
0.1883 | 35.0 | 3010 | 0.1562 | 0.0947 | 0.0274 |
0.1883 | 36.0 | 3096 | 0.1558 | 0.0945 | 0.0272 |
0.1852 | 37.0 | 3182 | 0.1593 | 0.0891 | 0.0261 |
0.1609 | 38.0 | 3268 | 0.1558 | 0.0957 | 0.0271 |
0.1801 | 39.0 | 3354 | 0.1576 | 0.0917 | 0.0265 |
0.1611 | 40.0 | 3440 | 0.1591 | 0.0917 | 0.0263 |
0.1879 | 41.0 | 3526 | 0.1520 | 0.0903 | 0.0258 |
0.169 | 42.0 | 3612 | 0.1545 | 0.0971 | 0.0269 |
0.169 | 43.0 | 3698 | 0.1553 | 0.0896 | 0.0259 |
0.1676 | 44.0 | 3784 | 0.1574 | 0.0884 | 0.0259 |
0.1512 | 45.0 | 3870 | 0.1577 | 0.0926 | 0.0269 |
0.1457 | 46.0 | 3956 | 0.1611 | 0.0922 | 0.0263 |
0.1498 | 47.0 | 4042 | 0.1569 | 0.0905 | 0.0258 |
0.1848 | 48.0 | 4128 | 0.1560 | 0.0903 | 0.0263 |
0.1455 | 49.0 | 4214 | 0.1584 | 0.0903 | 0.0265 |
0.1549 | 50.0 | 4300 | 0.1566 | 0.0903 | 0.0258 |
0.1549 | 51.0 | 4386 | 0.1513 | 0.0908 | 0.0259 |
0.1454 | 52.0 | 4472 | 0.1513 | 0.0894 | 0.0253 |
0.1482 | 53.0 | 4558 | 0.1553 | 0.0894 | 0.0255 |
0.1407 | 54.0 | 4644 | 0.1535 | 0.0926 | 0.0263 |
0.169 | 55.0 | 4730 | 0.1525 | 0.0912 | 0.0263 |
0.1489 | 56.0 | 4816 | 0.1516 | 0.0910 | 0.0257 |
0.1489 | 57.0 | 4902 | 0.1567 | 0.0889 | 0.0258 |
0.1489 | 58.0 | 4988 | 0.1525 | 0.0894 | 0.0257 |
0.1267 | 59.0 | 5074 | 0.1540 | 0.0865 | 0.0249 |
0.1411 | 60.0 | 5160 | 0.1521 | 0.0908 | 0.0258 |
0.1357 | 61.0 | 5246 | 0.1485 | 0.0910 | 0.0255 |
0.1379 | 62.0 | 5332 | 0.1503 | 0.0898 | 0.0257 |
0.1348 | 63.0 | 5418 | 0.1500 | 0.0898 | 0.0253 |
0.1417 | 64.0 | 5504 | 0.1455 | 0.0896 | 0.0251 |
0.1417 | 65.0 | 5590 | 0.1478 | 0.0880 | 0.0249 |
0.1419 | 66.0 | 5676 | 0.1515 | 0.0884 | 0.0254 |
0.1417 | 67.0 | 5762 | 0.1516 | 0.0894 | 0.0252 |
0.127 | 68.0 | 5848 | 0.1487 | 0.0880 | 0.0250 |
0.1337 | 69.0 | 5934 | 0.1463 | 0.0910 | 0.0256 |
0.1208 | 70.0 | 6020 | 0.1508 | 0.0929 | 0.0256 |
0.127 | 71.0 | 6106 | 0.1586 | 0.0896 | 0.0251 |
0.127 | 72.0 | 6192 | 0.1542 | 0.0905 | 0.0255 |
0.1209 | 73.0 | 6278 | 0.1567 | 0.0891 | 0.0251 |
0.1187 | 74.0 | 6364 | 0.1544 | 0.0915 | 0.0255 |
0.1244 | 75.0 | 6450 | 0.1597 | 0.0908 | 0.0251 |
0.1261 | 76.0 | 6536 | 0.1554 | 0.0887 | 0.0248 |
0.1387 | 77.0 | 6622 | 0.1554 | 0.0889 | 0.0246 |
0.1262 | 78.0 | 6708 | 0.1559 | 0.0901 | 0.0251 |
0.1262 | 79.0 | 6794 | 0.1514 | 0.0887 | 0.0246 |
0.115 | 80.0 | 6880 | 0.1522 | 0.0882 | 0.0246 |
0.1127 | 81.0 | 6966 | 0.1519 | 0.0889 | 0.0247 |
0.1191 | 82.0 | 7052 | 0.1530 | 0.0861 | 0.0243 |
0.116 | 83.0 | 7138 | 0.1535 | 0.0877 | 0.0244 |
0.1217 | 84.0 | 7224 | 0.1528 | 0.0882 | 0.0249 |
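
The WER and CER figures above can be reproduced with the `evaluate` library (not listed among the framework versions below, so this is an assumption about tooling; it requires the `jiwer` backend). The transcript strings here are illustrative placeholders only.

```python
# Sketch of how WER and CER, as reported above, are typically computed.
# The reference/prediction strings are illustrative placeholders only.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["o menino leu o livro"]    # ground-truth transcripts (placeholder)
predictions = ["o menino leu o livros"]  # model outputs (placeholder)

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```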
### Framework versions
- Transformers 4.28.0
- PyTorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3