<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-07
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1508
- Wer: 0.1043
- Cer: 0.0313
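The Wer and Cer values above are word- and character-error rates: the edit distance between reference and predicted transcription, normalized by reference length. A minimal pure-Python sketch of how such scores are computed (for illustration only; not the exact evaluation script used for this card):

```python
def edit_distance(ref, hyp):
    """Classic dynamic-programming Levenshtein distance between two sequences."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (r != h)))  # substitution (or match)
        prev = cur
    return prev[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

A Wer of 0.1043 therefore means roughly one word in ten is substituted, inserted, or deleted relative to the reference transcript.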
## Model description
More information needed
## Intended uses & limitations
More information needed
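A minimal transcription sketch, assuming the checkpoint follows the standard `transformers` Wav2Vec2 CTC API and expects 16 kHz mono audio. The `transcribe` helper and its `model_id` argument are illustrative; `model_id` must be set to the repo id under which this model is published:

```python
def transcribe(wav_path: str, model_id: str) -> str:
    """Transcribe an audio file with a Wav2Vec2 CTC checkpoint (sketch)."""
    import torch
    import librosa
    from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

    processor = Wav2Vec2Processor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)

    # Resample to the 16 kHz rate the XLSR models were trained on.
    speech, _ = librosa.load(wav_path, sr=16_000)
    inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

    with torch.no_grad():
        logits = model(**inputs).logits

    # Greedy CTC decoding: argmax per frame, then collapse via the processor.
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```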
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
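Assuming the standard Hugging Face `Trainer` API, the hyperparameters above roughly correspond to a configuration like the following (a sketch, not the original training script; `output_dir` is a placeholder):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-finetune",   # placeholder path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,      # effective train batch size: 32
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                          # "Native AMP" mixed precision
)
```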
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
---|---|---|---|---|---|
29.285 | 1.0 | 86 | 4.4517 | 1.0 | 1.0 |
9.6798 | 2.0 | 172 | 3.2787 | 0.9687 | 0.9478 |
4.4299 | 3.0 | 258 | 3.1545 | 0.9755 | 0.9807 |
4.1881 | 4.0 | 344 | 2.9177 | 1.0 | 1.0 |
4.2218 | 5.0 | 430 | 2.9044 | 1.0 | 1.0 |
3.3289 | 6.0 | 516 | 2.9194 | 0.9679 | 0.9709 |
4.2916 | 7.0 | 602 | 2.8598 | 0.9563 | 0.9188 |
4.2916 | 8.0 | 688 | 2.9310 | 0.9691 | 0.9357 |
3.9718 | 9.0 | 774 | 2.8625 | 0.9589 | 0.9022 |
3.7729 | 10.0 | 860 | 2.7969 | 0.9597 | 0.8810 |
3.9658 | 11.0 | 946 | 2.8245 | 0.9623 | 0.8708 |
3.9463 | 12.0 | 1032 | 2.7623 | 0.9813 | 0.8284 |
3.7464 | 13.0 | 1118 | 2.7775 | 0.9687 | 0.8157 |
3.4367 | 14.0 | 1204 | 2.7377 | 0.9827 | 0.7907 |
3.4367 | 15.0 | 1290 | 2.7610 | 0.9762 | 0.7866 |
3.7793 | 16.0 | 1376 | 2.7348 | 0.9779 | 0.7686 |
3.5917 | 17.0 | 1462 | 2.7264 | 0.9772 | 0.7630 |
3.6592 | 18.0 | 1548 | 2.8368 | 0.9682 | 0.7726 |
3.6796 | 19.0 | 1634 | 2.6872 | 0.9691 | 0.7746 |
3.6374 | 20.0 | 1720 | 2.6675 | 0.9832 | 0.7497 |
3.5981 | 21.0 | 1806 | 2.6653 | 0.9813 | 0.7553 |
3.5981 | 22.0 | 1892 | 2.6524 | 0.9631 | 0.7620 |
3.3592 | 23.0 | 1978 | 2.6218 | 0.9764 | 0.7477 |
3.3905 | 24.0 | 2064 | 2.6224 | 0.9759 | 0.7319 |
3.7128 | 25.0 | 2150 | 2.5871 | 0.9747 | 0.7238 |
3.1975 | 26.0 | 2236 | 2.5861 | 0.9679 | 0.7138 |
3.1143 | 27.0 | 2322 | 2.6368 | 0.9623 | 0.7400 |
3.3341 | 28.0 | 2408 | 2.5720 | 0.9643 | 0.7422 |
3.3341 | 29.0 | 2494 | 2.5195 | 0.9572 | 0.7281 |
3.1484 | 30.0 | 2580 | 2.4978 | 0.9604 | 0.7111 |
3.0853 | 31.0 | 2666 | 2.4664 | 0.9575 | 0.7383 |
3.1276 | 32.0 | 2752 | 2.3562 | 0.9609 | 0.7277 |
3.0271 | 33.0 | 2838 | 2.3270 | 0.9633 | 0.7301 |
2.6923 | 34.0 | 2924 | 2.1691 | 0.9694 | 0.6807 |
2.219 | 35.0 | 3010 | 1.4645 | 0.9589 | 0.4523 |
2.219 | 36.0 | 3096 | 0.8696 | 0.7098 | 0.2029 |
1.5882 | 37.0 | 3182 | 0.5418 | 0.3871 | 0.1043 |
1.0408 | 38.0 | 3268 | 0.3795 | 0.2535 | 0.0727 |
0.7398 | 39.0 | 3354 | 0.3286 | 0.2129 | 0.0623 |
0.6218 | 40.0 | 3440 | 0.2955 | 0.1905 | 0.0563 |
0.5638 | 41.0 | 3526 | 0.2789 | 0.1830 | 0.0536 |
0.4704 | 42.0 | 3612 | 0.2561 | 0.1570 | 0.0480 |
0.4704 | 43.0 | 3698 | 0.2428 | 0.1519 | 0.0464 |
0.43 | 44.0 | 3784 | 0.2288 | 0.1400 | 0.0431 |
0.4286 | 45.0 | 3870 | 0.2181 | 0.1366 | 0.0412 |
0.3898 | 46.0 | 3956 | 0.2135 | 0.1290 | 0.0406 |
0.3693 | 47.0 | 4042 | 0.2044 | 0.1339 | 0.0399 |
0.3608 | 48.0 | 4128 | 0.1976 | 0.1286 | 0.0389 |
0.3557 | 49.0 | 4214 | 0.1996 | 0.1276 | 0.0385 |
0.3065 | 50.0 | 4300 | 0.1934 | 0.1278 | 0.0376 |
0.3065 | 51.0 | 4386 | 0.1894 | 0.1235 | 0.0373 |
0.3169 | 52.0 | 4472 | 0.1845 | 0.1220 | 0.0368 |
0.305 | 53.0 | 4558 | 0.1789 | 0.1191 | 0.0358 |
0.2883 | 54.0 | 4644 | 0.1837 | 0.1162 | 0.0352 |
0.2888 | 55.0 | 4730 | 0.1843 | 0.1152 | 0.0355 |
0.2871 | 56.0 | 4816 | 0.1782 | 0.1152 | 0.0349 |
0.2616 | 57.0 | 4902 | 0.1695 | 0.1120 | 0.0342 |
0.2616 | 58.0 | 4988 | 0.1737 | 0.1115 | 0.0336 |
0.2768 | 59.0 | 5074 | 0.1713 | 0.1130 | 0.0340 |
0.2649 | 60.0 | 5160 | 0.1605 | 0.1098 | 0.0332 |
0.2633 | 61.0 | 5246 | 0.1652 | 0.1079 | 0.0326 |
0.2471 | 62.0 | 5332 | 0.1635 | 0.1074 | 0.0324 |
0.2488 | 63.0 | 5418 | 0.1596 | 0.1096 | 0.0331 |
0.2445 | 64.0 | 5504 | 0.1609 | 0.1043 | 0.0313 |
0.2445 | 65.0 | 5590 | 0.1646 | 0.1074 | 0.0324 |
0.2443 | 66.0 | 5676 | 0.1557 | 0.1079 | 0.0323 |
0.2242 | 67.0 | 5762 | 0.1616 | 0.1055 | 0.0316 |
0.233 | 68.0 | 5848 | 0.1589 | 0.1057 | 0.0319 |
0.2342 | 69.0 | 5934 | 0.1617 | 0.1069 | 0.0321 |
0.2042 | 70.0 | 6020 | 0.1548 | 0.1067 | 0.0321 |
0.2193 | 71.0 | 6106 | 0.1559 | 0.1074 | 0.0319 |
0.2193 | 72.0 | 6192 | 0.1529 | 0.1094 | 0.0328 |
0.2155 | 73.0 | 6278 | 0.1526 | 0.1050 | 0.0313 |
0.2071 | 74.0 | 6364 | 0.1525 | 0.1062 | 0.0315 |
0.2211 | 75.0 | 6450 | 0.1517 | 0.1033 | 0.0310 |
0.2111 | 76.0 | 6536 | 0.1525 | 0.1040 | 0.0311 |
0.2222 | 77.0 | 6622 | 0.1538 | 0.1033 | 0.0307 |
0.2277 | 78.0 | 6708 | 0.1508 | 0.1043 | 0.0313 |
0.2277 | 79.0 | 6794 | 0.1514 | 0.1062 | 0.0314 |
0.2073 | 80.0 | 6880 | 0.1531 | 0.1050 | 0.0314 |
0.1858 | 81.0 | 6966 | 0.1516 | 0.1045 | 0.0314 |
0.2002 | 82.0 | 7052 | 0.1538 | 0.1030 | 0.0307 |
0.1882 | 83.0 | 7138 | 0.1519 | 0.1026 | 0.0309 |
0.2001 | 84.0 | 7224 | 0.1556 | 0.1023 | 0.0309 |
0.1855 | 85.0 | 7310 | 0.1525 | 0.1023 | 0.0307 |
0.1855 | 86.0 | 7396 | 0.1527 | 0.1023 | 0.0312 |
0.201 | 87.0 | 7482 | 0.1510 | 0.1050 | 0.0315 |
0.1997 | 88.0 | 7568 | 0.1524 | 0.1026 | 0.0314 |
0.1925 | 89.0 | 7654 | 0.1525 | 0.1040 | 0.0317 |
0.1969 | 90.0 | 7740 | 0.1554 | 0.1033 | 0.0316 |
0.1852 | 91.0 | 7826 | 0.1552 | 0.1038 | 0.0315 |
0.1922 | 92.0 | 7912 | 0.1540 | 0.1026 | 0.0310 |
0.1922 | 93.0 | 7998 | 0.1523 | 0.1030 | 0.0313 |
0.1842 | 94.0 | 8084 | 0.1527 | 0.1023 | 0.0310 |
0.1839 | 95.0 | 8170 | 0.1514 | 0.1013 | 0.0309 |
0.189 | 96.0 | 8256 | 0.1515 | 0.1018 | 0.0310 |
0.1945 | 97.0 | 8342 | 0.1528 | 0.1018 | 0.0310 |
0.1766 | 98.0 | 8428 | 0.1515 | 0.1026 | 0.0312 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3