# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-clean-01
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set (a sketch of how the WER and CER figures are computed follows the list):

- Loss: 0.1551
- WER: 0.0980
- CER: 0.0274
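
WER (word error rate) and CER (character error rate) are edit-distance rates over words and characters, respectively. A minimal sketch of how such figures are typically computed with the Hugging Face `evaluate` library (which wraps jiwer); the example transcripts below are invented for illustration:

```python
# Compute WER and CER for a pair of transcripts; the strings are hypothetical.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["o menino leu o livro ontem"]   # hypothetical model output
references = ["o menino leu um livro ontem"]   # hypothetical ground truth

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```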
## Model description
More information needed
## Intended uses & limitations
More information needed
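
Absent specifics from the authors, here is a minimal transcription sketch assuming the standard Wav2Vec2 CTC pipeline. The repository id is a placeholder for wherever this checkpoint is hosted, and `example.wav` is a hypothetical audio file:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repository id; substitute the actual location of this checkpoint.
model_id = "<user>/wav2vec2-large-xlsr-mecita-coraa-portuguese-all-clean-01"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load the audio and resample to the 16 kHz rate XLSR models expect.
waveform, sample_rate = torchaudio.load("example.wav")
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(ids)[0])
```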
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a reconstruction as `TrainingArguments` follows the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
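
A hedged reconstruction of these hyperparameters as `TrainingArguments` (Transformers 4.28 API). The `output_dir` and `evaluation_strategy` values are assumptions; the Adam settings match the listed values, which are also the Trainer defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-all-clean-01",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",    # assumed from the per-epoch log below
    adam_beta1=0.9,                 # Adam with betas=(0.9, 0.999), epsilon=1e-08
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```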
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:---:|:---:|:---:|:---:|:---:|:---:|
| 28.9237 | 1.0 | 67 | 9.6707 | 1.0 | 0.9866 |
| 12.6709 | 2.0 | 134 | 3.2494 | 1.0 | 1.0 |
| 3.5707 | 3.0 | 201 | 2.9468 | 1.0 | 1.0 |
| 3.5707 | 4.0 | 268 | 2.9034 | 1.0 | 1.0 |
| 2.9301 | 5.0 | 335 | 2.9156 | 1.0 | 1.0 |
| 2.8833 | 6.0 | 402 | 2.7990 | 1.0 | 1.0 |
| 2.8833 | 7.0 | 469 | 2.1014 | 0.9969 | 0.9049 |
| 2.4442 | 8.0 | 536 | 0.7351 | 0.5621 | 0.1277 |
| 1.092 | 9.0 | 603 | 0.4375 | 0.2230 | 0.0596 |
| 1.092 | 10.0 | 670 | 0.3387 | 0.1916 | 0.0521 |
| 0.6709 | 11.0 | 737 | 0.3004 | 0.1727 | 0.0468 |
| 0.5134 | 12.0 | 804 | 0.2680 | 0.1612 | 0.0447 |
| 0.5134 | 13.0 | 871 | 0.2513 | 0.1413 | 0.0396 |
| 0.4219 | 14.0 | 938 | 0.2376 | 0.1315 | 0.0373 |
| 0.3635 | 15.0 | 1005 | 0.2272 | 0.1298 | 0.0365 |
| 0.3635 | 16.0 | 1072 | 0.2160 | 0.1301 | 0.0355 |
| 0.3612 | 17.0 | 1139 | 0.2079 | 0.1183 | 0.0339 |
| 0.3203 | 18.0 | 1206 | 0.2084 | 0.1165 | 0.0335 |
| 0.3203 | 19.0 | 1273 | 0.1959 | 0.1137 | 0.0318 |
| 0.2709 | 20.0 | 1340 | 0.1930 | 0.1099 | 0.0308 |
| 0.2879 | 21.0 | 1407 | 0.1952 | 0.1096 | 0.0312 |
| 0.2879 | 22.0 | 1474 | 0.1923 | 0.1096 | 0.0314 |
| 0.2493 | 23.0 | 1541 | 0.1877 | 0.1078 | 0.0315 |
| 0.2349 | 24.0 | 1608 | 0.1761 | 0.1085 | 0.0316 |
| 0.2349 | 25.0 | 1675 | 0.1789 | 0.1085 | 0.0310 |
| 0.2401 | 26.0 | 1742 | 0.1766 | 0.1005 | 0.0298 |
| 0.235 | 27.0 | 1809 | 0.1730 | 0.1019 | 0.0297 |
| 0.235 | 28.0 | 1876 | 0.1836 | 0.1061 | 0.0304 |
| 0.2073 | 29.0 | 1943 | 0.1726 | 0.1050 | 0.0295 |
| 0.1965 | 30.0 | 2010 | 0.1795 | 0.1008 | 0.0294 |
| 0.1965 | 31.0 | 2077 | 0.1822 | 0.1022 | 0.0297 |
| 0.2129 | 32.0 | 2144 | 0.1686 | 0.1005 | 0.0286 |
| 0.1918 | 33.0 | 2211 | 0.1575 | 0.0998 | 0.0282 |
| 0.1918 | 34.0 | 2278 | 0.1687 | 0.1022 | 0.0292 |
| 0.1974 | 35.0 | 2345 | 0.1576 | 0.1019 | 0.0281 |
| 0.1935 | 36.0 | 2412 | 0.1626 | 0.0998 | 0.0281 |
| 0.1935 | 37.0 | 2479 | 0.1572 | 0.0970 | 0.0282 |
| 0.1881 | 38.0 | 2546 | 0.1552 | 0.0984 | 0.0278 |
| 0.1743 | 39.0 | 2613 | 0.1692 | 0.1001 | 0.0289 |
| 0.1743 | 40.0 | 2680 | 0.1582 | 0.0970 | 0.0282 |
| 0.1667 | 41.0 | 2747 | 0.1616 | 0.0991 | 0.0288 |
| 0.1588 | 42.0 | 2814 | 0.1729 | 0.0984 | 0.0294 |
| 0.1588 | 43.0 | 2881 | 0.1601 | 0.0956 | 0.0282 |
| 0.1834 | 44.0 | 2948 | 0.1664 | 0.0942 | 0.0275 |
| 0.1769 | 45.0 | 3015 | 0.1578 | 0.1015 | 0.0282 |
| 0.1769 | 46.0 | 3082 | 0.1612 | 0.0956 | 0.0276 |
| 0.1677 | 47.0 | 3149 | 0.1551 | 0.0980 | 0.0274 |
| 0.1515 | 48.0 | 3216 | 0.1552 | 0.0956 | 0.0271 |
| 0.1515 | 49.0 | 3283 | 0.1554 | 0.0973 | 0.0277 |
| 0.1558 | 50.0 | 3350 | 0.1631 | 0.0977 | 0.0282 |
| 0.1566 | 51.0 | 3417 | 0.1657 | 0.0984 | 0.0281 |
| 0.1566 | 52.0 | 3484 | 0.1680 | 0.0984 | 0.0278 |
| 0.1384 | 53.0 | 3551 | 0.1689 | 0.0977 | 0.0282 |
| 0.1414 | 54.0 | 3618 | 0.1688 | 0.0980 | 0.0280 |
| 0.1414 | 55.0 | 3685 | 0.1667 | 0.0980 | 0.0279 |
| 0.1357 | 56.0 | 3752 | 0.1693 | 0.0973 | 0.0277 |
| 0.1473 | 57.0 | 3819 | 0.1644 | 0.0960 | 0.0277 |
| 0.1473 | 58.0 | 3886 | 0.1695 | 0.0980 | 0.0281 |
| 0.138 | 59.0 | 3953 | 0.1750 | 0.0998 | 0.0284 |
| 0.1319 | 60.0 | 4020 | 0.1707 | 0.0977 | 0.0279 |
| 0.1319 | 61.0 | 4087 | 0.1705 | 0.0984 | 0.0277 |
| 0.1381 | 62.0 | 4154 | 0.1590 | 0.1001 | 0.0278 |
| 0.1278 | 63.0 | 4221 | 0.1656 | 0.1012 | 0.0281 |
| 0.1278 | 64.0 | 4288 | 0.1658 | 0.0967 | 0.0277 |
| 0.1387 | 65.0 | 4355 | 0.1680 | 0.1001 | 0.0284 |
| 0.1342 | 66.0 | 4422 | 0.1689 | 0.0967 | 0.0277 |
| 0.1342 | 67.0 | 4489 | 0.1736 | 0.0980 | 0.0281 |
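
The log ends at epoch 67 of the configured 100, and the best validation loss (0.1551, epoch 47) matches the headline evaluation metrics, which is consistent with early stopping plus reloading of the best checkpoint; the card does not confirm this. A sketch of such a callback, with the patience value inferred from the log (67 - 47 = 20) and therefore an assumption:

```python
from transformers import EarlyStoppingCallback

# A patience of 20 evaluation epochs would reproduce the observed gap between
# the best epoch (47) and the last logged epoch (67); this is an inference
# from the log, not confirmed by the card.
early_stopping = EarlyStoppingCallback(early_stopping_patience=20)
# Passed to the Trainer via callbacks=[early_stopping], together with
# load_best_model_at_end=True and metric_for_best_model="loss" in TrainingArguments.
```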
### Framework versions
- Transformers 4.28.0
- PyTorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
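
A quick, purely illustrative environment check when reproducing these results: print the installed versions and compare them against the pins above.

```python
import datasets
import tokenizers
import torch
import transformers

# Compare these against the pinned versions listed in this card.
for name, module in [
    ("Transformers", transformers),
    ("PyTorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```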