# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2-3
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1655
- Wer: 0.0928
- Cer: 0.0272
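Word error rate (Wer) and character error rate (Cer) above are Levenshtein edit distance (substitutions + insertions + deletions) divided by the reference length, computed over words and characters respectively. The exact metric implementation used during training is not stated in this card; the following is a minimal pure-Python sketch of the definitions:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (substitutions, insertions, deletions)."""
    prev_row = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        row = [i]
        for j, h in enumerate(hyp, 1):
            row.append(min(prev_row[j] + 1,               # deletion
                           row[j - 1] + 1,                # insertion
                           prev_row[j - 1] + (r != h)))   # substitution (free if equal)
        prev_row = row
    return prev_row[-1]

def wer(reference, hypothesis):
    """Word error rate: word-level edit distance over the reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate: character-level edit distance over the reference length."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

In practice these metrics are usually computed with a library such as `jiwer` or 🤗 `evaluate`; the sketch is only meant to make the reported numbers interpretable (e.g. Wer 0.0928 means roughly 9.3 word edits per 100 reference words).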
## Model description
More information needed
## Intended uses & limitations
More information needed
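Although the card leaves this section empty, the checkpoint is a standard CTC wav2vec2 model, so inference with the 🤗 Transformers ASR pipeline should work. A hypothetical sketch (the bare model name below omits the repository namespace, which is not stated here, and `audio.wav` is a placeholder for a 16 kHz mono recording):

```python
def transcribe(path, model_id="wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2-3"):
    """Transcribe one audio file with the fine-tuned checkpoint."""
    # Deferred import: requires the transformers and torch packages at call time.
    from transformers import pipeline
    asr = pipeline("automatic-speech-recognition", model=model_id)
    return asr(path)["text"]

# Example usage (placeholder path; downloads the checkpoint on first call):
# text = transcribe("audio.wav")
```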
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
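The original training script is not included in this card, but the hyperparameters above map directly onto keyword arguments of `transformers.TrainingArguments`; a sketch of that mapping (`output_dir` is a placeholder):

```python
# Hyperparameters from the list above, expressed as TrainingArguments-style
# keyword arguments: TrainingArguments(**training_args).
training_args = dict(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2-3",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)

# The listed total_train_batch_size is derived, not set directly: it is the
# per-device batch size times the gradient accumulation steps, 16 * 2 = 32.
effective_batch = (training_args["per_device_train_batch_size"]
                   * training_args["gradient_accumulation_steps"])
```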
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
32.9926 | 0.99 | 61 | 3.7101 | 1.0 | 1.0 |
9.1028 | 2.0 | 123 | 3.1238 | 1.0 | 1.0 |
9.1028 | 2.99 | 184 | 2.9597 | 1.0 | 1.0 |
3.0928 | 4.0 | 246 | 2.9281 | 1.0 | 1.0 |
2.9415 | 4.99 | 307 | 2.9094 | 1.0 | 1.0 |
2.9415 | 6.0 | 369 | 2.8923 | 1.0 | 1.0 |
2.9029 | 6.99 | 430 | 2.5645 | 1.0 | 1.0 |
2.9029 | 8.0 | 492 | 1.1522 | 0.9921 | 0.2906 |
2.302 | 8.99 | 553 | 0.6203 | 0.2580 | 0.0701 |
1.0512 | 10.0 | 615 | 0.4373 | 0.1918 | 0.0529 |
1.0512 | 10.99 | 676 | 0.3711 | 0.1760 | 0.0498 |
0.655 | 12.0 | 738 | 0.3171 | 0.1606 | 0.0450 |
0.655 | 12.99 | 799 | 0.3067 | 0.1448 | 0.0428 |
0.4975 | 14.0 | 861 | 0.2770 | 0.1357 | 0.0395 |
0.4334 | 14.99 | 922 | 0.2874 | 0.1290 | 0.0399 |
0.4334 | 16.0 | 984 | 0.2629 | 0.1294 | 0.0369 |
0.4175 | 16.99 | 1045 | 0.2494 | 0.1169 | 0.0357 |
0.3582 | 18.0 | 1107 | 0.2366 | 0.1132 | 0.0335 |
0.3582 | 18.99 | 1168 | 0.2265 | 0.1082 | 0.0329 |
0.3218 | 20.0 | 1230 | 0.2120 | 0.1124 | 0.0335 |
0.3218 | 20.99 | 1291 | 0.2079 | 0.1065 | 0.0325 |
0.3106 | 22.0 | 1353 | 0.1965 | 0.1020 | 0.0304 |
0.2753 | 22.99 | 1414 | 0.1886 | 0.0974 | 0.0293 |
0.2753 | 24.0 | 1476 | 0.1891 | 0.1032 | 0.0312 |
0.2554 | 24.99 | 1537 | 0.1897 | 0.0990 | 0.0305 |
0.2554 | 26.0 | 1599 | 0.1901 | 0.1007 | 0.0315 |
0.2442 | 26.99 | 1660 | 0.1895 | 0.1007 | 0.0311 |
0.2443 | 28.0 | 1722 | 0.1814 | 0.0999 | 0.0306 |
0.2443 | 28.99 | 1783 | 0.1785 | 0.0961 | 0.0290 |
0.2193 | 30.0 | 1845 | 0.1788 | 0.0949 | 0.0292 |
0.2128 | 30.99 | 1906 | 0.1761 | 0.0961 | 0.0289 |
0.2128 | 32.0 | 1968 | 0.1773 | 0.1003 | 0.0301 |
0.1979 | 32.99 | 2029 | 0.1737 | 0.0974 | 0.0289 |
0.1979 | 34.0 | 2091 | 0.1742 | 0.1011 | 0.0284 |
0.1973 | 34.99 | 2152 | 0.1710 | 0.0916 | 0.0274 |
0.184 | 36.0 | 2214 | 0.1729 | 0.0940 | 0.0276 |
0.184 | 36.99 | 2275 | 0.1763 | 0.0961 | 0.0299 |
0.1696 | 38.0 | 2337 | 0.1733 | 0.0903 | 0.0279 |
0.1696 | 38.99 | 2398 | 0.1719 | 0.0920 | 0.0275 |
0.1881 | 40.0 | 2460 | 0.1663 | 0.0870 | 0.0264 |
0.165 | 40.99 | 2521 | 0.1681 | 0.0907 | 0.0275 |
0.165 | 42.0 | 2583 | 0.1655 | 0.0928 | 0.0272 |
0.1777 | 42.99 | 2644 | 0.1680 | 0.0895 | 0.0271 |
0.1688 | 44.0 | 2706 | 0.1672 | 0.0899 | 0.0276 |
0.1688 | 44.99 | 2767 | 0.1719 | 0.0928 | 0.0289 |
0.166 | 46.0 | 2829 | 0.1744 | 0.0916 | 0.0281 |
0.166 | 46.99 | 2890 | 0.1739 | 0.0874 | 0.0267 |
0.1482 | 48.0 | 2952 | 0.1724 | 0.0857 | 0.0265 |
0.1588 | 48.99 | 3013 | 0.1748 | 0.0903 | 0.0277 |
0.1588 | 50.0 | 3075 | 0.1708 | 0.0907 | 0.0268 |
0.1409 | 50.99 | 3136 | 0.1750 | 0.0891 | 0.0269 |
0.1409 | 52.0 | 3198 | 0.1730 | 0.0891 | 0.0265 |
0.1467 | 52.99 | 3259 | 0.1732 | 0.0899 | 0.0265 |
0.1555 | 54.0 | 3321 | 0.1710 | 0.0882 | 0.0270 |
0.1555 | 54.99 | 3382 | 0.1709 | 0.0866 | 0.0265 |
0.1419 | 56.0 | 3444 | 0.1722 | 0.0886 | 0.0270 |
0.129 | 56.99 | 3505 | 0.1718 | 0.0899 | 0.0267 |
0.129 | 58.0 | 3567 | 0.1700 | 0.0861 | 0.0263 |
0.1435 | 58.99 | 3628 | 0.1705 | 0.0845 | 0.0261 |
0.1435 | 60.0 | 3690 | 0.1683 | 0.0886 | 0.0265 |
0.1389 | 60.99 | 3751 | 0.1702 | 0.0845 | 0.0261 |
0.1295 | 62.0 | 3813 | 0.1731 | 0.0853 | 0.0261 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3