# wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-2-3
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1753
- Wer: 0.0862
- Cer: 0.0267
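The Wer and Cer values above are word error rate and character error rate: Levenshtein edit distance over words or characters, normalized by the reference length. Real evaluations typically use a library such as `jiwer` or `evaluate`; the following is only a minimal stdlib sketch of the definition, not the script used for this card.

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (words or characters)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (r != h)))  # substitution
        prev = cur
    return prev[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(reference, hypothesis) / len(reference)
```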
## Model description
More information needed
## Intended uses & limitations
More information needed
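As a hedged sketch (not part of the original card): the checkpoint is a CTC wav2vec2 model, so it should be usable for Portuguese speech recognition through the standard `transformers` ASR pipeline. The model id and audio path below are illustrative placeholders; point them at the actual checkpoint and a 16 kHz audio file.

```python
def transcribe(audio_path: str,
               model_id: str = "wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-2-3") -> str:
    """Transcribe an audio file with the fine-tuned checkpoint.

    `model_id` is a placeholder: replace it with the local checkpoint
    directory or full Hub repository id. Requires `transformers` plus an
    audio backend (e.g. ffmpeg) for decoding.
    """
    from transformers import pipeline  # deferred so the sketch imports cleanly
    asr = pipeline("automatic-speech-recognition", model=model_id)
    return asr(audio_path)["text"]
```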
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
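Assuming the standard 🤗 `Trainer` was used (the card does not include the training script), the hyperparameters above map roughly onto a `TrainingArguments` configuration like this sketch:

```python
def make_training_args(output_dir: str = "wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-2-3"):
    """Sketch of a TrainingArguments matching the hyperparameters listed above.

    Adam betas (0.9, 0.999) and epsilon 1e-08 are the transformers defaults,
    so they need no explicit arguments here.
    """
    from transformers import TrainingArguments  # deferred heavy import
    return TrainingArguments(
        output_dir=output_dir,
        learning_rate=3e-5,
        per_device_train_batch_size=16,
        per_device_eval_batch_size=8,
        gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
        seed=42,
        lr_scheduler_type="linear",
        num_train_epochs=100,
        fp16=True,  # Native AMP mixed-precision training
    )
```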
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
36.0093 | 0.99 | 44 | 9.9437 | 0.9994 | 0.9836 |
36.0093 | 2.0 | 89 | 8.4632 | 1.0 | 0.9729 |
13.7616 | 2.99 | 133 | 6.9727 | 0.9994 | 0.9728 |
13.7616 | 4.0 | 178 | 3.3738 | 1.0 | 1.0 |
6.4758 | 4.99 | 222 | 3.0237 | 1.0 | 1.0 |
6.4758 | 6.0 | 267 | 2.9331 | 1.0 | 1.0 |
3.0635 | 6.99 | 311 | 2.9115 | 1.0 | 1.0 |
3.0635 | 8.0 | 356 | 2.8971 | 1.0 | 1.0 |
2.9319 | 8.99 | 400 | 2.9100 | 1.0 | 1.0 |
2.9319 | 10.0 | 445 | 2.8852 | 1.0 | 1.0 |
2.9319 | 10.99 | 489 | 2.8903 | 1.0 | 1.0 |
2.9173 | 12.0 | 534 | 2.8956 | 1.0 | 1.0 |
2.9173 | 12.99 | 578 | 2.8790 | 1.0 | 1.0 |
2.9067 | 14.0 | 623 | 2.8663 | 1.0 | 1.0 |
2.9067 | 14.99 | 667 | 2.8470 | 1.0 | 0.9998 |
2.8703 | 16.0 | 712 | 2.7137 | 1.0 | 0.9986 |
2.8703 | 16.99 | 756 | 2.4736 | 0.9994 | 0.9969 |
2.6103 | 18.0 | 801 | 2.0185 | 1.0 | 0.8277 |
2.6103 | 18.99 | 845 | 1.2017 | 1.0 | 0.3969 |
2.6103 | 20.0 | 890 | 0.7261 | 0.9883 | 0.2400 |
1.5518 | 20.99 | 934 | 0.4761 | 0.2135 | 0.0540 |
1.5518 | 22.0 | 979 | 0.3738 | 0.1918 | 0.0493 |
0.7306 | 22.99 | 1023 | 0.3266 | 0.1601 | 0.0427 |
0.7306 | 24.0 | 1068 | 0.2982 | 0.1440 | 0.0386 |
0.4936 | 24.99 | 1112 | 0.2840 | 0.1295 | 0.0366 |
0.4936 | 26.0 | 1157 | 0.2673 | 0.1329 | 0.0362 |
0.4068 | 26.99 | 1201 | 0.2546 | 0.1106 | 0.0323 |
0.4068 | 28.0 | 1246 | 0.2390 | 0.1123 | 0.0324 |
0.4068 | 28.99 | 1290 | 0.2367 | 0.1084 | 0.0316 |
0.3391 | 30.0 | 1335 | 0.2301 | 0.1078 | 0.0314 |
0.3391 | 30.99 | 1379 | 0.2266 | 0.1078 | 0.0319 |
0.3141 | 32.0 | 1424 | 0.2189 | 0.1084 | 0.0314 |
0.3141 | 32.99 | 1468 | 0.2175 | 0.1056 | 0.0310 |
0.293 | 34.0 | 1513 | 0.2145 | 0.1073 | 0.0324 |
0.293 | 34.99 | 1557 | 0.2113 | 0.1023 | 0.0306 |
0.2564 | 36.0 | 1602 | 0.2053 | 0.1051 | 0.0317 |
0.2564 | 36.99 | 1646 | 0.2045 | 0.1045 | 0.0300 |
0.2564 | 38.0 | 1691 | 0.2116 | 0.0984 | 0.0304 |
0.2458 | 38.99 | 1735 | 0.1946 | 0.0984 | 0.0292 |
0.2458 | 40.0 | 1780 | 0.2014 | 0.0951 | 0.0291 |
0.2284 | 40.99 | 1824 | 0.1946 | 0.0962 | 0.0290 |
0.2284 | 42.0 | 1869 | 0.1926 | 0.0978 | 0.0285 |
0.2069 | 42.99 | 1913 | 0.1910 | 0.0984 | 0.0280 |
0.2069 | 44.0 | 1958 | 0.1908 | 0.0973 | 0.0285 |
0.216 | 44.99 | 2002 | 0.2017 | 0.0951 | 0.0287 |
0.216 | 46.0 | 2047 | 0.1900 | 0.0917 | 0.0277 |
0.216 | 46.99 | 2091 | 0.1887 | 0.0906 | 0.0276 |
0.1947 | 48.0 | 2136 | 0.1868 | 0.0912 | 0.0279 |
0.1947 | 48.99 | 2180 | 0.1936 | 0.0945 | 0.0281 |
0.2015 | 50.0 | 2225 | 0.1903 | 0.0917 | 0.0278 |
0.2015 | 50.99 | 2269 | 0.1797 | 0.0889 | 0.0266 |
0.1845 | 52.0 | 2314 | 0.1859 | 0.0895 | 0.0275 |
0.1845 | 52.99 | 2358 | 0.1868 | 0.0934 | 0.0281 |
0.1845 | 54.0 | 2403 | 0.1895 | 0.0912 | 0.0275 |
0.1845 | 54.99 | 2447 | 0.1862 | 0.0945 | 0.0280 |
0.1845 | 56.0 | 2492 | 0.1858 | 0.0862 | 0.0269 |
0.1875 | 56.99 | 2536 | 0.1821 | 0.0928 | 0.0280 |
0.1875 | 58.0 | 2581 | 0.1873 | 0.0901 | 0.0279 |
0.1563 | 58.99 | 2625 | 0.1833 | 0.0923 | 0.0277 |
0.1563 | 60.0 | 2670 | 0.1854 | 0.0912 | 0.0279 |
0.1605 | 60.99 | 2714 | 0.1889 | 0.0906 | 0.0273 |
0.1605 | 62.0 | 2759 | 0.1843 | 0.0884 | 0.0275 |
0.1744 | 62.99 | 2803 | 0.1849 | 0.0901 | 0.0280 |
0.1744 | 64.0 | 2848 | 0.1821 | 0.0895 | 0.0278 |
0.1744 | 64.99 | 2892 | 0.1842 | 0.0923 | 0.0279 |
0.1541 | 66.0 | 2937 | 0.1825 | 0.0873 | 0.0269 |
0.1541 | 66.99 | 2981 | 0.1825 | 0.0856 | 0.0265 |
0.139 | 68.0 | 3026 | 0.1815 | 0.0884 | 0.0272 |
0.139 | 68.99 | 3070 | 0.1812 | 0.0862 | 0.0266 |
0.1402 | 70.0 | 3115 | 0.1797 | 0.0867 | 0.0267 |
0.1402 | 70.99 | 3159 | 0.1818 | 0.0873 | 0.0271 |
0.1481 | 72.0 | 3204 | 0.1808 | 0.0856 | 0.0260 |
0.1481 | 72.99 | 3248 | 0.1790 | 0.0867 | 0.0265 |
0.1481 | 74.0 | 3293 | 0.1796 | 0.0867 | 0.0267 |
0.1439 | 74.99 | 3337 | 0.1826 | 0.0850 | 0.0262 |
0.1439 | 76.0 | 3382 | 0.1811 | 0.0839 | 0.0260 |
0.1432 | 76.99 | 3426 | 0.1788 | 0.0884 | 0.0264 |
0.1432 | 78.0 | 3471 | 0.1753 | 0.0862 | 0.0267 |
0.1464 | 78.99 | 3515 | 0.1789 | 0.0867 | 0.0264 |
0.1464 | 80.0 | 3560 | 0.1803 | 0.0839 | 0.0264 |
0.1336 | 80.99 | 3604 | 0.1795 | 0.0889 | 0.0277 |
0.1336 | 82.0 | 3649 | 0.1824 | 0.0867 | 0.0270 |
0.1336 | 82.99 | 3693 | 0.1821 | 0.0856 | 0.0269 |
0.1344 | 84.0 | 3738 | 0.1825 | 0.0850 | 0.0269 |
0.1344 | 84.99 | 3782 | 0.1820 | 0.0839 | 0.0267 |
0.1251 | 86.0 | 3827 | 0.1787 | 0.0850 | 0.0268 |
0.1251 | 86.99 | 3871 | 0.1787 | 0.0878 | 0.0271 |
0.1211 | 88.0 | 3916 | 0.1818 | 0.0867 | 0.0267 |
0.1211 | 88.99 | 3960 | 0.1820 | 0.0845 | 0.0265 |
0.1344 | 90.0 | 4005 | 0.1822 | 0.0873 | 0.0271 |
0.1344 | 90.99 | 4049 | 0.1817 | 0.0856 | 0.0268 |
0.1344 | 92.0 | 4094 | 0.1813 | 0.0862 | 0.0272 |
0.131 | 92.99 | 4138 | 0.1813 | 0.0845 | 0.0272 |
0.131 | 94.0 | 4183 | 0.1818 | 0.0862 | 0.0270 |
0.131 | 94.99 | 4227 | 0.1815 | 0.0878 | 0.0270 |
0.131 | 96.0 | 4272 | 0.1811 | 0.0895 | 0.0275 |
0.1191 | 96.99 | 4316 | 0.1812 | 0.0862 | 0.0269 |
0.1191 | 98.0 | 4361 | 0.1819 | 0.0856 | 0.0270 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3