# wav2vec2-large-xlsr-mec-ita-coraa-portuguese-all-10
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set (these match the epoch-40 checkpoint in the results table below, suggesting the best rather than the last checkpoint was kept):
- Loss: 0.1731
- WER: 0.0951
- CER: 0.0289
## Model description
More information needed
## Intended uses & limitations
More information needed
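
Pending more complete usage notes, here is a minimal inference sketch. It assumes the model exposes the standard Wav2Vec2 CTC interface of its base model; the hub id (namespace prefix) and audio file path are placeholders:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder: replace with the full "namespace/model-name" hub id of this checkpoint.
model_id = "wav2vec2-large-xlsr-mec-ita-coraa-portuguese-all-10"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an audio file (placeholder path) and resample to the 16 kHz rate
# expected by XLSR-based wav2vec2 models.
waveform, sample_rate = torchaudio.load("audio.wav")
waveform = torchaudio.functional.resample(waveform, sample_rate, 16000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame; batch_decode
# collapses repeats and blanks into the final transcript.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```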
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
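
For reference, a sketch of how the list above maps onto `transformers.TrainingArguments` (the output path, evaluation strategy, and dataset wiring are assumptions; the actual training script is not included in this card):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xlsr-mec-ita-coraa-portuguese-all-10",  # placeholder path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    num_train_epochs=100,
    lr_scheduler_type="linear",
    fp16=True,  # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # assumption, consistent with the per-epoch results below
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the TrainingArguments defaults.
)
```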
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 28.5899       | 1.0   | 86   | 3.2445          | 1.0    | 1.0    |
| 7.5695        | 2.0   | 172  | 2.9771          | 1.0    | 1.0    |
| 3.0711        | 3.0   | 258  | 2.9039          | 1.0    | 1.0    |
| 2.9235        | 4.0   | 344  | 2.8854          | 1.0    | 1.0    |
| 2.89          | 5.0   | 430  | 2.5992          | 1.0    | 1.0    |
| 2.3574        | 6.0   | 516  | 0.8211          | 0.3985 | 0.1061 |
| 1.0258        | 7.0   | 602  | 0.4657          | 0.2145 | 0.0602 |
| 1.0258        | 8.0   | 688  | 0.3749          | 0.1866 | 0.0522 |
| 0.7005        | 9.0   | 774  | 0.3151          | 0.1633 | 0.0465 |
| 0.5395        | 10.0  | 860  | 0.2897          | 0.1609 | 0.0446 |
| 0.455         | 11.0  | 946  | 0.2762          | 0.1466 | 0.0422 |
| 0.4392        | 12.0  | 1032 | 0.2520          | 0.1339 | 0.0385 |
| 0.3718        | 13.0  | 1118 | 0.2440          | 0.1213 | 0.0367 |
| 0.3695        | 14.0  | 1204 | 0.2308          | 0.1161 | 0.0351 |
| 0.3695        | 15.0  | 1290 | 0.2337          | 0.1063 | 0.0325 |
| 0.3303        | 16.0  | 1376 | 0.2242          | 0.1030 | 0.0326 |
| 0.2946        | 17.0  | 1462 | 0.2272          | 0.1094 | 0.0340 |
| 0.2853        | 18.0  | 1548 | 0.2115          | 0.1039 | 0.0326 |
| 0.2617        | 19.0  | 1634 | 0.2169          | 0.1068 | 0.0330 |
| 0.2765        | 20.0  | 1720 | 0.2141          | 0.1041 | 0.0319 |
| 0.2528        | 21.0  | 1806 | 0.2126          | 0.1006 | 0.0320 |
| 0.2528        | 22.0  | 1892 | 0.2047          | 0.1006 | 0.0317 |
| 0.2469        | 23.0  | 1978 | 0.1953          | 0.0984 | 0.0302 |
| 0.2325        | 24.0  | 2064 | 0.2028          | 0.1010 | 0.0310 |
| 0.2316        | 25.0  | 2150 | 0.2001          | 0.0977 | 0.0310 |
| 0.2274        | 26.0  | 2236 | 0.2024          | 0.0968 | 0.0305 |
| 0.2111        | 27.0  | 2322 | 0.2044          | 0.1015 | 0.0304 |
| 0.21          | 28.0  | 2408 | 0.1910          | 0.0946 | 0.0297 |
| 0.21          | 29.0  | 2494 | 0.1805          | 0.1013 | 0.0302 |
| 0.2069        | 30.0  | 2580 | 0.1852          | 0.0968 | 0.0301 |
| 0.1887        | 31.0  | 2666 | 0.1912          | 0.0941 | 0.0295 |
| 0.1928        | 32.0  | 2752 | 0.1929          | 0.0946 | 0.0299 |
| 0.1928        | 33.0  | 2838 | 0.1953          | 0.0929 | 0.0296 |
| 0.1806        | 34.0  | 2924 | 0.1866          | 0.0913 | 0.0284 |
| 0.1818        | 35.0  | 3010 | 0.1786          | 0.0927 | 0.0285 |
| 0.1818        | 36.0  | 3096 | 0.1770          | 0.0927 | 0.0289 |
| 0.1729        | 37.0  | 3182 | 0.1826          | 0.0915 | 0.0279 |
| 0.1768        | 38.0  | 3268 | 0.1785          | 0.0946 | 0.0281 |
| 0.1669        | 39.0  | 3354 | 0.1759          | 0.0946 | 0.0287 |
| 0.1774        | 40.0  | 3440 | 0.1731          | 0.0951 | 0.0289 |
| 0.1642        | 41.0  | 3526 | 0.1770          | 0.0937 | 0.0276 |
| 0.153         | 42.0  | 3612 | 0.1828          | 0.0937 | 0.0282 |
| 0.153         | 43.0  | 3698 | 0.1792          | 0.0949 | 0.0278 |
| 0.1541        | 44.0  | 3784 | 0.1834          | 0.0896 | 0.0273 |
| 0.1654        | 45.0  | 3870 | 0.1832          | 0.0927 | 0.0277 |
| 0.1632        | 46.0  | 3956 | 0.1849          | 0.0903 | 0.0266 |
| 0.154         | 47.0  | 4042 | 0.1773          | 0.0898 | 0.0268 |
| 0.1576        | 48.0  | 4128 | 0.1849          | 0.0901 | 0.0278 |
| 0.1381        | 49.0  | 4214 | 0.1797          | 0.0906 | 0.0269 |
| 0.151         | 50.0  | 4300 | 0.1811          | 0.0898 | 0.0270 |
| 0.151         | 51.0  | 4386 | 0.1827          | 0.0910 | 0.0272 |
| 0.1466        | 52.0  | 4472 | 0.1752          | 0.0915 | 0.0269 |
| 0.1563        | 53.0  | 4558 | 0.1893          | 0.0877 | 0.0268 |
| 0.1378        | 54.0  | 4644 | 0.1827          | 0.0865 | 0.0269 |
| 0.1313        | 55.0  | 4730 | 0.1737          | 0.0846 | 0.0266 |
| 0.148         | 56.0  | 4816 | 0.1786          | 0.0858 | 0.0269 |
| 0.1463        | 57.0  | 4902 | 0.1798          | 0.0863 | 0.0266 |
| 0.1463        | 58.0  | 4988 | 0.1819          | 0.0865 | 0.0266 |
| 0.1305        | 59.0  | 5074 | 0.1809          | 0.0870 | 0.0272 |
| 0.1326        | 60.0  | 5160 | 0.1847          | 0.0865 | 0.0271 |
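
The WER and CER columns above can be recomputed from model outputs with the `evaluate` library (which needs `jiwer` installed); the transcript pairs below are illustrative only, since the evaluation set is not described in this card:

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Illustrative example pairs; a real evaluation would use the held-out set.
predictions = ["o gato dorme no sofá", "ela foi ao mercado"]
references = ["o gato dorme no sofá", "ela foi ao mercado ontem"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```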
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
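
A quick way to confirm a matching environment (package names as listed above):

```python
import datasets
import tokenizers
import torch
import transformers

# Versions this card reports; mismatches can change tokenization or training behavior.
print(transformers.__version__)  # expected: 4.28.1
print(torch.__version__)         # expected: 2.0.0+cu118
print(datasets.__version__)      # expected: 2.12.0
print(tokenizers.__version__)    # expected: 0.13.3
```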