# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-10

This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1723
- Wer: 0.1003
- Cer: 0.0337
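Both WER and CER are edit-distance metrics: the number of insertions, deletions, and substitutions needed to turn the hypothesis into the reference, divided by the reference length (in words or characters). A minimal sketch of the computation — the actual evaluation likely used a library such as `jiwer` or `evaluate`, so this is illustrative only:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (single-row DP)."""
    n = len(hyp)
    dp = list(range(n + 1))
    for i in range(1, len(ref) + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,                        # deletion
                        dp[j - 1] + 1,                    # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return dp[n]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance over reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edit distance over reference length."""
    return edit_distance(reference, hypothesis) / len(reference)
```

For example, one substituted word in a three-word reference gives a WER of 1/3 ≈ 0.33; the 0.1003 WER above means roughly one word error per ten reference words.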
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
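The hyperparameters above are related: the total train batch size is the per-device batch size times the gradient accumulation steps, and a linear scheduler decays the learning rate to zero over the full run. A small sketch using the values from this card (warmup of 0 is an assumption; the actual Trainer run may have configured warmup steps):

```python
# Values copied from the hyperparameter list above.
train_batch_size = 16
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 32

learning_rate = 3e-05
steps_per_epoch = 86   # optimizer steps per epoch, from the results table below
num_epochs = 100
total_steps = steps_per_epoch * num_epochs

def linear_lr(step: int) -> float:
    """Linear schedule with no warmup: decay from learning_rate to 0."""
    return learning_rate * max(0.0, 1.0 - step / total_steps)
```

Halfway through training (`step == total_steps // 2`) the learning rate would be half the initial value, 1.5e-05.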
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:---:|:---:|:---:|:---:|:---:|:---:|
27.9102 | 1.0 | 86 | 8.9936 | 0.9998 | 0.9850 |
12.3774 | 2.0 | 172 | 4.2002 | 1.0 | 0.9765 |
6.1033 | 3.0 | 258 | 2.9777 | 1.0 | 1.0 |
3.0498 | 4.0 | 344 | 2.9197 | 1.0 | 1.0 |
2.9326 | 5.0 | 430 | 2.9070 | 1.0 | 1.0 |
2.9013 | 6.0 | 516 | 2.9092 | 1.0 | 1.0 |
2.8884 | 7.0 | 602 | 2.8986 | 1.0 | 1.0 |
2.8884 | 8.0 | 688 | 2.7090 | 1.0 | 0.9999 |
2.846 | 9.0 | 774 | 1.6263 | 1.0 | 0.5386 |
2.2318 | 10.0 | 860 | 0.5529 | 0.2735 | 0.0770 |
1.0214 | 11.0 | 946 | 0.3765 | 0.1925 | 0.0586 |
0.6611 | 12.0 | 1032 | 0.3123 | 0.1703 | 0.0514 |
0.4931 | 13.0 | 1118 | 0.2772 | 0.1523 | 0.0467 |
0.4232 | 14.0 | 1204 | 0.2599 | 0.1284 | 0.0426 |
0.4232 | 15.0 | 1290 | 0.2462 | 0.1216 | 0.0413 |
0.375 | 16.0 | 1376 | 0.2336 | 0.1178 | 0.0410 |
0.3507 | 17.0 | 1462 | 0.2269 | 0.1147 | 0.0406 |
0.3264 | 18.0 | 1548 | 0.2186 | 0.1137 | 0.0394 |
0.3115 | 19.0 | 1634 | 0.2166 | 0.1106 | 0.0386 |
0.2841 | 20.0 | 1720 | 0.2069 | 0.1087 | 0.0379 |
0.2786 | 21.0 | 1806 | 0.2035 | 0.1073 | 0.0381 |
0.2786 | 22.0 | 1892 | 0.2042 | 0.1097 | 0.0374 |
0.2382 | 23.0 | 1978 | 0.1986 | 0.1051 | 0.0364 |
0.2406 | 24.0 | 2064 | 0.1943 | 0.1066 | 0.0368 |
0.2573 | 25.0 | 2150 | 0.1902 | 0.1008 | 0.0359 |
0.2336 | 26.0 | 2236 | 0.1896 | 0.1030 | 0.0360 |
0.2089 | 27.0 | 2322 | 0.1958 | 0.1063 | 0.0368 |
0.2173 | 28.0 | 2408 | 0.1922 | 0.1070 | 0.0370 |
0.2173 | 29.0 | 2494 | 0.1913 | 0.1042 | 0.0366 |
0.2147 | 30.0 | 2580 | 0.1885 | 0.1006 | 0.0357 |
0.2087 | 31.0 | 2666 | 0.1850 | 0.1044 | 0.0359 |
0.1838 | 32.0 | 2752 | 0.1855 | 0.1054 | 0.0365 |
0.1863 | 33.0 | 2838 | 0.1805 | 0.1003 | 0.0355 |
0.1945 | 34.0 | 2924 | 0.1846 | 0.1003 | 0.0351 |
0.1774 | 35.0 | 3010 | 0.1814 | 0.1003 | 0.0353 |
0.1774 | 36.0 | 3096 | 0.1803 | 0.0984 | 0.0344 |
0.1887 | 37.0 | 3182 | 0.1752 | 0.1006 | 0.0344 |
0.164 | 38.0 | 3268 | 0.1812 | 0.1006 | 0.0347 |
0.1829 | 39.0 | 3354 | 0.1777 | 0.0989 | 0.0342 |
0.1838 | 40.0 | 3440 | 0.1793 | 0.1015 | 0.0349 |
0.1574 | 41.0 | 3526 | 0.1773 | 0.1006 | 0.0347 |
0.1597 | 42.0 | 3612 | 0.1723 | 0.1003 | 0.0337 |
0.1597 | 43.0 | 3698 | 0.1757 | 0.0989 | 0.0343 |
0.1548 | 44.0 | 3784 | 0.1727 | 0.1006 | 0.0352 |
0.1651 | 45.0 | 3870 | 0.1789 | 0.0994 | 0.0345 |
0.1604 | 46.0 | 3956 | 0.1762 | 0.0989 | 0.0334 |
0.1366 | 47.0 | 4042 | 0.1759 | 0.0953 | 0.0327 |
0.153 | 48.0 | 4128 | 0.1762 | 0.0982 | 0.0341 |
0.1608 | 49.0 | 4214 | 0.1803 | 0.0972 | 0.0344 |
0.1429 | 50.0 | 4300 | 0.1772 | 0.0965 | 0.0342 |
0.1429 | 51.0 | 4386 | 0.1802 | 0.0989 | 0.0345 |
0.1382 | 52.0 | 4472 | 0.1788 | 0.0963 | 0.0341 |
0.1408 | 53.0 | 4558 | 0.1785 | 0.0960 | 0.0342 |
0.1267 | 54.0 | 4644 | 0.1789 | 0.0979 | 0.0342 |
0.1531 | 55.0 | 4730 | 0.1786 | 0.0960 | 0.0336 |
0.1357 | 56.0 | 4816 | 0.1784 | 0.0943 | 0.0335 |
0.146 | 57.0 | 4902 | 0.1811 | 0.0958 | 0.0337 |
0.146 | 58.0 | 4988 | 0.1813 | 0.0951 | 0.0338 |
0.1363 | 59.0 | 5074 | 0.1813 | 0.0977 | 0.0340 |
0.1366 | 60.0 | 5160 | 0.1785 | 0.0991 | 0.0345 |
0.1502 | 61.0 | 5246 | 0.1789 | 0.0936 | 0.0333 |
0.1261 | 62.0 | 5332 | 0.1771 | 0.0984 | 0.0339 |
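Note that the headline metrics at the top of the card match the epoch-42 row, which has the lowest validation loss, rather than the final epoch-62 row. Selecting the best checkpoint this way mirrors the Trainer's `load_best_model_at_end` behavior with loss as the selection metric — an assumption about this run, not something stated in the card. A sketch over a few rows copied from the table:

```python
# (epoch, validation_loss, wer, cer) — a subset of the results table above.
results = [
    (41, 0.1773, 0.1006, 0.0347),
    (42, 0.1723, 0.1003, 0.0337),
    (43, 0.1757, 0.0989, 0.0343),
    (61, 0.1789, 0.0936, 0.0333),
    (62, 0.1771, 0.0984, 0.0339),
]

# Pick the checkpoint with the lowest validation loss
# (assumed: metric_for_best_model="loss", greater_is_better=False).
best_epoch, best_loss, best_wer, best_cer = min(results, key=lambda r: r[1])
```

Note that a WER-based selection would pick a different epoch (epoch 61 has the lowest WER in this subset), which is why the reported WER of 0.1003 is not the minimum in the table.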
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3