# wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-2-3-5

This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1374
- Wer: 0.0836
- Cer: 0.0233
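Wer and Cer are the word and character error rates (lower is better). They were presumably computed with a standard evaluation library; purely as an illustration, both reduce to an edit distance normalized by reference length:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences (single-row DP)."""
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            # deletion, insertion, substitution/match
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (r != h))
    return d[-1]

def wer(reference, hypothesis):
    """Word error rate: word-level edit distance over reference word count."""
    ref = reference.split()
    return edit_distance(ref, hypothesis.split()) / len(ref)

def cer(reference, hypothesis):
    """Character error rate: the same computation over characters."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)

print(wer("o gato subiu no telhado", "o gato subiu telhado"))  # one deleted word out of five -> 0.2
```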
## Model description
More information needed
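No description was provided, but the base checkpoint is a wav2vec2 acoustic model fine-tuned for Portuguese speech recognition with a CTC head, so transcriptions come from decoding per-frame token predictions. As background only, a minimal sketch of greedy CTC decoding (the vocabulary and blank id here are illustrative, not this model's actual tokenizer):

```python
def ctc_greedy_decode(frame_ids, blank_id=0):
    """Greedy CTC decoding: collapse consecutive repeats, then drop blanks."""
    decoded, prev = [], None
    for t in frame_ids:
        if t != prev and t != blank_id:
            decoded.append(t)
        prev = t
    return decoded

# Illustrative vocabulary; the real model's tokenizer defines its own.
vocab = {0: "<pad>", 1: "g", 2: "a", 3: "t", 4: "o"}
frames = [0, 1, 1, 0, 2, 3, 3, 4, 4, 0]
print("".join(vocab[i] for i in ctc_greedy_decode(frames)))  # -> gato
```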
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
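The effective batch size follows from the per-device batch size and gradient accumulation, and the per-epoch step counts in the results table are consistent with it; a quick check of the arithmetic:

```python
train_batch_size = 16             # per-device training batch size
gradient_accumulation_steps = 2

# Gradients are accumulated over 2 forward passes before each optimizer
# step, so one optimizer step sees 16 * 2 = 32 examples.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)     # -> 32, matching the reported value

steps_per_epoch = 61              # epoch 1 ends at step 61 in the results table
approx_train_examples = steps_per_epoch * total_train_batch_size
print(approx_train_examples)      # ~1952 examples (upper bound; the last batch may be partial)
```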
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 33.6318       | 1.0   | 61   | 3.4758          | 1.0    | 1.0    |
| 7.6296        | 2.0   | 122  | 3.0704          | 1.0    | 1.0    |
| 7.6296        | 3.0   | 183  | 2.9342          | 1.0    | 1.0    |
| 3.0164        | 4.0   | 244  | 2.8933          | 1.0    | 1.0    |
| 2.9234        | 5.0   | 305  | 2.8754          | 1.0    | 1.0    |
| 2.9234        | 6.0   | 366  | 2.5570          | 1.0    | 1.0    |
| 2.7136        | 7.0   | 427  | 1.0499          | 0.6659 | 0.1559 |
| 2.7136        | 8.0   | 488  | 0.5290          | 0.2535 | 0.0639 |
| 1.3148        | 9.0   | 549  | 0.3913          | 0.1874 | 0.0491 |
| 0.6976        | 10.0  | 610  | 0.3171          | 0.1568 | 0.0416 |
| 0.6976        | 11.0  | 671  | 0.2813          | 0.1478 | 0.0386 |
| 0.5285        | 12.0  | 732  | 0.2586          | 0.1355 | 0.0373 |
| 0.5285        | 13.0  | 793  | 0.2386          | 0.1254 | 0.0350 |
| 0.424         | 14.0  | 854  | 0.2157          | 0.1206 | 0.0331 |
| 0.37          | 15.0  | 915  | 0.2050          | 0.1086 | 0.0296 |
| 0.37          | 16.0  | 976  | 0.1858          | 0.1060 | 0.0292 |
| 0.3379        | 17.0  | 1037 | 0.1888          | 0.1012 | 0.0282 |
| 0.3379        | 18.0  | 1098 | 0.1880          | 0.1008 | 0.0281 |
| 0.3195        | 19.0  | 1159 | 0.1868          | 0.0978 | 0.0274 |
| 0.2621        | 20.0  | 1220 | 0.1823          | 0.0929 | 0.0265 |
| 0.2621        | 21.0  | 1281 | 0.1761          | 0.0911 | 0.0267 |
| 0.2835        | 22.0  | 1342 | 0.1700          | 0.0918 | 0.0269 |
| 0.2365        | 23.0  | 1403 | 0.1696          | 0.0944 | 0.0275 |
| 0.2365        | 24.0  | 1464 | 0.1645          | 0.0911 | 0.0270 |
| 0.2357        | 25.0  | 1525 | 0.1578          | 0.0900 | 0.0264 |
| 0.2357        | 26.0  | 1586 | 0.1624          | 0.0911 | 0.0266 |
| 0.2238        | 27.0  | 1647 | 0.1623          | 0.0915 | 0.0263 |
| 0.2264        | 28.0  | 1708 | 0.1612          | 0.0959 | 0.0271 |
| 0.2264        | 29.0  | 1769 | 0.1585          | 0.0896 | 0.0260 |
| 0.2071        | 30.0  | 1830 | 0.1484          | 0.0956 | 0.0270 |
| 0.2071        | 31.0  | 1891 | 0.1449          | 0.0888 | 0.0255 |
| 0.1883        | 32.0  | 1952 | 0.1493          | 0.0885 | 0.0258 |
| 0.1963        | 33.0  | 2013 | 0.1524          | 0.0903 | 0.0263 |
| 0.1963        | 34.0  | 2074 | 0.1566          | 0.0859 | 0.0252 |
| 0.2055        | 35.0  | 2135 | 0.1494          | 0.0881 | 0.0252 |
| 0.2055        | 36.0  | 2196 | 0.1484          | 0.0900 | 0.0262 |
| 0.1797        | 37.0  | 2257 | 0.1529          | 0.0870 | 0.0250 |
| 0.1826        | 38.0  | 2318 | 0.1502          | 0.0877 | 0.0258 |
| 0.1826        | 39.0  | 2379 | 0.1457          | 0.0862 | 0.0247 |
| 0.1681        | 40.0  | 2440 | 0.1404          | 0.0840 | 0.0248 |
| 0.1727        | 41.0  | 2501 | 0.1442          | 0.0844 | 0.0250 |
| 0.1727        | 42.0  | 2562 | 0.1381          | 0.0870 | 0.0253 |
| 0.1679        | 43.0  | 2623 | 0.1480          | 0.0877 | 0.0254 |
| 0.1679        | 44.0  | 2684 | 0.1483          | 0.0862 | 0.0247 |
| 0.1552        | 45.0  | 2745 | 0.1426          | 0.0836 | 0.0235 |
| 0.1712        | 46.0  | 2806 | 0.1425          | 0.0859 | 0.0238 |
| 0.1712        | 47.0  | 2867 | 0.1374          | 0.0836 | 0.0233 |
| 0.1648        | 48.0  | 2928 | 0.1507          | 0.0844 | 0.0240 |
| 0.1648        | 49.0  | 2989 | 0.1417          | 0.0836 | 0.0240 |
| 0.1369        | 50.0  | 3050 | 0.1423          | 0.0851 | 0.0247 |
| 0.1528        | 51.0  | 3111 | 0.1434          | 0.0896 | 0.0252 |
| 0.1528        | 52.0  | 3172 | 0.1504          | 0.0900 | 0.0254 |
| 0.1453        | 53.0  | 3233 | 0.1550          | 0.0844 | 0.0247 |
| 0.1453        | 54.0  | 3294 | 0.1451          | 0.0873 | 0.0257 |
| 0.1646        | 55.0  | 3355 | 0.1544          | 0.0892 | 0.0255 |
| 0.1384        | 56.0  | 3416 | 0.1568          | 0.0892 | 0.0256 |
| 0.1384        | 57.0  | 3477 | 0.1550          | 0.0859 | 0.0250 |
| 0.1362        | 58.0  | 3538 | 0.1551          | 0.0862 | 0.0252 |
| 0.1362        | 59.0  | 3599 | 0.1512          | 0.0832 | 0.0245 |
| 0.1377        | 60.0  | 3660 | 0.1545          | 0.0866 | 0.0251 |
| 0.1354        | 61.0  | 3721 | 0.1453          | 0.0844 | 0.0245 |
| 0.1354        | 62.0  | 3782 | 0.1438          | 0.0840 | 0.0247 |
| 0.1302        | 63.0  | 3843 | 0.1403          | 0.0873 | 0.0249 |
| 0.1354        | 64.0  | 3904 | 0.1429          | 0.0847 | 0.0243 |
| 0.1354        | 65.0  | 3965 | 0.1447          | 0.0836 | 0.0240 |
| 0.1355        | 66.0  | 4026 | 0.1493          | 0.0829 | 0.0242 |
| 0.1355        | 67.0  | 4087 | 0.1502          | 0.0836 | 0.0243 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3