# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-01
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1409
- Wer: 0.0881
- Cer: 0.0252
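Here, Wer and Cer denote word and character error rate. Below is a minimal inference sketch with 🤗 Transformers; the Hub namespace in `model_id` and the `audio.wav` path are placeholders, not details taken from this card:

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id: prefix with the actual Hub namespace.
model_id = "wav2vec2-large-xlsr-mecita-coraa-portuguese-all-01"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2 XLSR checkpoints expect 16 kHz mono audio.
speech, _ = librosa.load("audio.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```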
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
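As a sketch, these hyperparameters map onto 🤗 `TrainingArguments` as shown below; `output_dir` is a placeholder, and the Adam betas and epsilon listed above are already the Trainer defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-all-01",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 16 * 2 = 32 total train batch size
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # native AMP mixed precision
)
```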
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 31.6902 | 1.0 | 86 | 3.2464 | 1.0 | 1.0 |
| 7.8893 | 2.0 | 172 | 3.0260 | 1.0 | 1.0 |
| 3.0632 | 3.0 | 258 | 2.9107 | 1.0 | 1.0 |
| 2.9359 | 4.0 | 344 | 2.9192 | 1.0 | 1.0 |
| 2.8959 | 5.0 | 430 | 2.6958 | 1.0 | 1.0 |
| 2.4935 | 6.0 | 516 | 0.9688 | 0.7493 | 0.1726 |
| 1.1534 | 7.0 | 602 | 0.4912 | 0.2384 | 0.0645 |
| 1.1534 | 8.0 | 688 | 0.3807 | 0.1956 | 0.0539 |
| 0.7254 | 9.0 | 774 | 0.3121 | 0.1735 | 0.0471 |
| 0.5771 | 10.0 | 860 | 0.2682 | 0.1609 | 0.0437 |
| 0.4772 | 11.0 | 946 | 0.2458 | 0.1555 | 0.0418 |
| 0.4056 | 12.0 | 1032 | 0.2293 | 0.1405 | 0.0381 |
| 0.3781 | 13.0 | 1118 | 0.2294 | 0.1336 | 0.0376 |
| 0.3607 | 14.0 | 1204 | 0.2177 | 0.1228 | 0.0350 |
| 0.3607 | 15.0 | 1290 | 0.2010 | 0.1159 | 0.0328 |
| 0.3061 | 16.0 | 1376 | 0.1963 | 0.1134 | 0.0329 |
| 0.3163 | 17.0 | 1462 | 0.1901 | 0.1166 | 0.0323 |
| 0.3313 | 18.0 | 1548 | 0.1837 | 0.1102 | 0.0312 |
| 0.2752 | 19.0 | 1634 | 0.1742 | 0.1078 | 0.0306 |
| 0.2819 | 20.0 | 1720 | 0.1742 | 0.1019 | 0.0293 |
| 0.2445 | 21.0 | 1806 | 0.1705 | 0.1083 | 0.0304 |
| 0.2445 | 22.0 | 1892 | 0.1700 | 0.1026 | 0.0297 |
| 0.2384 | 23.0 | 1978 | 0.1676 | 0.1038 | 0.0294 |
| 0.2468 | 24.0 | 2064 | 0.1627 | 0.1014 | 0.0285 |
| 0.2409 | 25.0 | 2150 | 0.1625 | 0.1011 | 0.0283 |
| 0.236 | 26.0 | 2236 | 0.1605 | 0.0992 | 0.0278 |
| 0.2264 | 27.0 | 2322 | 0.1587 | 0.0977 | 0.0281 |
| 0.2326 | 28.0 | 2408 | 0.1583 | 0.0982 | 0.0280 |
| 0.2326 | 29.0 | 2494 | 0.1592 | 0.0972 | 0.0277 |
| 0.2206 | 30.0 | 2580 | 0.1595 | 0.0987 | 0.0285 |
| 0.2035 | 31.0 | 2666 | 0.1536 | 0.0974 | 0.0276 |
| 0.2156 | 32.0 | 2752 | 0.1544 | 0.0950 | 0.0272 |
| 0.1936 | 33.0 | 2838 | 0.1558 | 0.0945 | 0.0266 |
| 0.1846 | 34.0 | 2924 | 0.1551 | 0.0923 | 0.0262 |
| 0.1865 | 35.0 | 3010 | 0.1535 | 0.0972 | 0.0275 |
| 0.1865 | 36.0 | 3096 | 0.1513 | 0.0979 | 0.0280 |
| 0.1807 | 37.0 | 3182 | 0.1517 | 0.0925 | 0.0268 |
| 0.1745 | 38.0 | 3268 | 0.1496 | 0.0955 | 0.0271 |
| 0.1745 | 39.0 | 3354 | 0.1490 | 0.0952 | 0.0261 |
| 0.1664 | 40.0 | 3440 | 0.1465 | 0.0955 | 0.0268 |
| 0.192 | 41.0 | 3526 | 0.1486 | 0.0965 | 0.0273 |
| 0.1683 | 42.0 | 3612 | 0.1452 | 0.0928 | 0.0266 |
| 0.1683 | 43.0 | 3698 | 0.1458 | 0.0906 | 0.0257 |
| 0.1906 | 44.0 | 3784 | 0.1515 | 0.0920 | 0.0272 |
| 0.1749 | 45.0 | 3870 | 0.1479 | 0.0935 | 0.0271 |
| 0.1563 | 46.0 | 3956 | 0.1477 | 0.0908 | 0.0262 |
| 0.1637 | 47.0 | 4042 | 0.1477 | 0.0930 | 0.0267 |
| 0.1618 | 48.0 | 4128 | 0.1409 | 0.0881 | 0.0252 |
| 0.1636 | 49.0 | 4214 | 0.1438 | 0.0938 | 0.0257 |
| 0.1624 | 50.0 | 4300 | 0.1434 | 0.0898 | 0.0252 |
| 0.1624 | 51.0 | 4386 | 0.1474 | 0.0920 | 0.0257 |
| 0.15 | 52.0 | 4472 | 0.1466 | 0.0883 | 0.0252 |
| 0.1518 | 53.0 | 4558 | 0.1524 | 0.0955 | 0.0265 |
| 0.1486 | 54.0 | 4644 | 0.1468 | 0.0896 | 0.0252 |
| 0.1614 | 55.0 | 4730 | 0.1430 | 0.0906 | 0.0253 |
| 0.1475 | 56.0 | 4816 | 0.1451 | 0.0898 | 0.0257 |
| 0.1421 | 57.0 | 4902 | 0.1432 | 0.0906 | 0.0259 |
| 0.1421 | 58.0 | 4988 | 0.1435 | 0.0881 | 0.0258 |
| 0.1419 | 59.0 | 5074 | 0.1462 | 0.0901 | 0.0259 |
| 0.1433 | 60.0 | 5160 | 0.1432 | 0.0886 | 0.0253 |
| 0.1177 | 61.0 | 5246 | 0.1467 | 0.0871 | 0.0250 |
| 0.1249 | 62.0 | 5332 | 0.1498 | 0.0888 | 0.0257 |
| 0.1332 | 63.0 | 5418 | 0.1493 | 0.0898 | 0.0260 |
| 0.142 | 64.0 | 5504 | 0.1464 | 0.0906 | 0.0258 |
| 0.142 | 65.0 | 5590 | 0.1434 | 0.0906 | 0.0256 |
| 0.1441 | 66.0 | 5676 | 0.1465 | 0.0923 | 0.0261 |
| 0.1252 | 67.0 | 5762 | 0.1449 | 0.0906 | 0.0256 |
| 0.1282 | 68.0 | 5848 | 0.1461 | 0.0896 | 0.0257 |
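Training stopped at epoch 68 of the 100 configured epochs, and the headline metrics match the epoch-48 row (validation loss 0.1409), suggesting the best checkpoint by validation loss was retained. The Wer and Cer columns can be reproduced with the `evaluate` library; the transcript pair below is made up for illustration:

```python
import evaluate

wer = evaluate.load("wer")
cer = evaluate.load("cer")

# Made-up example: one substituted word counts as 1/6 word errors,
# but only 2/28 character edits, so CER is much lower than WER.
references = ["o menino leu o livro inteiro"]
predictions = ["o menino leu um livro inteiro"]

print(wer.compute(predictions=predictions, references=references))  # ~0.167
print(cer.compute(predictions=predictions, references=references))  # ~0.071
```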
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3