# wav2vec2-large-xlsr-mec-ita-coraa-portuguese-all-09
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1518
- Wer: 0.0933
- Cer: 0.0305
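
The snippet below is a minimal inference sketch, not part of the original card: it loads the checkpoint with the 🤗 Transformers ASR pipeline and transcribes an audio file. The repository owner (`<user>`) and the audio path (`sample.wav`) are placeholders.

```python
# Minimal inference sketch (illustrative, not from the original card).
from transformers import pipeline

# Assumption: replace <user> with the actual repository owner on the Hub.
asr = pipeline(
    "automatic-speech-recognition",
    model="<user>/wav2vec2-large-xlsr-mec-ita-coraa-portuguese-all-09",
)

# "sample.wav" is a placeholder; the pipeline decodes and resamples the
# audio to the model's expected sampling rate (16 kHz) via ffmpeg.
print(asr("sample.wav")["text"])
```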
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
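
For reference, the list above maps onto `transformers.TrainingArguments` roughly as sketched below. This is a hedged reconstruction, not the published training script; the `output_dir` value is an assumption, and the Adam betas/epsilon match the library defaults.

```python
# Hedged reconstruction of the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mec-ita-coraa-portuguese-all-09",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
)
```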
### Training results

Training was configured for 100 epochs, but the log below stops at epoch 75. The evaluation results reported above match the epoch-55 checkpoint (the lowest validation loss), which suggests the best checkpoint was the one kept.

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
29.5822 | 1.0 | 86 | 3.2519 | 1.0 | 1.0 |
7.4163 | 2.0 | 172 | 2.9510 | 1.0 | 1.0 |
3.0543 | 3.0 | 258 | 2.9271 | 1.0 | 1.0 |
2.9256 | 4.0 | 344 | 2.8872 | 1.0 | 1.0 |
2.8789 | 5.0 | 430 | 2.4985 | 0.9978 | 0.9154 |
2.252 | 6.0 | 516 | 0.8675 | 0.6029 | 0.1494 |
1.0699 | 7.0 | 602 | 0.4857 | 0.2424 | 0.0681 |
1.0699 | 8.0 | 688 | 0.3746 | 0.1884 | 0.0568 |
0.7411 | 9.0 | 774 | 0.3188 | 0.1704 | 0.0520 |
0.5815 | 10.0 | 860 | 0.2876 | 0.1653 | 0.0497 |
0.5181 | 11.0 | 946 | 0.2563 | 0.1513 | 0.0467 |
0.4439 | 12.0 | 1032 | 0.2341 | 0.1504 | 0.0448 |
0.3882 | 13.0 | 1118 | 0.2330 | 0.1441 | 0.0457 |
0.3642 | 14.0 | 1204 | 0.2141 | 0.1328 | 0.0417 |
0.3642 | 15.0 | 1290 | 0.2043 | 0.1186 | 0.0384 |
0.3497 | 16.0 | 1376 | 0.1955 | 0.1178 | 0.0380 |
0.309 | 17.0 | 1462 | 0.1921 | 0.1145 | 0.0373 |
0.291 | 18.0 | 1548 | 0.1869 | 0.1157 | 0.0361 |
0.2904 | 19.0 | 1634 | 0.1778 | 0.1111 | 0.0358 |
0.2961 | 20.0 | 1720 | 0.1739 | 0.1067 | 0.0352 |
0.2796 | 21.0 | 1806 | 0.1736 | 0.1108 | 0.0360 |
0.2796 | 22.0 | 1892 | 0.1685 | 0.1094 | 0.0355 |
0.2666 | 23.0 | 1978 | 0.1707 | 0.1092 | 0.0355 |
0.2404 | 24.0 | 2064 | 0.1724 | 0.1125 | 0.0358 |
0.2284 | 25.0 | 2150 | 0.1677 | 0.1077 | 0.0349 |
0.2254 | 26.0 | 2236 | 0.1650 | 0.1084 | 0.0342 |
0.2071 | 27.0 | 2322 | 0.1621 | 0.1010 | 0.0335 |
0.2107 | 28.0 | 2408 | 0.1613 | 0.1027 | 0.0337 |
0.2107 | 29.0 | 2494 | 0.1646 | 0.1048 | 0.0339 |
0.2133 | 30.0 | 2580 | 0.1639 | 0.0995 | 0.0323 |
0.2032 | 31.0 | 2666 | 0.1624 | 0.1002 | 0.0327 |
0.216 | 32.0 | 2752 | 0.1610 | 0.1017 | 0.0328 |
0.2075 | 33.0 | 2838 | 0.1593 | 0.0986 | 0.0319 |
0.1946 | 34.0 | 2924 | 0.1617 | 0.0983 | 0.0316 |
0.1731 | 35.0 | 3010 | 0.1626 | 0.0993 | 0.0320 |
0.1731 | 36.0 | 3096 | 0.1586 | 0.0998 | 0.0324 |
0.1763 | 37.0 | 3182 | 0.1594 | 0.0986 | 0.0318 |
0.1982 | 38.0 | 3268 | 0.1570 | 0.0993 | 0.0319 |
0.1877 | 39.0 | 3354 | 0.1560 | 0.0993 | 0.0316 |
0.1674 | 40.0 | 3440 | 0.1560 | 0.0986 | 0.0318 |
0.1582 | 41.0 | 3526 | 0.1548 | 0.0995 | 0.0314 |
0.1653 | 42.0 | 3612 | 0.1561 | 0.0957 | 0.0308 |
0.1653 | 43.0 | 3698 | 0.1570 | 0.0969 | 0.0311 |
0.1572 | 44.0 | 3784 | 0.1553 | 0.0961 | 0.0310 |
0.1693 | 45.0 | 3870 | 0.1568 | 0.0952 | 0.0308 |
0.1651 | 46.0 | 3956 | 0.1587 | 0.1000 | 0.0318 |
0.1468 | 47.0 | 4042 | 0.1567 | 0.1012 | 0.0324 |
0.1592 | 48.0 | 4128 | 0.1572 | 0.0976 | 0.0313 |
0.1556 | 49.0 | 4214 | 0.1535 | 0.0949 | 0.0306 |
0.1431 | 50.0 | 4300 | 0.1560 | 0.0978 | 0.0314 |
0.1431 | 51.0 | 4386 | 0.1547 | 0.0937 | 0.0304 |
0.1461 | 52.0 | 4472 | 0.1542 | 0.0969 | 0.0304 |
0.1463 | 53.0 | 4558 | 0.1536 | 0.0954 | 0.0307 |
0.1447 | 54.0 | 4644 | 0.1561 | 0.0925 | 0.0301 |
0.1391 | 55.0 | 4730 | 0.1518 | 0.0933 | 0.0305 |
0.1422 | 56.0 | 4816 | 0.1558 | 0.0978 | 0.0311 |
0.1415 | 57.0 | 4902 | 0.1520 | 0.0945 | 0.0304 |
0.1415 | 58.0 | 4988 | 0.1555 | 0.0978 | 0.0310 |
0.1323 | 59.0 | 5074 | 0.1574 | 0.0933 | 0.0299 |
0.1294 | 60.0 | 5160 | 0.1569 | 0.0942 | 0.0304 |
0.1296 | 61.0 | 5246 | 0.1539 | 0.0945 | 0.0305 |
0.1349 | 62.0 | 5332 | 0.1533 | 0.0896 | 0.0298 |
0.1303 | 63.0 | 5418 | 0.1537 | 0.0906 | 0.0299 |
0.1426 | 64.0 | 5504 | 0.1551 | 0.0884 | 0.0293 |
0.1426 | 65.0 | 5590 | 0.1569 | 0.0923 | 0.0301 |
0.1231 | 66.0 | 5676 | 0.1553 | 0.0882 | 0.0295 |
0.1338 | 67.0 | 5762 | 0.1583 | 0.0889 | 0.0296 |
0.1249 | 68.0 | 5848 | 0.1567 | 0.0913 | 0.0300 |
0.1289 | 69.0 | 5934 | 0.1577 | 0.0933 | 0.0306 |
0.1228 | 70.0 | 6020 | 0.1537 | 0.0916 | 0.0304 |
0.1372 | 71.0 | 6106 | 0.1533 | 0.0930 | 0.0302 |
0.1372 | 72.0 | 6192 | 0.1562 | 0.0920 | 0.0305 |
0.1211 | 73.0 | 6278 | 0.1571 | 0.0906 | 0.0301 |
0.1211 | 74.0 | 6364 | 0.1557 | 0.0916 | 0.0302 |
0.1185 | 75.0 | 6450 | 0.1572 | 0.0889 | 0.0304 |
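
The Wer and Cer columns are word and character error rates (lower is better). A sketch of how such scores are typically computed with the 🤗 `evaluate` library follows; the reference and prediction strings are invented placeholders, not data from this model's evaluation set.

```python
# Illustrative WER/CER computation; the strings below are placeholders.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["o gato subiu no telhado"]
predictions = ["o gato subiu no telhado"]  # a perfect prediction scores 0.0

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```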
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3