# wav2vec2-large-xls-r-300m-guarani-small-wb
This model is a fine-tuned version of [glob-asr/wav2vec2-large-xls-r-300m-guarani-small](https://huggingface.co/glob-asr/wav2vec2-large-xls-r-300m-guarani-small) on the common_voice dataset. It achieves the following results on the evaluation set:
- Loss: 0.1622
- WER: 0.2446
- CER: 0.0368
## Model description
More information needed
## Intended uses & limitations
More information needed
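Pending fuller documentation, transcription with this checkpoint should follow the standard wav2vec2 ASR pattern. The sketch below uses the Transformers `pipeline` API; the repository id is an assumption based on the model name in this card, and the audio filename is a placeholder.

```python
from transformers import pipeline

# Assumed repository id (derived from this card's title); adjust if the
# checkpoint lives under a different namespace.
asr = pipeline(
    "automatic-speech-recognition",
    model="glob-asr/wav2vec2-large-xls-r-300m-guarani-small-wb",
)

# wav2vec2 models expect 16 kHz mono audio; the pipeline resamples
# file inputs automatically. "example_guarani.wav" is a placeholder.
result = asr("example_guarani.wav")
print(result["text"])
```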
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 20
- mixed_precision_training: Native AMP
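The list above maps onto `TrainingArguments` roughly as follows. This is a reconstruction for reference, not the original training script; argument names follow the Transformers 4.18 API, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-guarani-small-wb",  # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=10,
    num_train_epochs=20,
    fp16=True,                      # "Native AMP" mixed precision
)
```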
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
0.1818 | 0.32 | 10 | 0.1196 | 0.2146 | 0.0305 |
0.2953 | 0.65 | 20 | 0.1801 | 0.3090 | 0.0426 |
0.2941 | 0.97 | 30 | 0.1935 | 0.3090 | 0.0420 |
0.2786 | 1.29 | 40 | 0.1899 | 0.3305 | 0.0483 |
0.2665 | 1.61 | 50 | 0.1716 | 0.3176 | 0.0454 |
0.2752 | 1.94 | 60 | 0.1895 | 0.3948 | 0.0564 |
0.2482 | 2.26 | 70 | 0.1753 | 0.3176 | 0.0449 |
0.2486 | 2.58 | 80 | 0.1501 | 0.2747 | 0.0403 |
0.2878 | 2.9 | 90 | 0.1890 | 0.3348 | 0.0529 |
0.2539 | 3.23 | 100 | 0.2076 | 0.4635 | 0.0610 |
0.2069 | 3.55 | 110 | 0.1711 | 0.3476 | 0.0466 |
0.2262 | 3.87 | 120 | 0.1839 | 0.3605 | 0.0500 |
0.2032 | 4.19 | 130 | 0.1724 | 0.3391 | 0.0489 |
0.1997 | 4.52 | 140 | 0.1498 | 0.2704 | 0.0414 |
0.2216 | 4.84 | 150 | 0.1531 | 0.3047 | 0.0472 |
0.2294 | 5.16 | 160 | 0.1882 | 0.3176 | 0.0500 |
0.2305 | 5.48 | 170 | 0.1799 | 0.3176 | 0.0483 |
0.2052 | 5.81 | 180 | 0.1645 | 0.3262 | 0.0477 |
0.2192 | 6.13 | 190 | 0.1439 | 0.2060 | 0.0339 |
0.1844 | 6.45 | 200 | 0.1557 | 0.2918 | 0.0403 |
0.1803 | 6.77 | 210 | 0.1664 | 0.3004 | 0.0426 |
0.1831 | 7.1 | 220 | 0.1780 | 0.3176 | 0.0477 |
0.1618 | 7.42 | 230 | 0.1671 | 0.2661 | 0.0437 |
0.1528 | 7.74 | 240 | 0.2108 | 0.3176 | 0.0506 |
0.1335 | 8.06 | 250 | 0.1677 | 0.2575 | 0.0408 |
0.1736 | 8.39 | 260 | 0.1581 | 0.3004 | 0.0460 |
0.1607 | 8.71 | 270 | 0.1529 | 0.3047 | 0.0403 |
0.1451 | 9.03 | 280 | 0.1666 | 0.2747 | 0.0408 |
0.1534 | 9.35 | 290 | 0.1722 | 0.2833 | 0.0437 |
0.1567 | 9.68 | 300 | 0.1747 | 0.2918 | 0.0397 |
0.1356 | 10.0 | 310 | 0.1659 | 0.2961 | 0.0443 |
0.1248 | 10.32 | 320 | 0.1752 | 0.3348 | 0.0449 |
0.149 | 10.65 | 330 | 0.1792 | 0.3348 | 0.0449 |
0.1471 | 10.97 | 340 | 0.1843 | 0.3391 | 0.0460 |
0.1564 | 11.29 | 350 | 0.2015 | 0.3433 | 0.0460 |
0.1597 | 11.61 | 360 | 0.1798 | 0.2618 | 0.0380 |
0.161 | 11.94 | 370 | 0.1716 | 0.2747 | 0.0374 |
0.1481 | 12.26 | 380 | 0.1776 | 0.2747 | 0.0397 |
0.1168 | 12.58 | 390 | 0.1900 | 0.2961 | 0.0454 |
0.1173 | 12.9 | 400 | 0.1987 | 0.3090 | 0.0454 |
0.1245 | 13.23 | 410 | 0.1710 | 0.2918 | 0.0408 |
0.1118 | 13.55 | 420 | 0.1808 | 0.3047 | 0.0431 |
0.1111 | 13.87 | 430 | 0.1893 | 0.2747 | 0.0403 |
0.1041 | 14.19 | 440 | 0.1876 | 0.2918 | 0.0431 |
0.1152 | 14.52 | 450 | 0.1800 | 0.2790 | 0.0408 |
0.107 | 14.84 | 460 | 0.1717 | 0.2747 | 0.0385 |
0.1139 | 15.16 | 470 | 0.1652 | 0.2704 | 0.0391 |
0.0922 | 15.48 | 480 | 0.1659 | 0.2618 | 0.0391 |
0.101 | 15.81 | 490 | 0.1610 | 0.2489 | 0.0362 |
0.0835 | 16.13 | 500 | 0.1584 | 0.2403 | 0.0362 |
0.1251 | 16.45 | 510 | 0.1601 | 0.2575 | 0.0380 |
0.0888 | 16.77 | 520 | 0.1632 | 0.2661 | 0.0380 |
0.0968 | 17.1 | 530 | 0.1674 | 0.2661 | 0.0385 |
0.1105 | 17.42 | 540 | 0.1629 | 0.2833 | 0.0391 |
0.0914 | 17.74 | 550 | 0.1623 | 0.3090 | 0.0408 |
0.0843 | 18.06 | 560 | 0.1611 | 0.3004 | 0.0408 |
0.0861 | 18.39 | 570 | 0.1583 | 0.2661 | 0.0385 |
0.0861 | 18.71 | 580 | 0.1579 | 0.2618 | 0.0385 |
0.0678 | 19.03 | 590 | 0.1585 | 0.2661 | 0.0374 |
0.0934 | 19.35 | 600 | 0.1613 | 0.2489 | 0.0368 |
0.0976 | 19.68 | 610 | 0.1617 | 0.2446 | 0.0368 |
0.0799 | 20.0 | 620 | 0.1622 | 0.2446 | 0.0368 |
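The WER and CER columns above are word- and character-level edit-distance rates. A minimal pure-Python sketch of the definitions follows; the training run itself likely used the `datasets`/`jiwer` metric implementations, so treat this only as an illustration of what the numbers mean.

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences; insertions,
    deletions, and substitutions each cost 1."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i]
        for j, h in enumerate(hyp, start=1):
            curr.append(min(
                prev[j] + 1,             # deletion
                curr[j - 1] + 1,         # insertion
                prev[j - 1] + (r != h),  # substitution
            ))
        prev = curr
    return prev[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(reference, hypothesis) / len(reference)
```

For example, `wer("a b c d", "a x c")` is 0.5: one substitution plus one deletion over four reference words.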
## Framework versions
- Transformers 4.18.0
- PyTorch 1.11.0+cu113
- Datasets 2.1.0
- Tokenizers 0.12.1