# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2-3-4
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1586
- WER: 0.0892
- CER: 0.0275
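A minimal inference sketch using the standard Wav2Vec2 CTC API from Hugging Face Transformers; the hub ID and the `audio.wav` path below are placeholders, not part of this card:

```python
# Minimal inference sketch (hedged): the hub ID and audio path are placeholders.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2-3-4"  # placeholder hub ID
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate expected by Wav2Vec2 models.
waveform, sample_rate = torchaudio.load("audio.wav")  # placeholder path
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax over the vocabulary, then collapse repeats/blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```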
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
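For reference, these settings map onto `transformers.TrainingArguments` roughly as follows; this is a sketch, not the exact training script, and the `output_dir` is a placeholder:

```python
# Sketch of TrainingArguments mirroring the hyperparameters above.
# output_dir is a placeholder; dataset wiring and the Trainer call are omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-mecita-coraa",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    num_train_epochs=100,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,  # native AMP mixed precision
)
```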
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
30.6989 | 0.99 | 68 | 3.5420 | 1.0 | 1.0 |
8.5369 | 2.0 | 137 | 3.0212 | 1.0 | 1.0 |
3.0692 | 2.99 | 205 | 2.9516 | 1.0 | 1.0 |
3.0692 | 4.0 | 274 | 2.9373 | 1.0 | 1.0 |
2.9406 | 4.99 | 342 | 2.9210 | 1.0 | 1.0 |
2.9093 | 6.0 | 411 | 2.8973 | 1.0 | 1.0 |
2.9093 | 6.99 | 479 | 2.3652 | 0.9972 | 0.9341 |
2.6975 | 8.0 | 548 | 0.9068 | 0.6196 | 0.1447 |
1.3766 | 8.99 | 616 | 0.4947 | 0.2336 | 0.0597 |
1.3766 | 10.0 | 685 | 0.3844 | 0.1871 | 0.0510 |
0.7674 | 10.99 | 753 | 0.3305 | 0.1664 | 0.0456 |
0.5472 | 12.0 | 822 | 0.2951 | 0.1605 | 0.0441 |
0.5472 | 12.99 | 890 | 0.2710 | 0.1510 | 0.0420 |
0.459 | 14.0 | 959 | 0.2544 | 0.1308 | 0.0378 |
0.4152 | 14.99 | 1027 | 0.2374 | 0.1203 | 0.0355 |
0.4152 | 16.0 | 1096 | 0.2410 | 0.1203 | 0.0352 |
0.3714 | 16.99 | 1164 | 0.2248 | 0.1154 | 0.0338 |
0.3366 | 18.0 | 1233 | 0.2125 | 0.1042 | 0.0310 |
0.3209 | 18.99 | 1301 | 0.2077 | 0.1091 | 0.0318 |
0.3209 | 20.0 | 1370 | 0.2111 | 0.1024 | 0.0306 |
0.2993 | 20.99 | 1438 | 0.2023 | 0.1056 | 0.0316 |
0.2786 | 22.0 | 1507 | 0.2000 | 0.1007 | 0.0298 |
0.2786 | 22.99 | 1575 | 0.1906 | 0.1024 | 0.0298 |
0.2762 | 24.0 | 1644 | 0.1913 | 0.1017 | 0.0309 |
0.2352 | 24.99 | 1712 | 0.1957 | 0.0993 | 0.0316 |
0.2352 | 26.0 | 1781 | 0.1946 | 0.1045 | 0.0318 |
0.2341 | 26.99 | 1849 | 0.1852 | 0.0990 | 0.0300 |
0.2272 | 28.0 | 1918 | 0.1853 | 0.0972 | 0.0296 |
0.2272 | 28.99 | 1986 | 0.1873 | 0.0993 | 0.0293 |
0.2057 | 30.0 | 2055 | 0.1834 | 0.0920 | 0.0286 |
0.2054 | 30.99 | 2123 | 0.1854 | 0.0937 | 0.0287 |
0.2054 | 32.0 | 2192 | 0.1729 | 0.0913 | 0.0277 |
0.1995 | 32.99 | 2260 | 0.1726 | 0.0909 | 0.0280 |
0.1975 | 34.0 | 2329 | 0.1744 | 0.0934 | 0.0280 |
0.1975 | 34.99 | 2397 | 0.1728 | 0.0906 | 0.0279 |
0.1943 | 36.0 | 2466 | 0.1725 | 0.0909 | 0.0279 |
0.1818 | 36.99 | 2534 | 0.1683 | 0.0881 | 0.0275 |
0.1902 | 38.0 | 2603 | 0.1700 | 0.0899 | 0.0275 |
0.1902 | 38.99 | 2671 | 0.1713 | 0.0874 | 0.0269 |
0.1823 | 40.0 | 2740 | 0.1700 | 0.0906 | 0.0278 |
0.172 | 40.99 | 2808 | 0.1732 | 0.0923 | 0.0274 |
0.172 | 42.0 | 2877 | 0.1683 | 0.0871 | 0.0267 |
0.183 | 42.99 | 2945 | 0.1634 | 0.0916 | 0.0275 |
0.1847 | 44.0 | 3014 | 0.1681 | 0.0867 | 0.0268 |
0.1847 | 44.99 | 3082 | 0.1673 | 0.0902 | 0.0275 |
0.1679 | 46.0 | 3151 | 0.1707 | 0.0860 | 0.0266 |
0.1553 | 46.99 | 3219 | 0.1675 | 0.0853 | 0.0267 |
0.1553 | 48.0 | 3288 | 0.1716 | 0.0902 | 0.0279 |
0.1666 | 48.99 | 3356 | 0.1673 | 0.0857 | 0.0272 |
0.1556 | 50.0 | 3425 | 0.1681 | 0.0874 | 0.0268 |
0.1556 | 50.99 | 3493 | 0.1640 | 0.0867 | 0.0268 |
0.1598 | 52.0 | 3562 | 0.1586 | 0.0892 | 0.0275 |
0.1516 | 52.99 | 3630 | 0.1702 | 0.0867 | 0.0272 |
0.1516 | 54.0 | 3699 | 0.1667 | 0.0839 | 0.0260 |
0.1456 | 54.99 | 3767 | 0.1671 | 0.0846 | 0.0264 |
0.1481 | 56.0 | 3836 | 0.1708 | 0.0864 | 0.0275 |
0.1474 | 56.99 | 3904 | 0.1680 | 0.0829 | 0.0263 |
0.1474 | 58.0 | 3973 | 0.1676 | 0.0853 | 0.0265 |
0.1402 | 58.99 | 4041 | 0.1682 | 0.0860 | 0.0262 |
0.1374 | 60.0 | 4110 | 0.1714 | 0.0832 | 0.0261 |
0.1374 | 60.99 | 4178 | 0.1696 | 0.0822 | 0.0256 |
0.1327 | 62.0 | 4247 | 0.1669 | 0.0836 | 0.0257 |
0.1301 | 62.99 | 4315 | 0.1687 | 0.0822 | 0.0261 |
0.1301 | 64.0 | 4384 | 0.1711 | 0.0846 | 0.0261 |
0.1455 | 64.99 | 4452 | 0.1735 | 0.0853 | 0.0265 |
0.1208 | 66.0 | 4521 | 0.1731 | 0.0839 | 0.0264 |
0.1208 | 66.99 | 4589 | 0.1766 | 0.0832 | 0.0261 |
0.1327 | 68.0 | 4658 | 0.1756 | 0.0818 | 0.0257 |
0.1345 | 68.99 | 4726 | 0.1737 | 0.0874 | 0.0266 |
0.1345 | 70.0 | 4795 | 0.1737 | 0.0850 | 0.0260 |
0.1255 | 70.99 | 4863 | 0.1712 | 0.0808 | 0.0257 |
0.1404 | 72.0 | 4932 | 0.1737 | 0.0836 | 0.0261 |
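The WER and CER columns above can be reproduced with the Hugging Face `evaluate` library; a minimal sketch, in which the reference and prediction strings are illustrative placeholders:

```python
# Sketch of the WER/CER computation via the Hugging Face `evaluate` library.
# The reference/prediction strings are illustrative placeholders.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["exemplo de transcrição de referência"]   # ground-truth transcripts (placeholder)
predictions = ["exemplo de transcricao de referencia"]  # model outputs (placeholder)

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```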
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3