# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-3-4-5
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0915
- Wer: 0.0756
- Cer: 0.0217
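Wer and Cer above denote word error rate and character error rate. As a minimal sketch of how such scores can be reproduced, assuming the `evaluate` library (the transcripts below are placeholders, not items from the evaluation set):

```python
import evaluate

# Placeholder reference/hypothesis pairs; the actual evaluation set is not documented here.
references = ["o menino leu o livro inteiro"]
predictions = ["o menino leu o livro intero"]

wer_metric = evaluate.load("wer")  # word error rate
cer_metric = evaluate.load("cer")  # character error rate

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```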
## Model description
More information needed
## Intended uses & limitations
More information needed
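In the absence of documented usage, a minimal inference sketch follows, assuming the standard `transformers` wav2vec2 CTC API. The repository id and audio file name are placeholders to be replaced with the actual Hub path and input, and the model presumably expects 16 kHz mono audio like its base checkpoint:

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Assumed Hub path; substitute the actual namespace for this checkpoint.
MODEL_ID = "<namespace>/wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-3-4-5"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# wav2vec2-large-xlsr models expect 16 kHz mono input.
speech, _ = librosa.load("example.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```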
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
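The list above maps onto `transformers.TrainingArguments` roughly as sketched below. This is a reconstruction from the reported values, not the original training script, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-3-4-5",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```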
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
---|---|---|---|---|---|
18.7965 | 0.99 | 45 | 3.5284 | 1.0 | 1.0 |
18.7965 | 2.0 | 91 | 3.0619 | 1.0 | 1.0 |
5.944 | 2.99 | 136 | 2.9381 | 1.0 | 1.0 |
5.944 | 4.0 | 182 | 2.8987 | 1.0 | 1.0 |
2.9722 | 4.99 | 227 | 2.8854 | 1.0 | 1.0 |
2.9722 | 6.0 | 273 | 2.8774 | 1.0 | 1.0 |
2.8979 | 6.99 | 318 | 2.8751 | 1.0 | 1.0 |
2.8979 | 8.0 | 364 | 2.8691 | 1.0 | 1.0 |
2.8767 | 8.99 | 409 | 2.8725 | 1.0 | 1.0 |
2.8767 | 10.0 | 455 | 2.8614 | 1.0 | 1.0 |
2.8568 | 10.99 | 500 | 2.8114 | 1.0 | 1.0 |
2.8568 | 12.0 | 546 | 2.4615 | 1.0 | 1.0 |
2.8568 | 12.99 | 591 | 1.4113 | 0.9084 | 0.3135 |
2.4392 | 14.0 | 637 | 0.6077 | 0.2812 | 0.0723 |
2.4392 | 14.99 | 682 | 0.3951 | 0.2160 | 0.0552 |
1.0208 | 16.0 | 728 | 0.3029 | 0.1687 | 0.0438 |
1.0208 | 16.99 | 773 | 0.2579 | 0.1616 | 0.0424 |
0.5843 | 18.0 | 819 | 0.2274 | 0.1557 | 0.0402 |
0.5843 | 18.99 | 864 | 0.2059 | 0.1430 | 0.0376 |
0.4893 | 20.0 | 910 | 0.1945 | 0.1512 | 0.0382 |
0.4893 | 20.99 | 955 | 0.1795 | 0.1307 | 0.0347 |
0.415 | 22.0 | 1001 | 0.1694 | 0.1274 | 0.0331 |
0.415 | 22.99 | 1046 | 0.1613 | 0.1162 | 0.0314 |
0.415 | 24.0 | 1092 | 0.1522 | 0.1069 | 0.0291 |
0.3513 | 24.99 | 1137 | 0.1467 | 0.0980 | 0.0273 |
0.3513 | 26.0 | 1183 | 0.1446 | 0.0972 | 0.0278 |
0.3296 | 26.99 | 1228 | 0.1358 | 0.0935 | 0.0263 |
0.3296 | 28.0 | 1274 | 0.1321 | 0.0879 | 0.0253 |
0.2982 | 28.99 | 1319 | 0.1287 | 0.0868 | 0.0253 |
0.2982 | 30.0 | 1365 | 0.1251 | 0.0860 | 0.0254 |
0.2522 | 30.99 | 1410 | 0.1228 | 0.0845 | 0.0252 |
0.2522 | 32.0 | 1456 | 0.1193 | 0.0868 | 0.0245 |
0.2394 | 32.99 | 1501 | 0.1175 | 0.0823 | 0.0239 |
0.2394 | 34.0 | 1547 | 0.1151 | 0.0823 | 0.0247 |
0.2394 | 34.99 | 1592 | 0.1132 | 0.0868 | 0.0246 |
0.2315 | 36.0 | 1638 | 0.1117 | 0.0816 | 0.0237 |
0.2315 | 36.99 | 1683 | 0.1112 | 0.0804 | 0.0233 |
0.2274 | 38.0 | 1729 | 0.1095 | 0.0827 | 0.0237 |
0.2274 | 38.99 | 1774 | 0.1084 | 0.0775 | 0.0222 |
0.2139 | 40.0 | 1820 | 0.1073 | 0.0790 | 0.0231 |
0.2139 | 40.99 | 1865 | 0.1033 | 0.0726 | 0.0217 |
0.2021 | 42.0 | 1911 | 0.1046 | 0.0760 | 0.0224 |
0.2021 | 42.99 | 1956 | 0.1025 | 0.0764 | 0.0217 |
0.1979 | 44.0 | 2002 | 0.1026 | 0.0760 | 0.0218 |
0.1979 | 44.99 | 2047 | 0.1050 | 0.0771 | 0.0228 |
0.1979 | 46.0 | 2093 | 0.1035 | 0.0760 | 0.0223 |
0.1958 | 46.99 | 2138 | 0.1033 | 0.0778 | 0.0224 |
0.1958 | 48.0 | 2184 | 0.1044 | 0.0797 | 0.0226 |
0.2233 | 48.99 | 2229 | 0.1028 | 0.0760 | 0.0223 |
0.2233 | 50.0 | 2275 | 0.1002 | 0.0771 | 0.0226 |
0.1843 | 50.99 | 2320 | 0.0988 | 0.0767 | 0.0222 |
0.1843 | 52.0 | 2366 | 0.0976 | 0.0741 | 0.0223 |
0.1729 | 52.99 | 2411 | 0.0976 | 0.0734 | 0.0220 |
0.1729 | 54.0 | 2457 | 0.0966 | 0.0745 | 0.0224 |
0.1823 | 54.99 | 2502 | 0.0971 | 0.0760 | 0.0221 |
0.1823 | 56.0 | 2548 | 0.0959 | 0.0756 | 0.0220 |
0.1823 | 56.99 | 2593 | 0.0939 | 0.0749 | 0.0224 |
0.1778 | 58.0 | 2639 | 0.0929 | 0.0749 | 0.0218 |
0.1778 | 58.99 | 2684 | 0.0949 | 0.0752 | 0.0225 |
0.1607 | 60.0 | 2730 | 0.0956 | 0.0771 | 0.0226 |
0.1607 | 60.99 | 2775 | 0.0938 | 0.0767 | 0.0223 |
0.1722 | 62.0 | 2821 | 0.0941 | 0.0741 | 0.0217 |
0.1722 | 62.99 | 2866 | 0.0937 | 0.0741 | 0.0217 |
0.1599 | 64.0 | 2912 | 0.0918 | 0.0771 | 0.0218 |
0.1599 | 64.99 | 2957 | 0.0929 | 0.0790 | 0.0220 |
0.1663 | 66.0 | 3003 | 0.0932 | 0.0790 | 0.0227 |
0.1663 | 66.99 | 3048 | 0.0918 | 0.0760 | 0.0219 |
0.1663 | 68.0 | 3094 | 0.0930 | 0.0745 | 0.0219 |
0.1625 | 68.99 | 3139 | 0.0930 | 0.0775 | 0.0221 |
0.1625 | 70.0 | 3185 | 0.0918 | 0.0767 | 0.0220 |
0.1661 | 70.99 | 3230 | 0.0915 | 0.0756 | 0.0217 |
0.1661 | 72.0 | 3276 | 0.0934 | 0.0786 | 0.0224 |
0.1481 | 72.99 | 3321 | 0.0934 | 0.0797 | 0.0224 |
0.1481 | 74.0 | 3367 | 0.0919 | 0.0749 | 0.0221 |
0.1481 | 74.99 | 3412 | 0.0930 | 0.0752 | 0.0224 |
0.1481 | 76.0 | 3458 | 0.0925 | 0.0756 | 0.0220 |
0.1466 | 76.99 | 3503 | 0.0916 | 0.0745 | 0.0220 |
0.1466 | 78.0 | 3549 | 0.0931 | 0.0734 | 0.0223 |
0.1466 | 78.99 | 3594 | 0.0931 | 0.0756 | 0.0220 |
0.1562 | 80.0 | 3640 | 0.0917 | 0.0726 | 0.0217 |
0.1562 | 80.99 | 3685 | 0.0930 | 0.0749 | 0.0223 |
0.1485 | 82.0 | 3731 | 0.0931 | 0.0749 | 0.0222 |
0.1485 | 82.99 | 3776 | 0.0938 | 0.0756 | 0.0220 |
0.1402 | 84.0 | 3822 | 0.0931 | 0.0737 | 0.0222 |
0.1402 | 84.99 | 3867 | 0.0939 | 0.0778 | 0.0229 |
0.1485 | 86.0 | 3913 | 0.0942 | 0.0749 | 0.0224 |
0.1485 | 86.99 | 3958 | 0.0936 | 0.0741 | 0.0224 |
0.1451 | 88.0 | 4004 | 0.0938 | 0.0756 | 0.0224 |
0.1451 | 88.99 | 4049 | 0.0928 | 0.0760 | 0.0226 |
0.1451 | 90.0 | 4095 | 0.0936 | 0.0764 | 0.0226 |
0.137 | 90.99 | 4140 | 0.0940 | 0.0760 | 0.0225 |
### Framework versions
- Transformers 4.28.0
- PyTorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3