# wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-3-4
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0997
- Wer: 0.0725
- Cer: 0.0199
## Model description

More information needed
## Intended uses & limitations

More information needed
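
Pending a proper write-up, here is a minimal transcription sketch. It assumes this checkpoint is published on the Hugging Face Hub (the namespace in `model_id` is a placeholder to fill in) and that the input is 16 kHz mono audio, as the XLSR base model expects; `audio.wav` is a hypothetical file:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id: fill in the actual Hub namespace for this checkpoint.
model_id = "<namespace>/wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-3-4"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Hypothetical input file; resample to the 16 kHz rate the model was trained on.
waveform, sample_rate = torchaudio.load("audio.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```

Greedy CTC decoding is used here for simplicity; a language-model-backed decoder may reduce WER further.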
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training; a code sketch reconstructing them follows the list:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
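
As a reproducibility aid, the list above maps onto `transformers.TrainingArguments` roughly as sketched below. The output directory and the per-epoch evaluation strategy are assumptions, and Adam with betas=(0.9,0.999) and epsilon=1e-08 is the Trainer default, so it needs no explicit arguments:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-3-4",  # assumed path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # total train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed precision
    evaluation_strategy="epoch",    # assumption: the results table logs one eval per epoch
)
```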
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 19.5243 | 0.98 | 21 | 8.9078 | 1.0 | 1.0 |
| 19.5243 | 2.0 | 43 | 3.6694 | 1.0 | 1.0 |
| 19.5243 | 2.98 | 64 | 3.2621 | 1.0 | 1.0 |
| 19.5243 | 4.0 | 86 | 3.0901 | 1.0 | 1.0 |
| 6.3988 | 4.98 | 107 | 2.9944 | 1.0 | 1.0 |
| 6.3988 | 6.0 | 129 | 2.9695 | 1.0 | 1.0 |
| 6.3988 | 6.98 | 150 | 2.9320 | 1.0 | 1.0 |
| 6.3988 | 8.0 | 172 | 2.8941 | 1.0 | 1.0 |
| 6.3988 | 8.98 | 193 | 2.9124 | 1.0 | 1.0 |
| 2.971 | 10.0 | 215 | 2.8778 | 1.0 | 1.0 |
| 2.971 | 10.98 | 236 | 2.8657 | 1.0 | 1.0 |
| 2.971 | 12.0 | 258 | 2.8637 | 1.0 | 1.0 |
| 2.971 | 12.98 | 279 | 2.8621 | 1.0 | 1.0 |
| 2.889 | 14.0 | 301 | 2.8318 | 1.0 | 1.0 |
| 2.889 | 14.98 | 322 | 2.7810 | 1.0 | 1.0 |
| 2.889 | 16.0 | 344 | 2.6601 | 1.0 | 1.0 |
| 2.889 | 16.98 | 365 | 2.4316 | 1.0 | 0.9763 |
| 2.889 | 18.0 | 387 | 1.9166 | 1.0 | 0.6286 |
| 2.6018 | 18.98 | 408 | 1.2662 | 0.9992 | 0.3170 |
| 2.6018 | 20.0 | 430 | 0.8503 | 0.6080 | 0.1299 |
| 2.6018 | 20.98 | 451 | 0.6382 | 0.2416 | 0.0589 |
| 2.6018 | 22.0 | 473 | 0.4953 | 0.2136 | 0.0522 |
| 2.6018 | 22.98 | 494 | 0.4245 | 0.1691 | 0.0428 |
| 1.2037 | 24.0 | 516 | 0.3579 | 0.1590 | 0.0397 |
| 1.2037 | 24.98 | 537 | 0.3192 | 0.1535 | 0.0381 |
| 1.2037 | 26.0 | 559 | 0.2822 | 0.1395 | 0.0343 |
| 1.2037 | 26.98 | 580 | 0.2608 | 0.1325 | 0.0335 |
| 0.6376 | 28.0 | 602 | 0.2446 | 0.1387 | 0.0344 |
| 0.6376 | 28.98 | 623 | 0.2264 | 0.1356 | 0.0344 |
| 0.6376 | 30.0 | 645 | 0.2110 | 0.1263 | 0.0318 |
| 0.6376 | 30.98 | 666 | 0.2038 | 0.1270 | 0.0319 |
| 0.6376 | 32.0 | 688 | 0.1949 | 0.1185 | 0.0312 |
| 0.4842 | 32.98 | 709 | 0.1852 | 0.1154 | 0.0305 |
| 0.4842 | 34.0 | 731 | 0.1804 | 0.1138 | 0.0293 |
| 0.4842 | 34.98 | 752 | 0.1712 | 0.1091 | 0.0286 |
| 0.4842 | 36.0 | 774 | 0.1681 | 0.1083 | 0.0287 |
| 0.4842 | 36.98 | 795 | 0.1629 | 0.1076 | 0.0281 |
| 0.3688 | 38.0 | 817 | 0.1559 | 0.1044 | 0.0274 |
| 0.3688 | 38.98 | 838 | 0.1507 | 0.0943 | 0.0253 |
| 0.3688 | 40.0 | 860 | 0.1489 | 0.0920 | 0.0249 |
| 0.3688 | 40.98 | 881 | 0.1445 | 0.0912 | 0.0249 |
| 0.3144 | 42.0 | 903 | 0.1424 | 0.0865 | 0.0239 |
| 0.3144 | 42.98 | 924 | 0.1414 | 0.0881 | 0.0246 |
| 0.3144 | 44.0 | 946 | 0.1393 | 0.0834 | 0.0230 |
| 0.3144 | 44.98 | 967 | 0.1360 | 0.0803 | 0.0231 |
| 0.3144 | 46.0 | 989 | 0.1330 | 0.0811 | 0.0233 |
| 0.3049 | 46.98 | 1010 | 0.1319 | 0.0834 | 0.0234 |
| 0.3049 | 48.0 | 1032 | 0.1285 | 0.0795 | 0.0218 |
| 0.3049 | 48.98 | 1053 | 0.1250 | 0.0779 | 0.0218 |
| 0.3049 | 50.0 | 1075 | 0.1271 | 0.0787 | 0.0227 |
| 0.3049 | 50.98 | 1096 | 0.1254 | 0.0787 | 0.0223 |
| 0.27 | 52.0 | 1118 | 0.1223 | 0.0779 | 0.0221 |
| 0.27 | 52.98 | 1139 | 0.1208 | 0.0764 | 0.0211 |
| 0.27 | 54.0 | 1161 | 0.1174 | 0.0764 | 0.0212 |
| 0.27 | 54.98 | 1182 | 0.1150 | 0.0779 | 0.0211 |
| 0.2473 | 56.0 | 1204 | 0.1154 | 0.0748 | 0.0207 |
| 0.2473 | 56.98 | 1225 | 0.1194 | 0.0795 | 0.0211 |
| 0.2473 | 58.0 | 1247 | 0.1131 | 0.0740 | 0.0205 |
| 0.2473 | 58.98 | 1268 | 0.1133 | 0.0717 | 0.0199 |
| 0.2473 | 60.0 | 1290 | 0.1119 | 0.0725 | 0.0207 |
| 0.2317 | 60.98 | 1311 | 0.1110 | 0.0694 | 0.0201 |
| 0.2317 | 62.0 | 1333 | 0.1091 | 0.0701 | 0.0201 |
| 0.2317 | 62.98 | 1354 | 0.1076 | 0.0709 | 0.0204 |
| 0.2317 | 64.0 | 1376 | 0.1073 | 0.0740 | 0.0208 |
| 0.2317 | 64.98 | 1397 | 0.1078 | 0.0733 | 0.0201 |
| 0.2192 | 66.0 | 1419 | 0.1058 | 0.0701 | 0.0202 |
| 0.2192 | 66.98 | 1440 | 0.1046 | 0.0709 | 0.0198 |
| 0.2192 | 68.0 | 1462 | 0.1026 | 0.0678 | 0.0193 |
| 0.2192 | 68.98 | 1483 | 0.1032 | 0.0686 | 0.0195 |
| 0.2204 | 70.0 | 1505 | 0.1030 | 0.0686 | 0.0195 |
| 0.2204 | 70.98 | 1526 | 0.1028 | 0.0694 | 0.0198 |
| 0.2204 | 72.0 | 1548 | 0.1039 | 0.0717 | 0.0199 |
| 0.2204 | 72.98 | 1569 | 0.1021 | 0.0717 | 0.0199 |
| 0.2204 | 74.0 | 1591 | 0.1009 | 0.0701 | 0.0196 |
| 0.1945 | 74.98 | 1612 | 0.1019 | 0.0709 | 0.0196 |
| 0.1945 | 76.0 | 1634 | 0.1023 | 0.0694 | 0.0195 |
| 0.1945 | 76.98 | 1655 | 0.1020 | 0.0686 | 0.0195 |
| 0.1945 | 78.0 | 1677 | 0.1017 | 0.0701 | 0.0196 |
| 0.1945 | 78.98 | 1698 | 0.1022 | 0.0701 | 0.0195 |
| 0.2261 | 80.0 | 1720 | 0.1013 | 0.0725 | 0.0201 |
| 0.2261 | 80.98 | 1741 | 0.1004 | 0.0701 | 0.0196 |
| 0.2261 | 82.0 | 1763 | 0.1003 | 0.0686 | 0.0198 |
| 0.2261 | 82.98 | 1784 | 0.1009 | 0.0709 | 0.0199 |
| 0.1991 | 84.0 | 1806 | 0.1004 | 0.0709 | 0.0196 |
| 0.1991 | 84.98 | 1827 | 0.1016 | 0.0740 | 0.0201 |
| 0.1991 | 86.0 | 1849 | 0.1006 | 0.0733 | 0.0199 |
| 0.1991 | 86.98 | 1870 | 0.1012 | 0.0740 | 0.0199 |
| 0.1991 | 88.0 | 1892 | 0.1011 | 0.0756 | 0.0204 |
| 0.1855 | 88.98 | 1913 | 0.1013 | 0.0756 | 0.0204 |
| 0.1855 | 90.0 | 1935 | 0.1009 | 0.0725 | 0.0198 |
| 0.1855 | 90.98 | 1956 | 0.1016 | 0.0748 | 0.0202 |
| 0.1855 | 92.0 | 1978 | 0.1010 | 0.0733 | 0.0199 |
| 0.1855 | 92.98 | 1999 | 0.1005 | 0.0748 | 0.0202 |
| 0.1888 | 94.0 | 2021 | 0.0998 | 0.0748 | 0.0202 |
| 0.1888 | 94.98 | 2042 | 0.0997 | 0.0725 | 0.0199 |
| 0.1888 | 96.0 | 2064 | 0.1003 | 0.0740 | 0.0201 |
| 0.1888 | 96.98 | 2085 | 0.0999 | 0.0740 | 0.0201 |
| 0.1865 | 97.67 | 2100 | 0.0999 | 0.0740 | 0.0201 |
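
In the tables above, Wer and Cer are word and character error rates (lower is better). A minimal sketch of how such scores are typically computed with the Hugging Face `evaluate` library follows; the prediction/reference strings are placeholders, not drawn from this model's evaluation set:

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder hypothesis/reference pairs, for illustration only.
predictions = ["o menino leu o livro"]
references = ["o menino leu um livro"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```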
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3