# wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-3
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1370
- WER: 0.0733
- CER: 0.0225
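WER (word error rate) and CER (character error rate) are edit-distance rates between the model's transcriptions and the reference texts, so lower is better. As a hedged sketch only, the snippet below shows how these two metrics could be recomputed with the `evaluate` library; the reference and prediction strings are invented placeholders, not samples from this model's evaluation set.

```python
import evaluate

# Load the standard word- and character-error-rate metrics.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder reference/hypothesis pairs -- in practice these would come from
# transcribing the held-out evaluation split with this model.
references = ["o menino leu o livro", "a escola fica perto da praça"]
predictions = ["o menino leu o livro", "a escola fica perto da praca"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```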
## Model description
More information needed
## Intended uses & limitations
More information needed
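Until this section is filled in, here is a minimal inference sketch assuming the checkpoint is used like any other wav2vec2 CTC model via `Wav2Vec2ForCTC` and `Wav2Vec2Processor`. The repository id and audio file name are placeholders, and 16 kHz input is assumed because that is the sampling rate XLSR checkpoints expect.

```python
import librosa
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder Hub id -- prefix with the owning namespace of this checkpoint.
MODEL_ID = "wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-3"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load an audio file and resample it to 16 kHz.
speech, _ = librosa.load("example.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```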
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch restating them follows the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
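As noted above, the sketch below restates these hyperparameters as a `transformers.TrainingArguments` configuration. The output directory and the evaluation/save cadence are assumptions not stated in this card; the Adam betas and epsilon listed above match the library defaults and are therefore left implicit.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-3",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size: 32
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                       # mixed-precision training (native AMP)
    evaluation_strategy="epoch",     # assumed cadence
    save_strategy="epoch",           # assumed cadence
)
```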
### Training results
Training Loss | Epoch | Step | Validation Loss | WER | CER |
---|---|---|---|---|---|
23.8744 | 0.97 | 15 | 11.7694 | 1.0 | 1.0 |
23.8744 | 2.0 | 31 | 4.4200 | 1.0 | 1.0 |
23.8744 | 2.97 | 46 | 3.5590 | 1.0 | 1.0 |
23.8744 | 4.0 | 62 | 3.2929 | 1.0 | 1.0 |
23.8744 | 4.97 | 77 | 3.1352 | 1.0 | 1.0 |
23.8744 | 6.0 | 93 | 3.0567 | 1.0 | 1.0 |
6.5381 | 6.97 | 108 | 3.0525 | 1.0 | 1.0 |
6.5381 | 8.0 | 124 | 2.9529 | 1.0 | 1.0 |
6.5381 | 8.97 | 139 | 2.9348 | 1.0 | 1.0 |
6.5381 | 10.0 | 155 | 2.9137 | 1.0 | 1.0 |
6.5381 | 10.97 | 170 | 2.9253 | 1.0 | 1.0 |
6.5381 | 12.0 | 186 | 2.8950 | 1.0 | 1.0 |
2.9976 | 12.97 | 201 | 2.9169 | 1.0 | 1.0 |
2.9976 | 14.0 | 217 | 2.8821 | 1.0 | 1.0 |
2.9976 | 14.97 | 232 | 2.8835 | 1.0 | 1.0 |
2.9976 | 16.0 | 248 | 2.8722 | 1.0 | 1.0 |
2.9976 | 16.97 | 263 | 2.8790 | 1.0 | 1.0 |
2.9976 | 18.0 | 279 | 2.8685 | 1.0 | 1.0 |
2.9976 | 18.97 | 294 | 2.8677 | 1.0 | 1.0 |
2.8998 | 20.0 | 310 | 2.8455 | 1.0 | 1.0 |
2.8998 | 20.97 | 325 | 2.8157 | 1.0 | 1.0 |
2.8998 | 22.0 | 341 | 2.7640 | 1.0 | 1.0 |
2.8998 | 22.97 | 356 | 2.7123 | 1.0 | 1.0 |
2.8998 | 24.0 | 372 | 2.5909 | 1.0 | 1.0 |
2.8998 | 24.97 | 387 | 2.4481 | 1.0 | 0.9981 |
2.7524 | 26.0 | 403 | 2.2849 | 0.9979 | 0.9672 |
2.7524 | 26.97 | 418 | 1.9159 | 1.0 | 0.6426 |
2.7524 | 28.0 | 434 | 1.5804 | 1.0 | 0.4751 |
2.7524 | 28.97 | 449 | 1.2506 | 0.9872 | 0.2888 |
2.7524 | 30.0 | 465 | 1.0009 | 0.9097 | 0.2155 |
2.7524 | 30.97 | 480 | 0.8349 | 0.7651 | 0.1665 |
2.7524 | 32.0 | 496 | 0.6778 | 0.4888 | 0.1063 |
1.736 | 32.97 | 511 | 0.5565 | 0.2986 | 0.0684 |
1.736 | 34.0 | 527 | 0.4758 | 0.2221 | 0.0520 |
1.736 | 34.97 | 542 | 0.4216 | 0.1945 | 0.0461 |
1.736 | 36.0 | 558 | 0.3800 | 0.1753 | 0.0423 |
1.736 | 36.97 | 573 | 0.3516 | 0.1509 | 0.0379 |
1.736 | 38.0 | 589 | 0.3243 | 0.1392 | 0.0358 |
0.7956 | 38.97 | 604 | 0.3016 | 0.1339 | 0.0349 |
0.7956 | 40.0 | 620 | 0.2845 | 0.1286 | 0.0337 |
0.7956 | 40.97 | 635 | 0.2741 | 0.1328 | 0.0347 |
0.7956 | 42.0 | 651 | 0.2595 | 0.1296 | 0.0343 |
0.7956 | 42.97 | 666 | 0.2460 | 0.1169 | 0.0320 |
0.7956 | 44.0 | 682 | 0.2379 | 0.1318 | 0.0337 |
0.7956 | 44.97 | 697 | 0.2306 | 0.1275 | 0.0345 |
0.5311 | 46.0 | 713 | 0.2270 | 0.1201 | 0.0324 |
0.5311 | 46.97 | 728 | 0.2220 | 0.1233 | 0.0333 |
0.5311 | 48.0 | 744 | 0.2141 | 0.1211 | 0.0333 |
0.5311 | 48.97 | 759 | 0.2100 | 0.1275 | 0.0333 |
0.5311 | 50.0 | 775 | 0.2012 | 0.1201 | 0.0324 |
0.5311 | 50.97 | 790 | 0.1962 | 0.1084 | 0.0297 |
0.4114 | 52.0 | 806 | 0.1927 | 0.1084 | 0.0299 |
0.4114 | 52.97 | 821 | 0.1862 | 0.1052 | 0.0295 |
0.4114 | 54.0 | 837 | 0.1839 | 0.0946 | 0.0269 |
0.4114 | 54.97 | 852 | 0.1804 | 0.0871 | 0.0253 |
0.4114 | 56.0 | 868 | 0.1776 | 0.0882 | 0.0257 |
0.4114 | 56.97 | 883 | 0.1735 | 0.0882 | 0.0250 |
0.4114 | 58.0 | 899 | 0.1714 | 0.0882 | 0.0253 |
0.3699 | 58.97 | 914 | 0.1704 | 0.0850 | 0.0242 |
0.3699 | 60.0 | 930 | 0.1674 | 0.0850 | 0.0242 |
0.3699 | 60.97 | 945 | 0.1657 | 0.0871 | 0.0250 |
0.3699 | 62.0 | 961 | 0.1648 | 0.0840 | 0.0250 |
0.3699 | 62.97 | 976 | 0.1617 | 0.0808 | 0.0240 |
0.3699 | 64.0 | 992 | 0.1600 | 0.0818 | 0.0238 |
0.3423 | 64.97 | 1007 | 0.1581 | 0.0797 | 0.0232 |
0.3423 | 66.0 | 1023 | 0.1548 | 0.0797 | 0.0229 |
0.3423 | 66.97 | 1038 | 0.1547 | 0.0797 | 0.0242 |
0.3423 | 68.0 | 1054 | 0.1523 | 0.0808 | 0.0238 |
0.3423 | 68.97 | 1069 | 0.1518 | 0.0808 | 0.0246 |
0.3423 | 70.0 | 1085 | 0.1490 | 0.0786 | 0.0242 |
0.3112 | 70.97 | 1100 | 0.1482 | 0.0797 | 0.0238 |
0.3112 | 72.0 | 1116 | 0.1471 | 0.0776 | 0.0236 |
0.3112 | 72.97 | 1131 | 0.1480 | 0.0797 | 0.0244 |
0.3112 | 74.0 | 1147 | 0.1456 | 0.0776 | 0.0236 |
0.3112 | 74.97 | 1162 | 0.1470 | 0.0818 | 0.0242 |
0.3112 | 76.0 | 1178 | 0.1453 | 0.0744 | 0.0227 |
0.3112 | 76.97 | 1193 | 0.1441 | 0.0797 | 0.0236 |
0.2676 | 78.0 | 1209 | 0.1435 | 0.0755 | 0.0225 |
0.2676 | 78.97 | 1224 | 0.1426 | 0.0755 | 0.0229 |
0.2676 | 80.0 | 1240 | 0.1410 | 0.0723 | 0.0219 |
0.2676 | 80.97 | 1255 | 0.1405 | 0.0755 | 0.0225 |
0.2676 | 82.0 | 1271 | 0.1400 | 0.0765 | 0.0227 |
0.2676 | 82.97 | 1286 | 0.1400 | 0.0765 | 0.0227 |
0.2801 | 84.0 | 1302 | 0.1396 | 0.0755 | 0.0232 |
0.2801 | 84.97 | 1317 | 0.1398 | 0.0744 | 0.0227 |
0.2801 | 86.0 | 1333 | 0.1388 | 0.0744 | 0.0229 |
0.2801 | 86.97 | 1348 | 0.1392 | 0.0755 | 0.0229 |
0.2801 | 88.0 | 1364 | 0.1382 | 0.0744 | 0.0232 |
0.2801 | 88.97 | 1379 | 0.1377 | 0.0744 | 0.0227 |
0.2801 | 90.0 | 1395 | 0.1374 | 0.0744 | 0.0227 |
0.2496 | 90.97 | 1410 | 0.1375 | 0.0733 | 0.0225 |
0.2496 | 92.0 | 1426 | 0.1370 | 0.0733 | 0.0225 |
0.2496 | 92.97 | 1441 | 0.1379 | 0.0733 | 0.0223 |
0.2496 | 94.0 | 1457 | 0.1377 | 0.0733 | 0.0223 |
0.2496 | 94.97 | 1472 | 0.1377 | 0.0744 | 0.0227 |
0.2496 | 96.0 | 1488 | 0.1378 | 0.0744 | 0.0227 |
0.2729 | 96.77 | 1500 | 0.1377 | 0.0755 | 0.0229 |
### Framework versions
- Transformers 4.28.0
- PyTorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3