# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-5
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1435
- WER (word error rate): 0.0955
- CER (character error rate): 0.0274
## Model description
More information needed
## Intended uses & limitations
More information needed
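
Pending details from the authors, the checkpoint can be exercised as a standard wav2vec2 CTC model. Below is a minimal transcription sketch: the Hub repository id is a placeholder (the owning namespace is not stated in this card), `sample.wav` is an illustrative file name, and 16 kHz mono input is assumed, as is usual for XLSR-based wav2vec2 models.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id: substitute the namespace that actually hosts this checkpoint.
MODEL_ID = "<namespace>/wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-5"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# XLSR wav2vec2 checkpoints expect 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats/blanks in batch_decode.
pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```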
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
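
For reference, the list above maps onto `transformers.TrainingArguments` roughly as follows. This is a sketch, not the original training script: the output path is illustrative, and the Adam betas/epsilon listed above are the `TrainingArguments` defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-mecita-grade-5",  # illustrative path, not from the original run
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,           # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                               # Native AMP mixed precision
)
```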
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 15.4993 | 0.97 | 17 | 8.4020 | 1.0 | 1.0 |
| 15.4993 | 2.0 | 35 | 3.9299 | 1.0 | 1.0 |
| 15.4993 | 2.97 | 52 | 3.3997 | 1.0 | 1.0 |
| 15.4993 | 4.0 | 70 | 3.1324 | 1.0 | 1.0 |
| 15.4993 | 4.97 | 87 | 3.0348 | 1.0 | 1.0 |
| 5.4134 | 6.0 | 105 | 2.9791 | 1.0 | 1.0 |
| 5.4134 | 6.97 | 122 | 2.9259 | 1.0 | 1.0 |
| 5.4134 | 8.0 | 140 | 2.9142 | 1.0 | 1.0 |
| 5.4134 | 8.97 | 157 | 2.9152 | 1.0 | 1.0 |
| 5.4134 | 10.0 | 175 | 2.8822 | 1.0 | 1.0 |
| 5.4134 | 10.97 | 192 | 2.8981 | 1.0 | 1.0 |
| 2.9248 | 12.0 | 210 | 2.8687 | 1.0 | 1.0 |
| 2.9248 | 12.97 | 227 | 2.8764 | 1.0 | 1.0 |
| 2.9248 | 14.0 | 245 | 2.8840 | 1.0 | 1.0 |
| 2.9248 | 14.97 | 262 | 2.8627 | 1.0 | 1.0 |
| 2.9248 | 16.0 | 280 | 2.8542 | 1.0 | 1.0 |
| 2.9248 | 16.97 | 297 | 2.8590 | 1.0 | 1.0 |
| 2.8726 | 18.0 | 315 | 2.8482 | 1.0 | 1.0 |
| 2.8726 | 18.97 | 332 | 2.8265 | 1.0 | 1.0 |
| 2.8726 | 20.0 | 350 | 2.7943 | 1.0 | 1.0 |
| 2.8726 | 20.97 | 367 | 2.7267 | 1.0 | 1.0 |
| 2.8726 | 22.0 | 385 | 2.6505 | 1.0 | 1.0 |
| 2.7916 | 22.97 | 402 | 2.4963 | 1.0 | 0.9982 |
| 2.7916 | 24.0 | 420 | 2.2866 | 0.9984 | 0.9120 |
| 2.7916 | 24.97 | 437 | 1.9284 | 1.0 | 0.6639 |
| 2.7916 | 26.0 | 455 | 1.4131 | 0.8482 | 0.2764 |
| 2.7916 | 26.97 | 472 | 1.0618 | 0.5869 | 0.1578 |
| 2.7916 | 28.0 | 490 | 0.8013 | 0.4049 | 0.1045 |
| 1.9354 | 28.97 | 507 | 0.6589 | 0.3273 | 0.0814 |
| 1.9354 | 30.0 | 525 | 0.5627 | 0.2971 | 0.0732 |
| 1.9354 | 30.97 | 542 | 0.4904 | 0.2727 | 0.0655 |
| 1.9354 | 32.0 | 560 | 0.4381 | 0.2604 | 0.0637 |
| 1.9354 | 32.97 | 577 | 0.3957 | 0.2384 | 0.0568 |
| 1.9354 | 34.0 | 595 | 0.3693 | 0.2343 | 0.0574 |
| 0.8461 | 34.97 | 612 | 0.3469 | 0.2237 | 0.0550 |
| 0.8461 | 36.0 | 630 | 0.3212 | 0.1976 | 0.0501 |
| 0.8461 | 36.97 | 647 | 0.3015 | 0.1894 | 0.0493 |
| 0.8461 | 38.0 | 665 | 0.2918 | 0.1698 | 0.0448 |
| 0.8461 | 38.97 | 682 | 0.2780 | 0.1633 | 0.0435 |
| 0.535 | 40.0 | 700 | 0.2631 | 0.1543 | 0.0411 |
| 0.535 | 40.97 | 717 | 0.2517 | 0.1535 | 0.0399 |
| 0.535 | 42.0 | 735 | 0.2428 | 0.1469 | 0.0381 |
| 0.535 | 42.97 | 752 | 0.2337 | 0.1527 | 0.0394 |
| 0.535 | 44.0 | 770 | 0.2260 | 0.1461 | 0.0390 |
| 0.535 | 44.97 | 787 | 0.2244 | 0.1486 | 0.0387 |
| 0.4705 | 46.0 | 805 | 0.2173 | 0.1412 | 0.0378 |
| 0.4705 | 46.97 | 822 | 0.2105 | 0.1363 | 0.0364 |
| 0.4705 | 48.0 | 840 | 0.2046 | 0.1347 | 0.0358 |
| 0.4705 | 48.97 | 857 | 0.2035 | 0.1322 | 0.0348 |
| 0.4705 | 50.0 | 875 | 0.1967 | 0.1216 | 0.0327 |
| 0.4705 | 50.97 | 892 | 0.1938 | 0.1151 | 0.0313 |
| 0.3641 | 52.0 | 910 | 0.1880 | 0.1151 | 0.0310 |
| 0.3641 | 52.97 | 927 | 0.1852 | 0.1102 | 0.0301 |
| 0.3641 | 54.0 | 945 | 0.1818 | 0.1086 | 0.0298 |
| 0.3641 | 54.97 | 962 | 0.1778 | 0.1135 | 0.0298 |
| 0.3641 | 56.0 | 980 | 0.1800 | 0.1102 | 0.0303 |
| 0.3641 | 56.97 | 997 | 0.1759 | 0.1094 | 0.0295 |
| 0.3406 | 58.0 | 1015 | 0.1706 | 0.1110 | 0.0297 |
| 0.3406 | 58.97 | 1032 | 0.1684 | 0.1061 | 0.0286 |
| 0.3406 | 60.0 | 1050 | 0.1679 | 0.1037 | 0.0289 |
| 0.3406 | 60.97 | 1067 | 0.1653 | 0.1037 | 0.0288 |
| 0.3406 | 62.0 | 1085 | 0.1660 | 0.1045 | 0.0295 |
| 0.302 | 62.97 | 1102 | 0.1622 | 0.0988 | 0.0280 |
| 0.302 | 64.0 | 1120 | 0.1635 | 0.1020 | 0.0285 |
| 0.302 | 64.97 | 1137 | 0.1624 | 0.1037 | 0.0288 |
| 0.302 | 66.0 | 1155 | 0.1613 | 0.1004 | 0.0286 |
| 0.302 | 66.97 | 1172 | 0.1602 | 0.1004 | 0.0288 |
| 0.302 | 68.0 | 1190 | 0.1587 | 0.1004 | 0.0288 |
| 0.2844 | 68.97 | 1207 | 0.1573 | 0.1045 | 0.0292 |
| 0.2844 | 70.0 | 1225 | 0.1565 | 0.1053 | 0.0291 |
| 0.2844 | 70.97 | 1242 | 0.1543 | 0.1037 | 0.0285 |
| 0.2844 | 72.0 | 1260 | 0.1542 | 0.1012 | 0.0280 |
| 0.2844 | 72.97 | 1277 | 0.1534 | 0.1012 | 0.0276 |
| 0.2844 | 74.0 | 1295 | 0.1527 | 0.0996 | 0.0279 |
| 0.253 | 74.97 | 1312 | 0.1528 | 0.1004 | 0.0280 |
| 0.253 | 76.0 | 1330 | 0.1521 | 0.0980 | 0.0279 |
| 0.253 | 76.97 | 1347 | 0.1505 | 0.0996 | 0.0276 |
| 0.253 | 78.0 | 1365 | 0.1492 | 0.0996 | 0.0273 |
| 0.253 | 78.97 | 1382 | 0.1485 | 0.0996 | 0.0273 |
| 0.2396 | 80.0 | 1400 | 0.1471 | 0.0996 | 0.0277 |
| 0.2396 | 80.97 | 1417 | 0.1464 | 0.0980 | 0.0277 |
| 0.2396 | 82.0 | 1435 | 0.1459 | 0.0996 | 0.0283 |
| 0.2396 | 82.97 | 1452 | 0.1460 | 0.0980 | 0.0273 |
| 0.2396 | 84.0 | 1470 | 0.1460 | 0.0996 | 0.0277 |
| 0.2396 | 84.97 | 1487 | 0.1456 | 0.0971 | 0.0277 |
| 0.2443 | 86.0 | 1505 | 0.1455 | 0.0980 | 0.0279 |
| 0.2443 | 86.97 | 1522 | 0.1448 | 0.0971 | 0.0282 |
| 0.2443 | 88.0 | 1540 | 0.1443 | 0.0988 | 0.0280 |
| 0.2443 | 88.97 | 1557 | 0.1446 | 0.0988 | 0.0279 |
| 0.2443 | 90.0 | 1575 | 0.1442 | 0.0996 | 0.0279 |
| 0.2443 | 90.97 | 1592 | 0.1440 | 0.0971 | 0.0277 |
| 0.2406 | 92.0 | 1610 | 0.1441 | 0.0947 | 0.0271 |
| 0.2406 | 92.97 | 1627 | 0.1440 | 0.0955 | 0.0273 |
| 0.2406 | 94.0 | 1645 | 0.1438 | 0.0955 | 0.0273 |
| 0.2406 | 94.97 | 1662 | 0.1436 | 0.0947 | 0.0273 |
| 0.2406 | 96.0 | 1680 | 0.1435 | 0.0955 | 0.0274 |
| 0.2406 | 96.97 | 1697 | 0.1435 | 0.0963 | 0.0276 |
| 0.2309 | 97.14 | 1700 | 0.1436 | 0.0947 | 0.0273 |
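
Metrics of the kind reported above can be reproduced with the `evaluate` library (the `cer` metric additionally requires `jiwer`). The prediction/reference pair below is a made-up example for illustration only, not data from this model's evaluation set.

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Made-up Portuguese example pair, for illustration only.
predictions = ["o menino leu o livro"]
references = ["o menino leu um livro"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```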
### Framework versions
- Transformers 4.28.0
- PyTorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3