# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-02
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3867
- Wer: 0.2241
- Cer: 0.0617
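The reported Wer and Cer are word error rate and character error rate: Levenshtein edit distance over words (respectively characters), normalized by the reference length. A minimal sketch of these standard definitions (not the exact evaluation script used for this card):

```python
def edit_distance(ref, hyp):
    # Classic single-row dynamic-programming Levenshtein distance.
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,         # deletion
                                     dp[j - 1] + 1,     # insertion
                                     prev + (r != h))   # substitution
    return dp[-1]

def wer(ref, hyp):
    """Word error rate: word-level edits / number of reference words."""
    ref_words = ref.split()
    return edit_distance(ref_words, hyp.split()) / len(ref_words)

def cer(ref, hyp):
    """Character error rate: character-level edits / reference characters."""
    ref_chars = ref.replace(" ", "")
    return edit_distance(ref_chars, hyp.replace(" ", "")) / len(ref_chars)
```

So the final Wer of 0.2241 means roughly one word-level edit per 4.5 reference words.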
## Model description
More information needed
## Intended uses & limitations
More information needed
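Like its base model, this is a CTC acoustic model: transcripts are typically obtained by greedy CTC decoding, i.e. taking the most likely token per audio frame, collapsing consecutive repeats, and dropping the blank token. A minimal sketch in plain Python, independent of the actual tokenizer (the `<pad>` blank and `|` word delimiter follow the usual wav2vec2 vocabulary convention):

```python
BLANK = "<pad>"  # wav2vec2 tokenizers use the pad token as the CTC blank

def ctc_greedy_decode(frame_tokens):
    """Collapse repeated frame predictions, then remove blanks."""
    out, prev = [], None
    for tok in frame_tokens:
        if tok != prev and tok != BLANK:
            out.append(tok)
        prev = tok
    # "|" is the word delimiter in the usual wav2vec2 vocabulary
    return "".join(out).replace("|", " ")
```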
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
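The effective batch size and learning-rate trajectory follow directly from these settings; a minimal sketch (assuming zero warmup steps, which the log does not state):

```python
LEARNING_RATE = 3e-5
TOTAL_STEPS = 8600      # 86 optimizer steps/epoch x 100 epochs (from the results table)
WARMUP_STEPS = 0        # assumption: no warmup is listed above

def linear_lr(step):
    """Linear schedule: ramp up over warmup, then decay linearly to zero."""
    if step < WARMUP_STEPS:
        return LEARNING_RATE * step / max(1, WARMUP_STEPS)
    return LEARNING_RATE * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))

# effective batch size = per-device train batch x gradient accumulation steps
effective_batch = 16 * 2   # = 32, matching total_train_batch_size above
```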
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:---:|:---:|:---:|:---:|:---:|:---:|
26.8614 | 1.0 | 86 | 4.1075 | 1.0 | 1.0 |
9.1211 | 2.0 | 172 | 3.3065 | 0.9589 | 0.9337 |
4.4259 | 3.0 | 258 | 3.1101 | 0.9936 | 0.9963 |
4.0065 | 4.0 | 344 | 3.1175 | 1.0 | 1.0 |
4.1425 | 5.0 | 430 | 2.9843 | 1.0 | 1.0 |
3.5388 | 6.0 | 516 | 2.9247 | 0.9816 | 0.9890 |
4.1389 | 7.0 | 602 | 2.9091 | 0.9560 | 0.9350 |
4.1389 | 8.0 | 688 | 2.9170 | 0.9550 | 0.9197 |
3.748 | 9.0 | 774 | 2.8611 | 0.9567 | 0.8971 |
3.8952 | 10.0 | 860 | 2.8640 | 0.9616 | 0.8726 |
3.6117 | 11.0 | 946 | 2.8944 | 0.9684 | 0.8691 |
4.1201 | 12.0 | 1032 | 2.8589 | 0.9704 | 0.8534 |
3.8543 | 13.0 | 1118 | 2.8410 | 0.9748 | 0.8257 |
3.8936 | 14.0 | 1204 | 2.7883 | 0.9763 | 0.7919 |
3.8936 | 15.0 | 1290 | 2.7922 | 0.9660 | 0.7798 |
3.5855 | 16.0 | 1376 | 2.7786 | 0.9814 | 0.7712 |
3.9366 | 17.0 | 1462 | 2.7948 | 0.9670 | 0.7707 |
3.4395 | 18.0 | 1548 | 2.7546 | 0.9533 | 0.7897 |
3.66 | 19.0 | 1634 | 2.7427 | 0.9701 | 0.7759 |
3.5659 | 20.0 | 1720 | 2.7296 | 0.9665 | 0.7677 |
3.3732 | 21.0 | 1806 | 2.6593 | 0.9785 | 0.7440 |
3.3732 | 22.0 | 1892 | 2.6909 | 0.9648 | 0.7411 |
3.6926 | 23.0 | 1978 | 2.6636 | 0.9697 | 0.7353 |
3.5373 | 24.0 | 2064 | 2.6966 | 0.9670 | 0.7250 |
3.53 | 25.0 | 2150 | 2.6241 | 0.9741 | 0.7176 |
3.3077 | 26.0 | 2236 | 2.6299 | 0.9628 | 0.7247 |
3.0885 | 27.0 | 2322 | 2.6890 | 0.9697 | 0.7280 |
3.1043 | 28.0 | 2408 | 2.6306 | 0.9594 | 0.7102 |
3.1043 | 29.0 | 2494 | 2.5748 | 0.9672 | 0.6988 |
3.0444 | 30.0 | 2580 | 2.5896 | 0.9682 | 0.6999 |
2.8768 | 31.0 | 2666 | 2.5596 | 0.9746 | 0.6906 |
2.9169 | 32.0 | 2752 | 2.5224 | 0.9675 | 0.6819 |
2.5564 | 33.0 | 2838 | 2.4848 | 0.9645 | 0.6876 |
2.8245 | 34.0 | 2924 | 2.4788 | 0.9621 | 0.6831 |
2.5474 | 35.0 | 3010 | 2.4651 | 0.9660 | 0.6664 |
2.5474 | 36.0 | 3096 | 2.4165 | 0.9540 | 0.6799 |
2.7496 | 37.0 | 3182 | 2.3682 | 0.9621 | 0.6593 |
2.5884 | 38.0 | 3268 | 2.3454 | 0.9591 | 0.6501 |
2.5919 | 39.0 | 3354 | 2.3438 | 0.9633 | 0.6536 |
2.686 | 40.0 | 3440 | 2.2963 | 0.9520 | 0.6457 |
2.3805 | 41.0 | 3526 | 2.2551 | 0.9599 | 0.6387 |
2.5631 | 42.0 | 3612 | 2.2398 | 0.9530 | 0.6330 |
2.5631 | 43.0 | 3698 | 2.1721 | 0.9489 | 0.6122 |
2.3876 | 44.0 | 3784 | 2.2873 | 0.9476 | 0.6192 |
2.4841 | 45.0 | 3870 | 2.0955 | 0.9457 | 0.6164 |
2.4018 | 46.0 | 3956 | 2.0725 | 0.9381 | 0.6160 |
2.2015 | 47.0 | 4042 | 2.1368 | 0.9369 | 0.5877 |
2.181 | 48.0 | 4128 | 2.0029 | 0.9327 | 0.5861 |
2.1738 | 49.0 | 4214 | 2.0337 | 0.9435 | 0.6281 |
2.2699 | 50.0 | 4300 | 1.8319 | 0.9320 | 0.5501 |
2.2699 | 51.0 | 4386 | 1.7263 | 0.9068 | 0.5057 |
1.9857 | 52.0 | 4472 | 1.6162 | 0.8992 | 0.4963 |
1.8812 | 53.0 | 4558 | 1.4509 | 0.8505 | 0.4150 |
1.8218 | 54.0 | 4644 | 1.3207 | 0.8375 | 0.3834 |
1.696 | 55.0 | 4730 | 1.2113 | 0.7827 | 0.3306 |
1.613 | 56.0 | 4816 | 1.1816 | 0.7379 | 0.2941 |
1.4016 | 57.0 | 4902 | 1.0539 | 0.7000 | 0.2565 |
1.4016 | 58.0 | 4988 | 0.9564 | 0.6472 | 0.2200 |
1.3268 | 59.0 | 5074 | 0.9172 | 0.6144 | 0.2026 |
1.2608 | 60.0 | 5160 | 0.7483 | 0.5664 | 0.1802 |
1.1512 | 61.0 | 5246 | 0.7947 | 0.5270 | 0.1609 |
1.1152 | 62.0 | 5332 | 0.7427 | 0.4965 | 0.1496 |
1.0352 | 63.0 | 5418 | 0.7145 | 0.4593 | 0.1372 |
1.0375 | 64.0 | 5504 | 0.5751 | 0.4309 | 0.1266 |
1.0375 | 65.0 | 5590 | 0.5847 | 0.4426 | 0.1270 |
0.9703 | 66.0 | 5676 | 0.6924 | 0.4177 | 0.1214 |
0.9189 | 67.0 | 5762 | 0.6237 | 0.3802 | 0.1097 |
0.9436 | 68.0 | 5848 | 0.6301 | 0.3531 | 0.1037 |
0.923 | 69.0 | 5934 | 0.5813 | 0.3394 | 0.0968 |
0.8563 | 70.0 | 6020 | 0.5515 | 0.3323 | 0.0923 |
0.805 | 71.0 | 6106 | 0.4729 | 0.3171 | 0.0888 |
0.805 | 72.0 | 6192 | 0.5380 | 0.3017 | 0.0840 |
0.7711 | 73.0 | 6278 | 0.4278 | 0.2924 | 0.0819 |
0.892 | 74.0 | 6364 | 0.5463 | 0.2765 | 0.0780 |
0.7319 | 75.0 | 6450 | 0.5149 | 0.2782 | 0.0771 |
0.7468 | 76.0 | 6536 | 0.5249 | 0.2674 | 0.0754 |
0.7193 | 77.0 | 6622 | 0.5031 | 0.2542 | 0.0720 |
0.6805 | 78.0 | 6708 | 0.3962 | 0.2471 | 0.0692 |
0.6805 | 79.0 | 6794 | 0.5201 | 0.2498 | 0.0707 |
0.6853 | 80.0 | 6880 | 0.5131 | 0.2408 | 0.0676 |
0.6965 | 81.0 | 6966 | 0.4295 | 0.2359 | 0.0658 |
0.5952 | 82.0 | 7052 | 0.4917 | 0.2388 | 0.0672 |
0.6305 | 83.0 | 7138 | 0.4658 | 0.2356 | 0.0664 |
0.6792 | 84.0 | 7224 | 0.4270 | 0.2246 | 0.0639 |
0.6227 | 85.0 | 7310 | 0.4015 | 0.2266 | 0.0629 |
0.6227 | 86.0 | 7396 | 0.3867 | 0.2241 | 0.0617 |
0.6235 | 87.0 | 7482 | 0.4972 | 0.2251 | 0.0627 |
0.6418 | 88.0 | 7568 | 0.5182 | 0.2312 | 0.0638 |
0.5944 | 89.0 | 7654 | 0.5077 | 0.2224 | 0.0623 |
0.6003 | 90.0 | 7740 | 0.5102 | 0.2207 | 0.0617 |
0.6756 | 91.0 | 7826 | 0.4906 | 0.2136 | 0.0599 |
0.6084 | 92.0 | 7912 | 0.4829 | 0.2200 | 0.0614 |
0.6084 | 93.0 | 7998 | 0.4089 | 0.2143 | 0.0600 |
0.5737 | 94.0 | 8084 | 0.4178 | 0.2165 | 0.0601 |
0.594 | 95.0 | 8170 | 0.3984 | 0.2163 | 0.0602 |
0.5532 | 96.0 | 8256 | 0.4473 | 0.2143 | 0.0600 |
0.5858 | 97.0 | 8342 | 0.4752 | 0.2165 | 0.0604 |
0.5568 | 98.0 | 8428 | 0.4746 | 0.2146 | 0.0602 |
0.5664 | 99.0 | 8514 | 0.4429 | 0.2114 | 0.0593 |
0.6462 | 100.0 | 8600 | 0.4549 | 0.2143 | 0.0597 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3