
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-2-4

This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:

## Model description

More information needed
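The underlying architecture is a wav2vec2 model with a CTC head: per audio frame it emits a distribution over characters plus a blank token, and transcriptions are obtained by taking the best token per frame, collapsing repeats, and dropping blanks. A minimal greedy-decoding sketch (the tiny vocabulary and logit matrix below are made up for illustration; the real checkpoint has its own vocabulary):

```python
# Minimal sketch of CTC greedy decoding, as used by wav2vec2-style models.
# The vocabulary and logits below are hypothetical, for illustration only.
import numpy as np

VOCAB = ["<pad>", "o", "l", "a"]  # index 0 acts as the CTC blank (made-up vocab)

def ctc_greedy_decode(logits: np.ndarray, blank_id: int = 0) -> str:
    """Pick the best token per frame, collapse repeats, drop blanks."""
    ids = logits.argmax(axis=-1)
    out = []
    prev = None
    for i in ids:
        if i != prev and i != blank_id:
            out.append(VOCAB[i])
        prev = i
    return "".join(out)

# Six frames favouring o, o, <pad>, l, l, a collapse to "ola"
logits = np.array([
    [0.1, 0.9, 0.0, 0.0],
    [0.1, 0.9, 0.0, 0.0],
    [0.9, 0.0, 0.1, 0.0],
    [0.0, 0.1, 0.9, 0.0],
    [0.0, 0.1, 0.9, 0.0],
    [0.0, 0.0, 0.1, 0.9],
])
print(ctc_greedy_decode(logits))  # → ola
```

The blank token is what lets CTC represent genuinely repeated characters: a blank frame between two identical tokens prevents them from being collapsed into one.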

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 30.5444       | 1.0   | 36   | 6.0803          | 1.0    | 1.0    |
| 30.5444       | 2.0   | 72   | 4.6965          | 1.0    | 1.0    |
| 10.2305       | 3.0   | 108  | 3.6812          | 1.0    | 1.0    |
| 10.2305       | 4.0   | 144  | 3.4078          | 0.965  | 0.9537 |
| 10.2305       | 5.0   | 180  | 3.2125          | 0.95   | 0.8998 |
| 4.437         | 6.0   | 216  | 3.1525          | 0.9492 | 0.9048 |
| 4.437         | 7.0   | 252  | 3.2147          | 0.9492 | 0.9146 |
| 4.437         | 8.0   | 288  | 3.0222          | 0.9508 | 0.8935 |
| 3.8259        | 9.0   | 324  | 2.9467          | 1.0    | 1.0    |
| 3.8259        | 10.0  | 360  | 2.9534          | 1.0    | 1.0    |
| 3.8259        | 11.0  | 396  | 2.8863          | 1.0    | 1.0    |
| 3.8025        | 12.0  | 432  | 2.8885          | 1.0    | 1.0    |
| 3.8025        | 13.0  | 468  | 2.8496          | 0.9992 | 0.9997 |
| 3.8906        | 14.0  | 504  | 2.8937          | 0.9967 | 0.9987 |
| 3.8906        | 15.0  | 540  | 2.8874          | 0.9717 | 0.9843 |
| 3.8906        | 16.0  | 576  | 2.8432          | 0.97   | 0.9739 |
| 3.9656        | 17.0  | 612  | 2.8498          | 0.9733 | 0.9747 |
| 3.9656        | 18.0  | 648  | 2.8246          | 0.9483 | 0.9380 |
| 3.9656        | 19.0  | 684  | 2.8476          | 0.955  | 0.9157 |
| 3.7122        | 20.0  | 720  | 2.8487          | 0.9608 | 0.9320 |
| 3.7122        | 21.0  | 756  | 2.9476          | 0.96   | 0.9413 |
| 3.7122        | 22.0  | 792  | 2.8280          | 0.9575 | 0.9146 |
| 3.4393        | 23.0  | 828  | 2.8241          | 0.9508 | 0.8927 |
| 3.4393        | 24.0  | 864  | 2.8278          | 0.965  | 0.9207 |
| 3.7183        | 25.0  | 900  | 2.8019          | 0.9575 | 0.9076 |
| 3.7183        | 26.0  | 936  | 2.7846          | 0.9525 | 0.8977 |
| 3.7183        | 27.0  | 972  | 2.8069          | 0.9642 | 0.8955 |
| 3.4876        | 28.0  | 1008 | 2.7941          | 0.96   | 0.8973 |
| 3.4876        | 29.0  | 1044 | 2.8143          | 0.9683 | 0.8711 |
| 3.4876        | 30.0  | 1080 | 2.7805          | 0.9683 | 0.8672 |
| 3.7973        | 31.0  | 1116 | 2.7645          | 0.9658 | 0.8532 |
| 3.7973        | 32.0  | 1152 | 2.7732          | 0.9742 | 0.8534 |
| 3.7973        | 33.0  | 1188 | 2.7968          | 0.9733 | 0.8537 |
| 3.4257        | 34.0  | 1224 | 2.7651          | 0.975  | 0.8587 |
| 3.4257        | 35.0  | 1260 | 2.7847          | 0.98   | 0.8449 |
| 3.4257        | 36.0  | 1296 | 2.7631          | 0.9817 | 0.8409 |
| 3.6149        | 37.0  | 1332 | 2.7716          | 0.9708 | 0.8436 |
| 3.6149        | 38.0  | 1368 | 2.7699          | 0.98   | 0.8302 |
| 3.5118        | 39.0  | 1404 | 2.7433          | 0.9792 | 0.8178 |
| 3.5118        | 40.0  | 1440 | 2.7654          | 0.9725 | 0.8181 |
| 3.5118        | 41.0  | 1476 | 2.7458          | 0.9817 | 0.8104 |
| 3.5272        | 42.0  | 1512 | 2.7577          | 0.9858 | 0.8121 |
| 3.5272        | 43.0  | 1548 | 2.7280          | 0.9933 | 0.8180 |
| 3.5272        | 44.0  | 1584 | 2.7223          | 0.9808 | 0.8140 |
| 3.3239        | 45.0  | 1620 | 2.7348          | 0.9842 | 0.8044 |
| 3.3239        | 46.0  | 1656 | 2.7225          | 0.9867 | 0.8044 |
| 3.3239        | 47.0  | 1692 | 2.7640          | 0.9875 | 0.7942 |
| 3.4254        | 48.0  | 1728 | 2.7388          | 0.9833 | 0.7955 |
| 3.4254        | 49.0  | 1764 | 2.7163          | 0.9867 | 0.7964 |
| 3.2168        | 50.0  | 1800 | 2.7176          | 0.99   | 0.7835 |
| 3.2168        | 51.0  | 1836 | 2.7010          | 0.9833 | 0.7896 |
| 3.2168        | 52.0  | 1872 | 2.7141          | 0.9867 | 0.7870 |
| 3.1638        | 53.0  | 1908 | 2.7013          | 0.985  | 0.7846 |
| 3.1638        | 54.0  | 1944 | 2.7287          | 0.9875 | 0.7917 |
| 3.1638        | 55.0  | 1980 | 2.6886          | 0.9892 | 0.7937 |
| 3.0805        | 56.0  | 2016 | 2.6875          | 0.9892 | 0.7793 |
| 3.0805        | 57.0  | 2052 | 2.7298          | 0.99   | 0.7876 |
| 3.0805        | 58.0  | 2088 | 2.7506          | 0.985  | 0.7829 |
| 3.1154        | 59.0  | 2124 | 2.6963          | 0.9892 | 0.7925 |
| 3.1154        | 60.0  | 2160 | 2.7002          | 0.9858 | 0.7823 |
| 3.1154        | 61.0  | 2196 | 2.6888          | 0.985  | 0.7819 |
| 2.9493        | 62.0  | 2232 | 2.7109          | 0.9825 | 0.7870 |
| 2.9493        | 63.0  | 2268 | 2.7069          | 0.9842 | 0.7780 |
| 2.8656        | 64.0  | 2304 | 2.7332          | 0.9842 | 0.7778 |
| 2.8656        | 65.0  | 2340 | 2.6759          | 0.9858 | 0.7841 |
| 2.8656        | 66.0  | 2376 | 2.6570          | 0.9858 | 0.7772 |
| 2.7412        | 67.0  | 2412 | 2.6872          | 0.9875 | 0.7659 |
| 2.7412        | 68.0  | 2448 | 2.7655          | 0.9817 | 0.7716 |
| 2.7412        | 69.0  | 2484 | 2.7470          | 0.98   | 0.7615 |
| 2.7649        | 70.0  | 2520 | 2.7192          | 0.9842 | 0.7736 |
| 2.7649        | 71.0  | 2556 | 2.6822          | 0.9792 | 0.7662 |
| 2.7649        | 72.0  | 2592 | 2.7063          | 0.9808 | 0.7671 |
| 2.7153        | 73.0  | 2628 | 2.7062          | 0.9792 | 0.7651 |
| 2.7153        | 74.0  | 2664 | 2.6431          | 0.9858 | 0.7706 |
| 2.6769        | 75.0  | 2700 | 2.6509          | 0.9892 | 0.7657 |
| 2.6769        | 76.0  | 2736 | 2.6543          | 0.985  | 0.7643 |
| 2.6769        | 77.0  | 2772 | 2.6779          | 0.9775 | 0.7646 |
| 2.6723        | 78.0  | 2808 | 2.6765          | 0.9833 | 0.7640 |
| 2.6723        | 79.0  | 2844 | 2.6687          | 0.985  | 0.7572 |
| 2.6723        | 80.0  | 2880 | 2.6857          | 0.9842 | 0.7634 |
| 2.7001        | 81.0  | 2916 | 2.6677          | 0.9858 | 0.7565 |
| 2.7001        | 82.0  | 2952 | 2.6569          | 0.9867 | 0.7538 |
| 2.7001        | 83.0  | 2988 | 2.6908          | 0.98   | 0.7538 |
| 2.6632        | 84.0  | 3024 | 2.6932          | 0.9858 | 0.7517 |
| 2.6632        | 85.0  | 3060 | 2.6426          | 0.9867 | 0.7546 |
| 2.6632        | 86.0  | 3096 | 2.6464          | 0.9825 | 0.7587 |
| 2.6488        | 87.0  | 3132 | 2.6865          | 0.9833 | 0.7602 |
| 2.6488        | 88.0  | 3168 | 2.6863          | 0.985  | 0.7577 |
| 2.7161        | 89.0  | 3204 | 2.6651          | 0.985  | 0.7546 |
| 2.7161        | 90.0  | 3240 | 2.6587          | 0.9825 | 0.7549 |
| 2.7161        | 91.0  | 3276 | 2.6709          | 0.9808 | 0.7552 |
| 2.6518        | 92.0  | 3312 | 2.6723          | 0.9883 | 0.7546 |
| 2.6518        | 93.0  | 3348 | 2.6523          | 0.985  | 0.7527 |
| 2.6518        | 94.0  | 3384 | 2.6503          | 0.9842 | 0.7514 |
| 2.6423        | 95.0  | 3420 | 2.6730          | 0.985  | 0.7544 |
| 2.6423        | 96.0  | 3456 | 2.6780          | 0.9917 | 0.7519 |
| 2.6423        | 97.0  | 3492 | 2.6833          | 0.9875 | 0.7547 |
| 2.762         | 98.0  | 3528 | 2.6739          | 0.9842 | 0.7509 |
| 2.762         | 99.0  | 3564 | 2.6718          | 0.9833 | 0.7511 |
| 2.6423        | 100.0 | 3600 | 2.6695          | 0.98   | 0.7530 |
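The Wer and Cer columns above are word error rate and character error rate: the Levenshtein (edit) distance between the reference and the hypothesis, divided by the reference length, computed over words and characters respectively. A minimal sketch of the computation (the Trainer typically delegates this to a metrics package such as `evaluate`/`jiwer`, but the definition is equivalent; the example strings are made up):

```python
# Illustrative WER/CER computation via Levenshtein distance.
def levenshtein(ref, hyp) -> int:
    """Edit distance (insertions, deletions, substitutions) between two sequences."""
    m, n = len(ref), len(hyp)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            cur[j] = min(prev[j] + 1,        # deletion
                         cur[j - 1] + 1,     # insertion
                         prev[j - 1] + cost) # substitution (or match)
        prev = cur
    return prev[n]

def wer(ref: str, hyp: str) -> float:
    """Word error rate: edit distance over word tokens / reference word count."""
    words = ref.split()
    return levenshtein(words, hyp.split()) / len(words)

def cer(ref: str, hyp: str) -> float:
    """Character error rate: edit distance over characters / reference length."""
    return levenshtein(ref, hyp) / len(ref)

print(wer("ola mundo", "ola mondo"))  # 1 of 2 words wrong → 0.5
print(cer("ola mundo", "ola mondo"))  # 1 of 9 chars wrong → 0.111...
```

Note that WER and CER can exceed 1.0 when the hypothesis contains many insertions, which is why early epochs in the table report exactly 1.0 (empty or fully wrong outputs) before both rates start to fall.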

### Framework versions