---
tags:
- generated_from_trainer
---


# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-05

This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set (final epoch, from the training table below):
- Loss: 0.2177
- Wer: 0.1399
- Cer: 0.0430
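A minimal, hypothetical inference sketch using the 🤗 Transformers `pipeline` API. The repository id below is assumed from the model name in this card; substitute the actual Hub path of this checkpoint, and note that wav2vec2-XLSR checkpoints expect 16 kHz mono audio:

```python
from transformers import pipeline

# Hypothetical repo id -- replace with the actual Hub path of this checkpoint.
asr = pipeline(
    "automatic-speech-recognition",
    model="wav2vec2-large-xlsr-mecita-coraa-portuguese-all-05",
)

# Transcribe a 16 kHz mono audio file (the path is a placeholder).
result = asr("example.wav")
print(result["text"])
```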

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

More information needed

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 28.1646       | 1.0   | 86   | 4.8610          | 0.9832 | 0.9803 |
| 9.4454        | 2.0   | 172  | 3.3327          | 0.9567 | 0.9034 |
| 4.3702        | 3.0   | 258  | 3.1962          | 1.0    | 1.0    |
| 4.398         | 4.0   | 344  | 2.9906          | 1.0    | 1.0    |
| 3.7462        | 5.0   | 430  | 2.9604          | 1.0    | 1.0    |
| 3.9079        | 6.0   | 516  | 2.9447          | 0.9659 | 0.9651 |
| 3.9602        | 7.0   | 602  | 2.8960          | 0.9591 | 0.9286 |
| 3.9602        | 8.0   | 688  | 2.8418          | 0.9555 | 0.8580 |
| 3.7183        | 9.0   | 774  | 2.8463          | 0.9516 | 0.8567 |
| 3.7229        | 10.0  | 860  | 2.9000          | 0.9706 | 0.8790 |
| 3.7056        | 11.0  | 946  | 2.7891          | 0.9747 | 0.8368 |
| 4.3909        | 12.0  | 1032 | 2.8128          | 0.9681 | 0.8091 |
| 3.2494        | 13.0  | 1118 | 2.7410          | 0.9776 | 0.7792 |
| 4.1417        | 14.0  | 1204 | 2.7921          | 0.9781 | 0.8052 |
| 4.1417        | 15.0  | 1290 | 2.7801          | 0.9720 | 0.7893 |
| 3.5813        | 16.0  | 1376 | 2.7245          | 0.9735 | 0.7797 |
| 3.7739        | 17.0  | 1462 | 2.7206          | 0.9633 | 0.7592 |
| 3.2666        | 18.0  | 1548 | 2.6764          | 0.9786 | 0.7518 |
| 3.5191        | 19.0  | 1634 | 2.6704          | 0.9771 | 0.7560 |
| 4.0333        | 20.0  | 1720 | 2.6269          | 0.9757 | 0.7216 |
| 3.155         | 21.0  | 1806 | 2.6349          | 0.9710 | 0.7388 |
| 3.155         | 22.0  | 1892 | 2.7308          | 0.9557 | 0.7501 |
| 3.4096        | 23.0  | 1978 | 2.5728          | 0.9727 | 0.7167 |
| 3.6189        | 24.0  | 2064 | 2.6209          | 0.9706 | 0.7134 |
| 3.1681        | 25.0  | 2150 | 2.5653          | 0.9662 | 0.7091 |
| 2.851         | 26.0  | 2236 | 2.5492          | 0.9601 | 0.7133 |
| 3.0625        | 27.0  | 2322 | 2.4696          | 0.9584 | 0.7059 |
| 2.6773        | 28.0  | 2408 | 2.4480          | 0.9708 | 0.6835 |
| 2.6773        | 29.0  | 2494 | 2.4551          | 0.9676 | 0.6831 |
| 2.6704        | 30.0  | 2580 | 2.3923          | 0.9615 | 0.6865 |
| 2.9334        | 31.0  | 2666 | 2.3834          | 0.9620 | 0.6748 |
| 2.8201        | 32.0  | 2752 | 2.3488          | 0.9615 | 0.6599 |
| 2.4969        | 33.0  | 2838 | 2.2633          | 0.9637 | 0.6578 |
| 2.5507        | 34.0  | 2924 | 2.2478          | 0.9744 | 0.6386 |
| 2.4824        | 35.0  | 3010 | 2.1524          | 0.9589 | 0.6350 |
| 2.4824        | 36.0  | 3096 | 2.1081          | 0.9523 | 0.6110 |
| 2.3294        | 37.0  | 3182 | 2.0454          | 0.9486 | 0.6230 |
| 2.2776        | 38.0  | 3268 | 1.9534          | 0.9435 | 0.5944 |
| 2.2266        | 39.0  | 3354 | 1.8659          | 0.9389 | 0.5765 |
| 2.2176        | 40.0  | 3440 | 1.8386          | 0.9277 | 0.5618 |
| 2.1509        | 41.0  | 3526 | 1.7009          | 0.9031 | 0.5146 |
| 2.0032        | 42.0  | 3612 | 1.5778          | 0.8936 | 0.5060 |
| 2.0032        | 43.0  | 3698 | 1.4811          | 0.8825 | 0.4663 |
| 1.8864        | 44.0  | 3784 | 1.2900          | 0.8576 | 0.4236 |
| 1.8023        | 45.0  | 3870 | 1.1809          | 0.7851 | 0.3400 |
| 1.7414        | 46.0  | 3956 | 1.0532          | 0.7802 | 0.3238 |
| 1.5595        | 47.0  | 4042 | 0.9481          | 0.7206 | 0.2759 |
| 1.4958        | 48.0  | 4128 | 0.8725          | 0.6960 | 0.2557 |
| 1.3644        | 49.0  | 4214 | 0.8044          | 0.6486 | 0.2255 |
| 1.4496        | 50.0  | 4300 | 0.6878          | 0.6074 | 0.1988 |
| 1.4496        | 51.0  | 4386 | 0.6003          | 0.5330 | 0.1622 |
| 1.0768        | 52.0  | 4472 | 0.5278          | 0.4609 | 0.1385 |
| 0.9745        | 53.0  | 4558 | 0.4708          | 0.3923 | 0.1158 |
| 0.8776        | 54.0  | 4644 | 0.4379          | 0.3558 | 0.1038 |
| 0.8135        | 55.0  | 4730 | 0.4097          | 0.3234 | 0.0945 |
| 0.7573        | 56.0  | 4816 | 0.3926          | 0.3088 | 0.0874 |
| 0.6982        | 57.0  | 4902 | 0.3704          | 0.2801 | 0.0801 |
| 0.6982        | 58.0  | 4988 | 0.3488          | 0.2587 | 0.0748 |
| 0.6478        | 59.0  | 5074 | 0.3264          | 0.2541 | 0.0710 |
| 0.6391        | 60.0  | 5160 | 0.3236          | 0.2314 | 0.0664 |
| 0.5856        | 61.0  | 5246 | 0.3106          | 0.2283 | 0.0644 |
| 0.5626        | 62.0  | 5332 | 0.3044          | 0.2176 | 0.0621 |
| 0.5698        | 63.0  | 5418 | 0.2993          | 0.2127 | 0.0601 |
| 0.5238        | 64.0  | 5504 | 0.2926          | 0.2039 | 0.0585 |
| 0.5238        | 65.0  | 5590 | 0.2852          | 0.2047 | 0.0577 |
| 0.4932        | 66.0  | 5676 | 0.2698          | 0.1932 | 0.0558 |
| 0.446         | 67.0  | 5762 | 0.2602          | 0.1932 | 0.0541 |
| 0.464         | 68.0  | 5848 | 0.2590          | 0.1852 | 0.0531 |
| 0.4482        | 69.0  | 5934 | 0.2508          | 0.1813 | 0.0520 |
| 0.4378        | 70.0  | 6020 | 0.2578          | 0.1691 | 0.0500 |
| 0.4682        | 71.0  | 6106 | 0.2496          | 0.1757 | 0.0505 |
| 0.4682        | 72.0  | 6192 | 0.2445          | 0.1713 | 0.0492 |
| 0.4296        | 73.0  | 6278 | 0.2445          | 0.1640 | 0.0483 |
| 0.4032        | 74.0  | 6364 | 0.2406          | 0.1640 | 0.0485 |
| 0.4357        | 75.0  | 6450 | 0.2398          | 0.1643 | 0.0484 |
| 0.4126        | 76.0  | 6536 | 0.2398          | 0.1611 | 0.0480 |
| 0.4006        | 77.0  | 6622 | 0.2343          | 0.1572 | 0.0468 |
| 0.3973        | 78.0  | 6708 | 0.2310          | 0.1531 | 0.0464 |
| 0.3973        | 79.0  | 6794 | 0.2368          | 0.1521 | 0.0459 |
| 0.3795        | 80.0  | 6880 | 0.2318          | 0.1523 | 0.0457 |
| 0.3926        | 81.0  | 6966 | 0.2287          | 0.1509 | 0.0455 |
| 0.3836        | 82.0  | 7052 | 0.2277          | 0.1519 | 0.0456 |
| 0.3746        | 83.0  | 7138 | 0.2245          | 0.1497 | 0.0445 |
| 0.3499        | 84.0  | 7224 | 0.2234          | 0.1514 | 0.0445 |
| 0.3572        | 85.0  | 7310 | 0.2234          | 0.1460 | 0.0444 |
| 0.3572        | 86.0  | 7396 | 0.2207          | 0.1470 | 0.0442 |
| 0.3358        | 87.0  | 7482 | 0.2186          | 0.1431 | 0.0441 |
| 0.3686        | 88.0  | 7568 | 0.2199          | 0.1487 | 0.0444 |
| 0.3572        | 89.0  | 7654 | 0.2194          | 0.1441 | 0.0442 |
| 0.3535        | 90.0  | 7740 | 0.2191          | 0.1453 | 0.0443 |
| 0.334         | 91.0  | 7826 | 0.2181          | 0.1443 | 0.0441 |
| 0.3153        | 92.0  | 7912 | 0.2177          | 0.1436 | 0.0438 |
| 0.3153        | 93.0  | 7998 | 0.2174          | 0.1412 | 0.0434 |
| 0.33          | 94.0  | 8084 | 0.2182          | 0.1455 | 0.0436 |
| 0.3167        | 95.0  | 8170 | 0.2181          | 0.1429 | 0.0432 |
| 0.3217        | 96.0  | 8256 | 0.2174          | 0.1429 | 0.0430 |
| 0.31          | 97.0  | 8342 | 0.2180          | 0.1394 | 0.0428 |
| 0.3307        | 98.0  | 8428 | 0.2179          | 0.1424 | 0.0431 |
| 0.3392        | 99.0  | 8514 | 0.2182          | 0.1412 | 0.0429 |
| 0.3468        | 100.0 | 8600 | 0.2177          | 0.1399 | 0.0430 |
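The Wer and Cer columns above are word and character error rates: Levenshtein edit distance between hypothesis and reference, normalized by the reference length in words or characters. A minimal plain-Python sketch of how such metrics are computed (illustrative only, not the evaluation code used for this card):

```python
def edit_distance(ref, hyp):
    # Classic dynamic-programming Levenshtein distance over sequences
    # (lists of words for WER, strings of characters for CER).
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # prev holds dp[i-1][j-1]; dp[j] still holds dp[i-1][j].
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (r != h))  # substitution/match
    return dp[-1]

def wer(reference, hypothesis):
    # Word error rate: edit distance over word sequences.
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    # Character error rate: edit distance over character sequences.
    return edit_distance(reference, hypothesis) / len(reference)

print(wer("o gato subiu no telhado", "o gato subiu do telhado"))  # one word wrong out of five
```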

### Framework versions

More information needed