---
tags:
- generated_from_trainer
---


# wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-5

This model is a fine-tuned version of Edresson/wav2vec2-large-xlsr-coraa-portuguese on an unknown dataset. It achieves the following results on the evaluation set:

## Model description

More information needed

## Intended uses & limitations

More information needed
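
Although the intended-use details are not filled in, the checkpoint is a standard CTC speech recognizer fine-tuned from a wav2vec2 XLSR model, so it should work with the usual `transformers` Wav2Vec2 classes. The snippet below is a minimal sketch of transcribing a single Portuguese audio file; the model ID and the file name `example.wav` are placeholders, not values documented in this card.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder model ID: replace with this checkpoint's actual Hugging Face repo ID
# (or a local path to the fine-tuned weights).
MODEL_ID = "wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-5"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# XLSR models expect 16 kHz mono audio; "example.wav" is a placeholder file name.
speech, _ = librosa.load("example.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```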

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 15.7346       | 0.97  | 16   | 9.2489          | 1.0    | 1.0    |
| 15.7346       | 2.0   | 33   | 4.0322          | 1.0    | 1.0    |
| 15.7346       | 2.97  | 49   | 3.4177          | 1.0    | 1.0    |
| 15.7346       | 4.0   | 66   | 3.1683          | 1.0    | 1.0    |
| 15.7346       | 4.97  | 82   | 3.0383          | 1.0    | 1.0    |
| 15.7346       | 6.0   | 99   | 2.9783          | 1.0    | 1.0    |
| 5.3023        | 6.97  | 115  | 2.9396          | 1.0    | 1.0    |
| 5.3023        | 8.0   | 132  | 2.9159          | 1.0    | 1.0    |
| 5.3023        | 8.97  | 148  | 2.9004          | 1.0    | 1.0    |
| 5.3023        | 10.0  | 165  | 2.8862          | 1.0    | 1.0    |
| 5.3023        | 10.97 | 181  | 2.8701          | 1.0    | 1.0    |
| 5.3023        | 12.0  | 198  | 2.8703          | 1.0    | 1.0    |
| 2.9256        | 12.97 | 214  | 2.8601          | 1.0    | 1.0    |
| 2.9256        | 14.0  | 231  | 2.8912          | 1.0    | 1.0    |
| 2.9256        | 14.97 | 247  | 2.8999          | 1.0    | 1.0    |
| 2.9256        | 16.0  | 264  | 2.8535          | 1.0    | 1.0    |
| 2.9256        | 16.97 | 280  | 2.8630          | 1.0    | 1.0    |
| 2.9256        | 18.0  | 297  | 2.8580          | 1.0    | 1.0    |
| 2.8755        | 18.97 | 313  | 2.8584          | 1.0    | 1.0    |
| 2.8755        | 20.0  | 330  | 2.8444          | 1.0    | 1.0    |
| 2.8755        | 20.97 | 346  | 2.8458          | 1.0    | 1.0    |
| 2.8755        | 22.0  | 363  | 2.8415          | 1.0    | 1.0    |
| 2.8755        | 22.97 | 379  | 2.8386          | 1.0    | 1.0    |
| 2.8755        | 24.0  | 396  | 2.8486          | 1.0    | 1.0    |
| 2.86          | 24.97 | 412  | 2.8298          | 1.0    | 1.0    |
| 2.86          | 26.0  | 429  | 2.8202          | 1.0    | 1.0    |
| 2.86          | 26.97 | 445  | 2.7891          | 1.0    | 1.0    |
| 2.86          | 28.0  | 462  | 2.7456          | 1.0    | 1.0    |
| 2.86          | 28.97 | 478  | 2.6815          | 1.0    | 1.0    |
| 2.86          | 30.0  | 495  | 2.5857          | 1.0    | 1.0    |
| 2.792         | 30.97 | 511  | 2.4622          | 1.0    | 0.9998 |
| 2.792         | 32.0  | 528  | 2.2677          | 0.9989 | 0.8943 |
| 2.792         | 32.97 | 544  | 1.9538          | 1.0    | 0.6955 |
| 2.792         | 34.0  | 561  | 1.5407          | 1.0    | 0.4862 |
| 2.792         | 34.97 | 577  | 1.1644          | 0.8091 | 0.2150 |
| 2.792         | 36.0  | 594  | 0.8820          | 0.5432 | 0.1271 |
| 2.0829        | 36.97 | 610  | 0.6935          | 0.3648 | 0.0828 |
| 2.0829        | 38.0  | 627  | 0.5650          | 0.2614 | 0.0618 |
| 2.0829        | 38.97 | 643  | 0.4996          | 0.2352 | 0.0565 |
| 2.0829        | 40.0  | 660  | 0.4473          | 0.2057 | 0.0513 |
| 2.0829        | 40.97 | 676  | 0.4028          | 0.2023 | 0.0505 |
| 2.0829        | 42.0  | 693  | 0.3718          | 0.1807 | 0.0448 |
| 0.9316        | 42.97 | 709  | 0.3387          | 0.1830 | 0.0446 |
| 0.9316        | 44.0  | 726  | 0.3098          | 0.1761 | 0.0425 |
| 0.9316        | 44.97 | 742  | 0.2873          | 0.1648 | 0.0402 |
| 0.9316        | 46.0  | 759  | 0.2684          | 0.1602 | 0.0382 |
| 0.9316        | 46.97 | 775  | 0.2584          | 0.1523 | 0.0363 |
| 0.9316        | 48.0  | 792  | 0.2493          | 0.15   | 0.0363 |
| 0.5883        | 48.97 | 808  | 0.2373          | 0.1398 | 0.0349 |
| 0.5883        | 50.0  | 825  | 0.2287          | 0.1375 | 0.0347 |
| 0.5883        | 50.97 | 841  | 0.2239          | 0.1386 | 0.0341 |
| 0.5883        | 52.0  | 858  | 0.2143          | 0.1420 | 0.0345 |
| 0.5883        | 52.97 | 874  | 0.2091          | 0.1341 | 0.0337 |
| 0.5883        | 54.0  | 891  | 0.2031          | 0.1307 | 0.0324 |
| 0.4931        | 54.97 | 907  | 0.1941          | 0.1239 | 0.0308 |
| 0.4931        | 56.0  | 924  | 0.1872          | 0.1216 | 0.0306 |
| 0.4931        | 56.97 | 940  | 0.1862          | 0.1227 | 0.0318 |
| 0.4931        | 58.0  | 957  | 0.1850          | 0.1170 | 0.0306 |
| 0.4931        | 58.97 | 973  | 0.1824          | 0.1159 | 0.0320 |
| 0.4931        | 60.0  | 990  | 0.1790          | 0.1193 | 0.0306 |
| 0.4194        | 60.97 | 1006 | 0.1770          | 0.1193 | 0.0306 |
| 0.4194        | 62.0  | 1023 | 0.1735          | 0.1182 | 0.0298 |
| 0.4194        | 62.97 | 1039 | 0.1686          | 0.1170 | 0.0290 |
| 0.4194        | 64.0  | 1056 | 0.1648          | 0.1159 | 0.0294 |
| 0.4194        | 64.97 | 1072 | 0.1617          | 0.1091 | 0.0279 |
| 0.4194        | 66.0  | 1089 | 0.1576          | 0.1068 | 0.0281 |
| 0.3538        | 66.97 | 1105 | 0.1562          | 0.1045 | 0.0277 |
| 0.3538        | 68.0  | 1122 | 0.1548          | 0.1011 | 0.0277 |
| 0.3538        | 68.97 | 1138 | 0.1542          | 0.1045 | 0.0281 |
| 0.3538        | 70.0  | 1155 | 0.1528          | 0.1034 | 0.0285 |
| 0.3538        | 70.97 | 1171 | 0.1500          | 0.1023 | 0.0283 |
| 0.3538        | 72.0  | 1188 | 0.1479          | 0.0932 | 0.0265 |
| 0.313         | 72.97 | 1204 | 0.1461          | 0.0966 | 0.0271 |
| 0.313         | 74.0  | 1221 | 0.1465          | 0.0943 | 0.0267 |
| 0.313         | 74.97 | 1237 | 0.1438          | 0.0932 | 0.0263 |
| 0.313         | 76.0  | 1254 | 0.1433          | 0.0943 | 0.0263 |
| 0.313         | 76.97 | 1270 | 0.1421          | 0.0943 | 0.0253 |
| 0.313         | 78.0  | 1287 | 0.1407          | 0.0909 | 0.0257 |
| 0.3075        | 78.97 | 1303 | 0.1399          | 0.0875 | 0.0251 |
| 0.3075        | 80.0  | 1320 | 0.1378          | 0.0875 | 0.0248 |
| 0.3075        | 80.97 | 1336 | 0.1366          | 0.0886 | 0.0244 |
| 0.3075        | 82.0  | 1353 | 0.1376          | 0.0909 | 0.0257 |
| 0.3075        | 82.97 | 1369 | 0.1344          | 0.0864 | 0.0253 |
| 0.3075        | 84.0  | 1386 | 0.1348          | 0.0943 | 0.0261 |
| 0.2861        | 84.97 | 1402 | 0.1349          | 0.0955 | 0.0261 |
| 0.2861        | 86.0  | 1419 | 0.1339          | 0.0920 | 0.0255 |
| 0.2861        | 86.97 | 1435 | 0.1335          | 0.0943 | 0.0261 |
| 0.2861        | 88.0  | 1452 | 0.1330          | 0.0955 | 0.0259 |
| 0.2861        | 88.97 | 1468 | 0.1326          | 0.0955 | 0.0257 |
| 0.2861        | 90.0  | 1485 | 0.1322          | 0.0955 | 0.0257 |
| 0.2778        | 90.97 | 1501 | 0.1320          | 0.0943 | 0.0259 |
| 0.2778        | 92.0  | 1518 | 0.1329          | 0.0966 | 0.0263 |
| 0.2778        | 92.97 | 1534 | 0.1328          | 0.0955 | 0.0267 |
| 0.2778        | 94.0  | 1551 | 0.1329          | 0.0966 | 0.0265 |
| 0.2778        | 94.97 | 1567 | 0.1324          | 0.0943 | 0.0265 |
| 0.2778        | 96.0  | 1584 | 0.1319          | 0.0932 | 0.0263 |
| 0.3134        | 96.97 | 1600 | 0.1321          | 0.0932 | 0.0265 |
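
For context, the Wer and Cer columns above are the word error rate and character error rate on the validation set. A minimal sketch of how these metrics are typically computed with the `evaluate` library (which uses `jiwer` under the hood) is shown below; the reference and predicted sentences are made-up placeholders, not examples from the training data.

```python
import evaluate

# Load the standard word and character error rate metrics.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["o menino leu o livro"]    # ground-truth transcripts (placeholder)
predictions = ["o menino leu o livros"]  # model outputs (placeholder)

wer = wer_metric.compute(predictions=predictions, references=references)
cer = cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```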

### Framework versions