
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-4

This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set (final evaluation, epoch 93.33):

- Loss: 1.5957
- Wer: 0.9934
- Cer: 0.4456

## Model description

More information needed

## Intended uses & limitations

More information needed
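
The intended use is not documented here, but since the base checkpoint is a Portuguese wav2vec 2.0 CTC model, the natural application is automatic speech recognition on 16 kHz Portuguese audio. Below is a minimal inference sketch, not an official example: the repository id is assumed from the model name above, and the audio file path is a hypothetical placeholder.

```python
# Minimal ASR inference sketch (assumptions: the checkpoint is published under the
# repository id below and the input is 16 kHz mono audio; adjust both as needed).
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-4"  # assumed repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load a hypothetical audio file, resampled to the 16 kHz rate XLSR models expect.
speech, _ = librosa.load("sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```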

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 13.7726 | 0.93 | 7 | 11.6190 | 0.9956 | 0.8278 |
| 13.7726 | 2.0 | 15 | 8.5051 | 1.0 | 1.0 |
| 13.7726 | 2.93 | 22 | 6.2165 | 1.0 | 1.0 |
| 13.7726 | 4.0 | 30 | 4.2514 | 1.0 | 1.0 |
| 13.7726 | 4.93 | 37 | 3.6613 | 1.0 | 1.0 |
| 13.7726 | 6.0 | 45 | 3.4125 | 1.0 | 1.0 |
| 13.7726 | 6.93 | 52 | 3.3004 | 1.0 | 1.0 |
| 13.7726 | 8.0 | 60 | 3.1753 | 1.0 | 1.0 |
| 13.7726 | 8.93 | 67 | 3.1030 | 1.0 | 1.0 |
| 13.7726 | 10.0 | 75 | 3.0753 | 1.0 | 1.0 |
| 13.7726 | 10.93 | 82 | 3.0253 | 1.0 | 1.0 |
| 13.7726 | 12.0 | 90 | 3.0037 | 1.0 | 1.0 |
| 13.7726 | 12.93 | 97 | 2.9845 | 1.0 | 1.0 |
| 5.0373 | 14.0 | 105 | 2.9577 | 1.0 | 1.0 |
| 5.0373 | 14.93 | 112 | 2.9440 | 1.0 | 1.0 |
| 5.0373 | 16.0 | 120 | 2.9336 | 1.0 | 1.0 |
| 5.0373 | 16.93 | 127 | 2.9166 | 1.0 | 1.0 |
| 5.0373 | 18.0 | 135 | 2.9040 | 1.0 | 1.0 |
| 5.0373 | 18.93 | 142 | 2.8956 | 1.0 | 1.0 |
| 5.0373 | 20.0 | 150 | 2.8844 | 1.0 | 1.0 |
| 5.0373 | 20.93 | 157 | 2.8762 | 1.0 | 1.0 |
| 5.0373 | 22.0 | 165 | 2.8821 | 1.0 | 1.0 |
| 5.0373 | 22.93 | 172 | 2.8660 | 1.0 | 1.0 |
| 5.0373 | 24.0 | 180 | 2.8739 | 1.0 | 1.0 |
| 5.0373 | 24.93 | 187 | 2.8632 | 1.0 | 1.0 |
| 5.0373 | 26.0 | 195 | 2.8580 | 1.0 | 1.0 |
| 2.8886 | 26.93 | 202 | 2.8565 | 1.0 | 1.0 |
| 2.8886 | 28.0 | 210 | 2.8717 | 1.0 | 1.0 |
| 2.8886 | 28.93 | 217 | 2.8518 | 1.0 | 1.0 |
| 2.8886 | 30.0 | 225 | 2.8849 | 1.0 | 1.0 |
| 2.8886 | 30.93 | 232 | 2.8578 | 1.0 | 1.0 |
| 2.8886 | 32.0 | 240 | 2.8782 | 1.0 | 1.0 |
| 2.8886 | 32.93 | 247 | 2.8465 | 1.0 | 1.0 |
| 2.8886 | 34.0 | 255 | 2.8644 | 1.0 | 1.0 |
| 2.8886 | 34.93 | 262 | 2.8438 | 1.0 | 1.0 |
| 2.8886 | 36.0 | 270 | 2.8466 | 1.0 | 1.0 |
| 2.8886 | 36.93 | 277 | 2.8473 | 1.0 | 1.0 |
| 2.8886 | 38.0 | 285 | 2.8414 | 1.0 | 1.0 |
| 2.8886 | 38.93 | 292 | 2.8444 | 1.0 | 1.0 |
| 2.831 | 40.0 | 300 | 2.8455 | 1.0 | 1.0 |
| 2.831 | 40.93 | 307 | 2.8357 | 1.0 | 1.0 |
| 2.831 | 42.0 | 315 | 2.8320 | 1.0 | 1.0 |
| 2.831 | 42.93 | 322 | 2.8415 | 1.0 | 1.0 |
| 2.831 | 44.0 | 330 | 2.8347 | 1.0 | 1.0 |
| 2.831 | 44.93 | 337 | 2.8386 | 1.0 | 1.0 |
| 2.831 | 46.0 | 345 | 2.8278 | 1.0 | 1.0 |
| 2.831 | 46.93 | 352 | 2.8324 | 1.0 | 1.0 |
| 2.831 | 48.0 | 360 | 2.8290 | 1.0 | 1.0 |
| 2.831 | 48.93 | 367 | 2.8319 | 1.0 | 1.0 |
| 2.831 | 50.0 | 375 | 2.8225 | 1.0 | 1.0 |
| 2.831 | 50.93 | 382 | 2.8048 | 1.0 | 1.0 |
| 2.831 | 52.0 | 390 | 2.8062 | 1.0 | 1.0 |
| 2.831 | 52.93 | 397 | 2.7941 | 1.0 | 1.0 |
| 2.8044 | 54.0 | 405 | 2.7786 | 1.0 | 0.9996 |
| 2.8044 | 54.93 | 412 | 2.7615 | 1.0 | 0.9993 |
| 2.8044 | 56.0 | 420 | 2.7564 | 1.0 | 0.9996 |
| 2.8044 | 56.93 | 427 | 2.7243 | 1.0 | 0.9982 |
| 2.8044 | 58.0 | 435 | 2.7148 | 1.0 | 0.9975 |
| 2.8044 | 58.93 | 442 | 2.6886 | 1.0 | 0.9942 |
| 2.8044 | 60.0 | 450 | 2.6476 | 1.0 | 0.9920 |
| 2.8044 | 60.93 | 457 | 2.6304 | 1.0 | 0.9946 |
| 2.8044 | 62.0 | 465 | 2.5871 | 1.0 | 0.9931 |
| 2.8044 | 62.93 | 472 | 2.5719 | 1.0 | 0.9939 |
| 2.8044 | 64.0 | 480 | 2.5189 | 1.0 | 0.9920 |
| 2.8044 | 64.93 | 487 | 2.4960 | 0.9978 | 0.9910 |
| 2.8044 | 66.0 | 495 | 2.4538 | 1.0 | 0.9808 |
| 2.6802 | 66.93 | 502 | 2.4214 | 1.0 | 0.9693 |
| 2.6802 | 68.0 | 510 | 2.3789 | 1.0 | 0.9407 |
| 2.6802 | 68.93 | 517 | 2.3391 | 0.9978 | 0.9212 |
| 2.6802 | 70.0 | 525 | 2.2928 | 1.0 | 0.8987 |
| 2.6802 | 70.93 | 532 | 2.2408 | 1.0 | 0.8499 |
| 2.6802 | 72.0 | 540 | 2.2057 | 1.0 | 0.8463 |
| 2.6802 | 72.93 | 547 | 2.1440 | 1.0 | 0.8047 |
| 2.6802 | 74.0 | 555 | 2.1055 | 1.0 | 0.7975 |
| 2.6802 | 74.93 | 562 | 2.0576 | 1.0 | 0.7729 |
| 2.6802 | 76.0 | 570 | 2.0157 | 1.0 | 0.7631 |
| 2.6802 | 76.93 | 577 | 1.9685 | 1.0 | 0.7418 |
| 2.6802 | 78.0 | 585 | 1.9267 | 1.0 | 0.7197 |
| 2.6802 | 78.93 | 592 | 1.8942 | 1.0 | 0.7114 |
| 2.3153 | 80.0 | 600 | 1.8437 | 1.0 | 0.6593 |
| 2.3153 | 80.93 | 607 | 1.8056 | 1.0 | 0.6159 |
| 2.3153 | 82.0 | 615 | 1.7832 | 1.0 | 0.6221 |
| 2.3153 | 82.93 | 622 | 1.7551 | 0.9978 | 0.5917 |
| 2.3153 | 84.0 | 630 | 1.7235 | 0.9956 | 0.5548 |
| 2.3153 | 84.93 | 637 | 1.7026 | 0.9956 | 0.5476 |
| 2.3153 | 86.0 | 645 | 1.6728 | 0.9934 | 0.5107 |
| 2.3153 | 86.93 | 652 | 1.6532 | 0.9934 | 0.4904 |
| 2.3153 | 88.0 | 660 | 1.6387 | 0.9934 | 0.4828 |
| 2.3153 | 88.93 | 667 | 1.6284 | 0.9934 | 0.4763 |
| 2.3153 | 90.0 | 675 | 1.6146 | 0.9934 | 0.4615 |
| 2.3153 | 90.93 | 682 | 1.6049 | 0.9934 | 0.4514 |
| 2.3153 | 92.0 | 690 | 1.5985 | 0.9934 | 0.4470 |
| 2.3153 | 92.93 | 697 | 1.5960 | 0.9934 | 0.4459 |
| 2.0162 | 93.33 | 700 | 1.5957 | 0.9934 | 0.4456 |
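
The Wer and Cer columns above are word and character error rates on the validation set. As a minimal sketch (not the card's actual evaluation code), these metrics can be computed with the Hugging Face `evaluate` library; the prediction and reference strings below are hypothetical placeholders.

```python
# Minimal sketch: computing WER and CER with the `evaluate` library (requires jiwer).
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["o menino corre no parque"]   # hypothetical model output
references = ["o menino correu no parque"]   # hypothetical ground-truth transcript

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```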

### Framework versions