# wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-4
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.0559
- Wer: 0.8041
- Cer: 0.1817
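Wer and Cer above are the word error rate and character error rate: edit distance between the reference and the model transcript, normalized by reference length, over words and characters respectively. A minimal pure-Python sketch of the computation (for illustration only; this is not the exact evaluation code used for this card):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (dynamic programming)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (r != h)))  # substitution
        prev = cur
    return prev[-1]

def wer(reference, hypothesis):
    """Word error rate: word-level edit distance / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate: char-level edit distance / reference length."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

Note that both metrics can exceed 1.0 when the hypothesis is much longer than the reference, which is why the early epochs in the table below sit at exactly 1.0 (empty or degenerate transcripts).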
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
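The `total_train_batch_size` above is derived rather than set directly: it is the per-device batch size multiplied by the gradient accumulation steps. A quick check of the arithmetic:

```python
# Derived effective batch size, from the hyperparameters listed above.
train_batch_size = 16
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps
# One optimizer step is taken every `gradient_accumulation_steps` forward
# passes, so gradients are effectively averaged over 32 examples.
```

This is why the Step column in the results table advances roughly half as fast as the number of batches seen.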
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
13.5422 | 0.92 | 6 | 11.6661 | 0.9971 | 0.9571 |
13.5422 | 2.0 | 13 | 9.0694 | 1.0 | 1.0 |
13.5422 | 2.92 | 19 | 7.1705 | 1.0 | 1.0 |
13.5422 | 4.0 | 26 | 5.0906 | 1.0 | 1.0 |
13.5422 | 4.92 | 32 | 3.9518 | 1.0 | 1.0 |
13.5422 | 6.0 | 39 | 3.5443 | 1.0 | 1.0 |
13.5422 | 6.92 | 45 | 3.4079 | 1.0 | 1.0 |
13.5422 | 8.0 | 52 | 3.2689 | 1.0 | 1.0 |
13.5422 | 8.92 | 58 | 3.1613 | 1.0 | 1.0 |
13.5422 | 10.0 | 65 | 3.0990 | 1.0 | 1.0 |
13.5422 | 10.92 | 71 | 3.0482 | 1.0 | 1.0 |
13.5422 | 12.0 | 78 | 2.9975 | 1.0 | 1.0 |
13.5422 | 12.92 | 84 | 2.9798 | 1.0 | 1.0 |
13.5422 | 14.0 | 91 | 2.9568 | 1.0 | 1.0 |
13.5422 | 14.92 | 97 | 2.9349 | 1.0 | 1.0 |
5.0252 | 16.0 | 104 | 2.9166 | 1.0 | 1.0 |
5.0252 | 16.92 | 110 | 2.8980 | 1.0 | 1.0 |
5.0252 | 18.0 | 117 | 2.8853 | 1.0 | 1.0 |
5.0252 | 18.92 | 123 | 2.8815 | 1.0 | 1.0 |
5.0252 | 20.0 | 130 | 2.8608 | 1.0 | 1.0 |
5.0252 | 20.92 | 136 | 2.8515 | 1.0 | 1.0 |
5.0252 | 22.0 | 143 | 2.8384 | 1.0 | 1.0 |
5.0252 | 22.92 | 149 | 2.8432 | 1.0 | 1.0 |
5.0252 | 24.0 | 156 | 2.8300 | 1.0 | 1.0 |
5.0252 | 24.92 | 162 | 2.8256 | 1.0 | 1.0 |
5.0252 | 26.0 | 169 | 2.8224 | 1.0 | 1.0 |
5.0252 | 26.92 | 175 | 2.8333 | 1.0 | 1.0 |
5.0252 | 28.0 | 182 | 2.8154 | 1.0 | 1.0 |
5.0252 | 28.92 | 188 | 2.8164 | 1.0 | 1.0 |
5.0252 | 30.0 | 195 | 2.8178 | 1.0 | 1.0 |
2.8912 | 30.92 | 201 | 2.8093 | 1.0 | 1.0 |
2.8912 | 32.0 | 208 | 2.8101 | 1.0 | 1.0 |
2.8912 | 32.92 | 214 | 2.8043 | 1.0 | 1.0 |
2.8912 | 34.0 | 221 | 2.8058 | 1.0 | 1.0 |
2.8912 | 34.92 | 227 | 2.8008 | 1.0 | 1.0 |
2.8912 | 36.0 | 234 | 2.7968 | 1.0 | 1.0 |
2.8912 | 36.92 | 240 | 2.8047 | 1.0 | 1.0 |
2.8912 | 38.0 | 247 | 2.8005 | 1.0 | 1.0 |
2.8912 | 38.92 | 253 | 2.7978 | 1.0 | 1.0 |
2.8912 | 40.0 | 260 | 2.8056 | 1.0 | 1.0 |
2.8912 | 40.92 | 266 | 2.7929 | 1.0 | 1.0 |
2.8912 | 42.0 | 273 | 2.7819 | 1.0 | 1.0 |
2.8912 | 42.92 | 279 | 2.7817 | 1.0 | 1.0 |
2.8912 | 44.0 | 286 | 2.7840 | 1.0 | 1.0 |
2.8912 | 44.92 | 292 | 2.7610 | 1.0 | 1.0 |
2.8912 | 46.0 | 299 | 2.7490 | 1.0 | 1.0 |
2.8224 | 46.92 | 305 | 2.7385 | 1.0 | 1.0 |
2.8224 | 48.0 | 312 | 2.7082 | 1.0 | 1.0 |
2.8224 | 48.92 | 318 | 2.7051 | 1.0 | 1.0 |
2.8224 | 50.0 | 325 | 2.6650 | 1.0 | 1.0 |
2.8224 | 50.92 | 331 | 2.6570 | 1.0 | 1.0 |
2.8224 | 52.0 | 338 | 2.6118 | 1.0 | 1.0 |
2.8224 | 52.92 | 344 | 2.5891 | 1.0 | 1.0 |
2.8224 | 54.0 | 351 | 2.5418 | 1.0 | 1.0 |
2.8224 | 54.92 | 357 | 2.5060 | 1.0 | 1.0 |
2.8224 | 56.0 | 364 | 2.4531 | 1.0 | 1.0 |
2.8224 | 56.92 | 370 | 2.4133 | 1.0 | 1.0 |
2.8224 | 58.0 | 377 | 2.3590 | 1.0 | 1.0 |
2.8224 | 58.92 | 383 | 2.3121 | 0.9971 | 0.9976 |
2.8224 | 60.0 | 390 | 2.2566 | 0.9971 | 0.9908 |
2.8224 | 60.92 | 396 | 2.1933 | 0.9971 | 0.9590 |
2.6427 | 62.0 | 403 | 2.1499 | 0.9971 | 0.9634 |
2.6427 | 62.92 | 409 | 2.0772 | 1.0 | 0.8969 |
2.6427 | 64.0 | 416 | 2.0133 | 1.0 | 0.8173 |
2.6427 | 64.92 | 422 | 1.9718 | 1.0 | 0.7889 |
2.6427 | 66.0 | 429 | 1.9048 | 1.0 | 0.6973 |
2.6427 | 66.92 | 435 | 1.8351 | 1.0 | 0.6024 |
2.6427 | 68.0 | 442 | 1.7699 | 1.0 | 0.5928 |
2.6427 | 68.92 | 448 | 1.6969 | 1.0 | 0.5359 |
2.6427 | 70.0 | 455 | 1.6229 | 0.9971 | 0.4853 |
2.6427 | 70.92 | 461 | 1.5610 | 0.9971 | 0.4506 |
2.6427 | 72.0 | 468 | 1.4963 | 0.9971 | 0.3981 |
2.6427 | 72.92 | 474 | 1.4463 | 0.9942 | 0.3643 |
2.6427 | 74.0 | 481 | 1.4096 | 0.9942 | 0.3533 |
2.6427 | 74.92 | 487 | 1.3642 | 0.9854 | 0.3157 |
2.6427 | 76.0 | 494 | 1.3182 | 0.9795 | 0.2887 |
2.0683 | 76.92 | 500 | 1.2844 | 0.9766 | 0.2810 |
2.0683 | 78.0 | 507 | 1.2517 | 0.9678 | 0.2670 |
2.0683 | 78.92 | 513 | 1.2261 | 0.9620 | 0.2540 |
2.0683 | 80.0 | 520 | 1.2069 | 0.9649 | 0.2617 |
2.0683 | 80.92 | 526 | 1.1887 | 0.9620 | 0.2540 |
2.0683 | 82.0 | 533 | 1.1584 | 0.9327 | 0.2284 |
2.0683 | 82.92 | 539 | 1.1375 | 0.8947 | 0.2067 |
2.0683 | 84.0 | 546 | 1.1175 | 0.8713 | 0.2029 |
2.0683 | 84.92 | 552 | 1.1044 | 0.8655 | 0.2014 |
2.0683 | 86.0 | 559 | 1.0931 | 0.8567 | 0.1990 |
2.0683 | 86.92 | 565 | 1.0834 | 0.8480 | 0.1913 |
2.0683 | 88.0 | 572 | 1.0744 | 0.8275 | 0.1875 |
2.0683 | 88.92 | 578 | 1.0676 | 0.8246 | 0.1860 |
2.0683 | 90.0 | 585 | 1.0619 | 0.8129 | 0.1846 |
2.0683 | 90.92 | 591 | 1.0587 | 0.8129 | 0.1822 |
2.0683 | 92.0 | 598 | 1.0562 | 0.8041 | 0.1822 |
1.6124 | 92.31 | 600 | 1.0559 | 0.8041 | 0.1817 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3