# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-clean-08
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1390
- Wer: 0.0887
- Cer: 0.0236
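
The snippet below is a minimal inference sketch, not part of the original card: it assumes the checkpoint is published on the Hugging Face Hub under a repo id matching this card's title (adjust `model_id` to the actual path) and that the input audio is 16 kHz mono, as XLSR wav2vec2 models expect.

```python
# Minimal inference sketch; the repo id and audio file path are assumptions.
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "wav2vec2-large-xlsr-mecita-coraa-portuguese-all-clean-08"  # assumed repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load a waveform and resample to the 16 kHz rate the model was trained on.
waveform, sample_rate = torchaudio.load("example.wav")  # placeholder file
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```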
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
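
As a rough guide, the hyperparameters above map onto `TrainingArguments` as sketched below (Transformers 4.28 argument names). `output_dir` and the evaluation cadence are placeholders not stated in this card; per-epoch evaluation is assumed from the results table.

```python
# Sketch only: reproduces the listed hyperparameters, with assumptions marked.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-finetune",   # placeholder, not stated in the card
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,      # effective total train batch size: 32
    num_train_epochs=100,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                          # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",        # assumption: per-epoch eval, matching the table below
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```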
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:---:|:---:|:---:|:---:|:---:|:---:|
24.6865 | 1.0 | 67 | 3.4264 | 1.0 | 1.0 |
7.5805 | 2.0 | 134 | 2.9809 | 1.0 | 1.0 |
3.0234 | 3.0 | 201 | 2.9417 | 1.0 | 1.0 |
3.0234 | 4.0 | 268 | 2.8992 | 1.0 | 1.0 |
2.9257 | 5.0 | 335 | 2.8742 | 1.0 | 1.0 |
2.8778 | 6.0 | 402 | 2.7083 | 1.0 | 1.0 |
2.8778 | 7.0 | 469 | 1.3587 | 1.0 | 0.3296 |
2.1565 | 8.0 | 536 | 0.5632 | 0.2596 | 0.0630 |
0.937 | 9.0 | 603 | 0.4067 | 0.1967 | 0.0506 |
0.937 | 10.0 | 670 | 0.3297 | 0.1791 | 0.0460 |
0.602 | 11.0 | 737 | 0.2770 | 0.1618 | 0.0412 |
0.4956 | 12.0 | 804 | 0.2457 | 0.1496 | 0.0384 |
0.4956 | 13.0 | 871 | 0.2290 | 0.1369 | 0.0356 |
0.4144 | 14.0 | 938 | 0.2095 | 0.1273 | 0.0336 |
0.3712 | 15.0 | 1005 | 0.2069 | 0.1210 | 0.0320 |
0.3712 | 16.0 | 1072 | 0.1987 | 0.1163 | 0.0305 |
0.3271 | 17.0 | 1139 | 0.1910 | 0.1140 | 0.0303 |
0.3092 | 18.0 | 1206 | 0.1860 | 0.1103 | 0.0289 |
0.3092 | 19.0 | 1273 | 0.1748 | 0.1024 | 0.0281 |
0.2977 | 20.0 | 1340 | 0.1702 | 0.1034 | 0.0279 |
0.2865 | 21.0 | 1407 | 0.1631 | 0.1024 | 0.0274 |
0.2865 | 22.0 | 1474 | 0.1664 | 0.1004 | 0.0269 |
0.2463 | 23.0 | 1541 | 0.1617 | 0.1014 | 0.0276 |
0.2447 | 24.0 | 1608 | 0.1594 | 0.0967 | 0.0264 |
0.2447 | 25.0 | 1675 | 0.1516 | 0.0984 | 0.0262 |
0.2323 | 26.0 | 1742 | 0.1555 | 0.0974 | 0.0264 |
0.2229 | 27.0 | 1809 | 0.1487 | 0.0964 | 0.0261 |
0.2229 | 28.0 | 1876 | 0.1499 | 0.0967 | 0.0263 |
0.2015 | 29.0 | 1943 | 0.1474 | 0.1000 | 0.0270 |
0.1986 | 30.0 | 2010 | 0.1483 | 0.0917 | 0.0256 |
0.1986 | 31.0 | 2077 | 0.1473 | 0.0957 | 0.0259 |
0.1958 | 32.0 | 2144 | 0.1469 | 0.0954 | 0.0260 |
0.2034 | 33.0 | 2211 | 0.1475 | 0.0927 | 0.0251 |
0.2034 | 34.0 | 2278 | 0.1490 | 0.0931 | 0.0256 |
0.1947 | 35.0 | 2345 | 0.1483 | 0.0917 | 0.0249 |
0.1961 | 36.0 | 2412 | 0.1436 | 0.0924 | 0.0249 |
0.1961 | 37.0 | 2479 | 0.1479 | 0.0924 | 0.0252 |
0.181 | 38.0 | 2546 | 0.1442 | 0.0964 | 0.0253 |
0.1732 | 39.0 | 2613 | 0.1456 | 0.0937 | 0.0249 |
0.1732 | 40.0 | 2680 | 0.1427 | 0.0914 | 0.0249 |
0.174 | 41.0 | 2747 | 0.1448 | 0.0944 | 0.0250 |
0.1694 | 42.0 | 2814 | 0.1433 | 0.0911 | 0.0241 |
0.1694 | 43.0 | 2881 | 0.1421 | 0.0877 | 0.0233 |
0.1469 | 44.0 | 2948 | 0.1432 | 0.0907 | 0.0241 |
0.1665 | 45.0 | 3015 | 0.1410 | 0.0964 | 0.0250 |
0.1665 | 46.0 | 3082 | 0.1423 | 0.0924 | 0.0251 |
0.1662 | 47.0 | 3149 | 0.1422 | 0.0897 | 0.0239 |
0.154 | 48.0 | 3216 | 0.1408 | 0.0931 | 0.0245 |
0.154 | 49.0 | 3283 | 0.1451 | 0.0904 | 0.0242 |
0.1492 | 50.0 | 3350 | 0.1395 | 0.0911 | 0.0242 |
0.1532 | 51.0 | 3417 | 0.1390 | 0.0887 | 0.0236 |
0.1532 | 52.0 | 3484 | 0.1442 | 0.0941 | 0.0243 |
0.1442 | 53.0 | 3551 | 0.1429 | 0.0941 | 0.0241 |
0.1608 | 54.0 | 3618 | 0.1396 | 0.0897 | 0.0239 |
0.1608 | 55.0 | 3685 | 0.1427 | 0.0917 | 0.0240 |
0.1532 | 56.0 | 3752 | 0.1455 | 0.0897 | 0.0238 |
0.1376 | 57.0 | 3819 | 0.1425 | 0.0901 | 0.0239 |
0.1376 | 58.0 | 3886 | 0.1466 | 0.0891 | 0.0240 |
0.1327 | 59.0 | 3953 | 0.1422 | 0.0844 | 0.0232 |
0.1435 | 60.0 | 4020 | 0.1454 | 0.0901 | 0.0236 |
0.1435 | 61.0 | 4087 | 0.1399 | 0.0867 | 0.0234 |
0.1344 | 62.0 | 4154 | 0.1396 | 0.0874 | 0.0235 |
0.1382 | 63.0 | 4221 | 0.1403 | 0.0867 | 0.0232 |
0.1382 | 64.0 | 4288 | 0.1443 | 0.0871 | 0.0231 |
0.1303 | 65.0 | 4355 | 0.1413 | 0.0841 | 0.0225 |
0.1369 | 66.0 | 4422 | 0.1449 | 0.0877 | 0.0233 |
0.1369 | 67.0 | 4489 | 0.1444 | 0.0884 | 0.0231 |
0.1352 | 68.0 | 4556 | 0.1416 | 0.0887 | 0.0236 |
0.1493 | 69.0 | 4623 | 0.1437 | 0.0907 | 0.0247 |
0.1493 | 70.0 | 4690 | 0.1494 | 0.0907 | 0.0239 |
0.1186 | 71.0 | 4757 | 0.1460 | 0.0897 | 0.0239 |
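
For reference, the Wer and Cer columns above can be reproduced with the Hugging Face `evaluate` library as sketched below; the prediction and reference strings are illustrative placeholders, not samples from the evaluation set.

```python
# Illustrative WER / CER computation (requires the `evaluate` and `jiwer` packages).
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["a menina leu o livro"]        # placeholder model outputs
references = ["a menina leu o livro novo"]    # placeholder ground-truth transcripts

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```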
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3