# wav2vec2-large-xlsr-mec-ita-coraa-portuguese-all-08
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1361
- Wer: 0.0908
- Cer: 0.0262
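For readers unfamiliar with the metrics: Wer (word error rate) and Cer (character error rate) are edit-distance-based scores, so 0.0908 means roughly 9 words in 100 are wrong. A minimal pure-Python sketch of how they are computed (the Trainer likely used a library such as `jiwer` or `evaluate`, which may normalize text differently):

```python
def edit_distance(a, b):
    """Levenshtein distance between two sequences via dynamic programming."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference word count."""
    ref = reference.split()
    return edit_distance(ref, hypothesis.split()) / len(ref)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)

# One substituted word out of three -> WER of 1/3.
print(wer("o gato preto", "o gato prato"))  # 0.333...
```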
## Model description
More information needed
## Intended uses & limitations
More information needed
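The card does not include a usage snippet, but as a wav2vec2 CTC model its output logits are typically decoded by taking the argmax per frame, collapsing repeats, and dropping blanks (in practice `Wav2Vec2Processor.batch_decode` from `transformers` does this). A minimal illustrative sketch of that greedy CTC decoding step, using a made-up toy vocabulary:

```python
def ctc_greedy_decode(ids, id_to_char, blank_id=0):
    """Greedy CTC decoding: collapse consecutive repeats, then drop blanks."""
    out = []
    prev = None
    for i in ids:
        # Emit a character only when the id changes and is not the blank token.
        if i != prev and i != blank_id:
            out.append(id_to_char[i])
        prev = i
    return "".join(out)

# Toy vocabulary (hypothetical, for illustration only).
vocab = {1: "c", 2: "a", 3: "s"}
print(ctc_greedy_decode([1, 1, 0, 2, 2, 0, 0, 3, 3], vocab))  # "cas"
```

Note that a repeated character separated by a blank is kept (e.g. `[1, 0, 1]` decodes to `"cc"`), which is exactly why CTC needs the blank token.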
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
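The effective batch size and learning-rate trajectory follow directly from these settings. A small sketch of the arithmetic (the total step count of 8600 comes from the 86 optimizer steps per epoch visible in the results table times `num_epochs`; no warmup steps are listed, so none are assumed):

```python
train_batch_size = 16
gradient_accumulation_steps = 2
# Gradients from 2 micro-batches of 16 are accumulated per optimizer step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 32

def linear_lr(step, total_steps, base_lr=3e-05):
    """Linear schedule with no warmup: decays base_lr to 0 over total_steps."""
    return base_lr * max(0.0, 1 - step / total_steps)

total_steps = 86 * 100  # 86 optimizer steps/epoch x 100 planned epochs = 8600
print(linear_lr(0, total_steps))     # 3e-05 at the start
print(linear_lr(4300, total_steps))  # 1.5e-05 at the halfway point
```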
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
30.8333 | 1.0 | 86 | 5.9290 | 0.9691 | 0.9848 |
10.3431 | 2.0 | 172 | 4.4413 | 0.9597 | 0.9416 |
5.5005 | 3.0 | 258 | 4.1216 | 0.9629 | 0.9686 |
5.1445 | 4.0 | 344 | 3.9034 | 0.9775 | 0.9794 |
4.7755 | 5.0 | 430 | 3.8820 | 0.9753 | 0.9808 |
4.6091 | 6.0 | 516 | 3.4899 | 0.9794 | 0.9789 |
3.4089 | 7.0 | 602 | 2.9060 | 1.0 | 1.0 |
3.4089 | 8.0 | 688 | 2.8936 | 1.0 | 1.0 |
2.9149 | 9.0 | 774 | 2.8915 | 1.0 | 1.0 |
2.9135 | 10.0 | 860 | 2.8677 | 1.0 | 1.0 |
2.8889 | 11.0 | 946 | 2.8339 | 0.9969 | 0.9609 |
2.8245 | 12.0 | 1032 | 2.5122 | 0.9998 | 0.8967 |
2.5211 | 13.0 | 1118 | 1.4597 | 0.9662 | 0.4097 |
1.4387 | 14.0 | 1204 | 0.5385 | 0.3741 | 0.0873 |
1.4387 | 15.0 | 1290 | 0.3532 | 0.2106 | 0.0554 |
0.771 | 16.0 | 1376 | 0.2908 | 0.1836 | 0.0486 |
0.5996 | 17.0 | 1462 | 0.2460 | 0.1584 | 0.0423 |
0.4877 | 18.0 | 1548 | 0.2211 | 0.1486 | 0.0392 |
0.4156 | 19.0 | 1634 | 0.2024 | 0.1253 | 0.0352 |
0.4011 | 20.0 | 1720 | 0.1971 | 0.1347 | 0.0364 |
0.3624 | 21.0 | 1806 | 0.1899 | 0.1239 | 0.0346 |
0.3624 | 22.0 | 1892 | 0.1771 | 0.1241 | 0.0334 |
0.3319 | 23.0 | 1978 | 0.1672 | 0.1141 | 0.0320 |
0.3036 | 24.0 | 2064 | 0.1629 | 0.1172 | 0.0320 |
0.3024 | 25.0 | 2150 | 0.1572 | 0.1066 | 0.0299 |
0.2631 | 26.0 | 2236 | 0.1538 | 0.1090 | 0.0297 |
0.2668 | 27.0 | 2322 | 0.1499 | 0.1066 | 0.0297 |
0.2563 | 28.0 | 2408 | 0.1495 | 0.1018 | 0.0288 |
0.2563 | 29.0 | 2494 | 0.1501 | 0.1069 | 0.0300 |
0.2468 | 30.0 | 2580 | 0.1473 | 0.1071 | 0.0295 |
0.2462 | 31.0 | 2666 | 0.1433 | 0.1009 | 0.0280 |
0.2211 | 32.0 | 2752 | 0.1455 | 0.1026 | 0.0288 |
0.2256 | 33.0 | 2838 | 0.1449 | 0.1021 | 0.0289 |
0.2392 | 34.0 | 2924 | 0.1454 | 0.1040 | 0.0291 |
0.217 | 35.0 | 3010 | 0.1468 | 0.1011 | 0.0288 |
0.217 | 36.0 | 3096 | 0.1450 | 0.1059 | 0.0295 |
0.2129 | 37.0 | 3182 | 0.1419 | 0.1006 | 0.0286 |
0.1954 | 38.0 | 3268 | 0.1435 | 0.1002 | 0.0283 |
0.2093 | 39.0 | 3354 | 0.1439 | 0.0973 | 0.0283 |
0.1917 | 40.0 | 3440 | 0.1395 | 0.0959 | 0.0276 |
0.1878 | 41.0 | 3526 | 0.1379 | 0.0966 | 0.0269 |
0.1828 | 42.0 | 3612 | 0.1391 | 0.0906 | 0.0267 |
0.1828 | 43.0 | 3698 | 0.1378 | 0.0947 | 0.0268 |
0.1883 | 44.0 | 3784 | 0.1378 | 0.0925 | 0.0261 |
0.1627 | 45.0 | 3870 | 0.1446 | 0.0939 | 0.0272 |
0.1808 | 46.0 | 3956 | 0.1400 | 0.0942 | 0.0268 |
0.1759 | 47.0 | 4042 | 0.1389 | 0.0923 | 0.0263 |
0.165 | 48.0 | 4128 | 0.1361 | 0.0908 | 0.0262 |
0.1672 | 49.0 | 4214 | 0.1413 | 0.0932 | 0.0267 |
0.1953 | 50.0 | 4300 | 0.1409 | 0.0911 | 0.0263 |
0.1953 | 51.0 | 4386 | 0.1413 | 0.0959 | 0.0263 |
0.1733 | 52.0 | 4472 | 0.1416 | 0.0913 | 0.0263 |
0.1542 | 53.0 | 4558 | 0.1403 | 0.0906 | 0.0262 |
0.1606 | 54.0 | 4644 | 0.1404 | 0.0915 | 0.0265 |
0.1565 | 55.0 | 4730 | 0.1377 | 0.0944 | 0.0264 |
0.1472 | 56.0 | 4816 | 0.1401 | 0.0951 | 0.0266 |
0.1619 | 57.0 | 4902 | 0.1392 | 0.0918 | 0.0256 |
0.1619 | 58.0 | 4988 | 0.1381 | 0.0908 | 0.0257 |
0.1526 | 59.0 | 5074 | 0.1421 | 0.0923 | 0.0263 |
0.1557 | 60.0 | 5160 | 0.1398 | 0.0887 | 0.0257 |
0.1519 | 61.0 | 5246 | 0.1401 | 0.0891 | 0.0255 |
0.1475 | 62.0 | 5332 | 0.1414 | 0.0903 | 0.0255 |
0.1388 | 63.0 | 5418 | 0.1384 | 0.0896 | 0.0257 |
0.147 | 64.0 | 5504 | 0.1385 | 0.0939 | 0.0262 |
0.147 | 65.0 | 5590 | 0.1394 | 0.0913 | 0.0264 |
0.1371 | 66.0 | 5676 | 0.1412 | 0.0906 | 0.0262 |
0.1428 | 67.0 | 5762 | 0.1405 | 0.0899 | 0.0261 |
0.1344 | 68.0 | 5848 | 0.1413 | 0.0925 | 0.0264 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3