# wav2vec2-large-xlsr-mecita-coraa-portuguese-clean-grade-2-5
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1823
- Wer: 0.1105
- Cer: 0.0287
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
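The settings above correspond roughly to the following `TrainingArguments` (a sketch, not the author's actual script; `output_dir` is a placeholder, and the effective train batch size of 32 comes from 16 × 2 gradient accumulation):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="wav2vec2-mecita-grade-2-5",  # placeholder, not from the card
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # fp16=True,  # "Native AMP" mixed precision; requires a CUDA device
)
```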
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
27.7046 | 1.0 | 46 | 9.6857 | 1.0 | 0.9851 |
27.7046 | 2.0 | 92 | 8.2403 | 1.0 | 0.9800 |
12.8806 | 3.0 | 138 | 6.2490 | 1.0 | 0.9708 |
12.8806 | 4.0 | 184 | 3.1828 | 1.0 | 1.0 |
5.4641 | 5.0 | 230 | 2.9988 | 1.0 | 1.0 |
5.4641 | 6.0 | 276 | 2.9271 | 1.0 | 1.0 |
3.0003 | 7.0 | 322 | 2.9540 | 1.0 | 1.0 |
3.0003 | 8.0 | 368 | 2.9154 | 1.0 | 1.0 |
2.9292 | 9.0 | 414 | 2.9292 | 1.0 | 1.0 |
2.9292 | 10.0 | 460 | 2.8802 | 1.0 | 1.0 |
2.9025 | 11.0 | 506 | 2.8780 | 1.0 | 1.0 |
2.9025 | 12.0 | 552 | 2.8736 | 1.0 | 1.0 |
2.9025 | 13.0 | 598 | 2.8590 | 1.0 | 1.0 |
2.8855 | 14.0 | 644 | 2.8483 | 1.0 | 1.0 |
2.8855 | 15.0 | 690 | 2.6581 | 1.0 | 0.9989 |
2.8255 | 16.0 | 736 | 2.0909 | 1.0 | 0.8260 |
2.8255 | 17.0 | 782 | 1.1047 | 0.9856 | 0.2978 |
2.0443 | 18.0 | 828 | 0.6230 | 0.3884 | 0.0922 |
2.0443 | 19.0 | 874 | 0.4640 | 0.2273 | 0.0567 |
0.8938 | 20.0 | 920 | 0.3909 | 0.1956 | 0.0494 |
0.8938 | 21.0 | 966 | 0.3508 | 0.1755 | 0.0464 |
0.593 | 22.0 | 1012 | 0.3071 | 0.1726 | 0.0448 |
0.593 | 23.0 | 1058 | 0.3026 | 0.1674 | 0.0446 |
0.455 | 24.0 | 1104 | 0.2988 | 0.1542 | 0.0414 |
0.455 | 25.0 | 1150 | 0.2649 | 0.1484 | 0.0391 |
0.455 | 26.0 | 1196 | 0.2502 | 0.1467 | 0.0377 |
0.4093 | 27.0 | 1242 | 0.2554 | 0.1392 | 0.0369 |
0.4093 | 28.0 | 1288 | 0.2445 | 0.1381 | 0.0364 |
0.3667 | 29.0 | 1334 | 0.2461 | 0.1300 | 0.0354 |
0.3667 | 30.0 | 1380 | 0.2342 | 0.1231 | 0.0339 |
0.3219 | 31.0 | 1426 | 0.2284 | 0.1168 | 0.0328 |
0.3219 | 32.0 | 1472 | 0.2242 | 0.1203 | 0.0332 |
0.2755 | 33.0 | 1518 | 0.2212 | 0.1168 | 0.0326 |
0.2755 | 34.0 | 1564 | 0.2152 | 0.1197 | 0.0329 |
0.2829 | 35.0 | 1610 | 0.2135 | 0.1174 | 0.0322 |
0.2829 | 36.0 | 1656 | 0.2068 | 0.1151 | 0.0315 |
0.2585 | 37.0 | 1702 | 0.2080 | 0.1157 | 0.0313 |
0.2585 | 38.0 | 1748 | 0.2103 | 0.1151 | 0.0318 |
0.2585 | 39.0 | 1794 | 0.2065 | 0.1174 | 0.0315 |
0.2487 | 40.0 | 1840 | 0.2056 | 0.1157 | 0.0309 |
0.2487 | 41.0 | 1886 | 0.1991 | 0.1139 | 0.0308 |
0.2321 | 42.0 | 1932 | 0.1991 | 0.1116 | 0.0308 |
0.2321 | 43.0 | 1978 | 0.1970 | 0.1122 | 0.0303 |
0.2295 | 44.0 | 2024 | 0.1986 | 0.1087 | 0.0298 |
0.2295 | 45.0 | 2070 | 0.2024 | 0.1087 | 0.0297 |
0.205 | 46.0 | 2116 | 0.1994 | 0.1133 | 0.0303 |
0.205 | 47.0 | 2162 | 0.1964 | 0.1162 | 0.0305 |
0.2136 | 48.0 | 2208 | 0.1991 | 0.1110 | 0.0304 |
0.2136 | 49.0 | 2254 | 0.1933 | 0.1145 | 0.0301 |
0.1959 | 50.0 | 2300 | 0.1877 | 0.1116 | 0.0291 |
0.1959 | 51.0 | 2346 | 0.1907 | 0.1116 | 0.0310 |
0.1959 | 52.0 | 2392 | 0.1946 | 0.1157 | 0.0309 |
0.2001 | 53.0 | 2438 | 0.1957 | 0.1110 | 0.0297 |
0.2001 | 54.0 | 2484 | 0.1937 | 0.1105 | 0.0293 |
0.1884 | 55.0 | 2530 | 0.1892 | 0.1076 | 0.0293 |
0.1884 | 56.0 | 2576 | 0.1851 | 0.1116 | 0.0301 |
0.1914 | 57.0 | 2622 | 0.1823 | 0.1105 | 0.0287 |
0.1914 | 58.0 | 2668 | 0.1865 | 0.1082 | 0.0282 |
0.1668 | 59.0 | 2714 | 0.1825 | 0.1059 | 0.0283 |
0.1668 | 60.0 | 2760 | 0.1858 | 0.1053 | 0.0276 |
0.1825 | 61.0 | 2806 | 0.1860 | 0.1151 | 0.0292 |
0.1825 | 62.0 | 2852 | 0.1881 | 0.1093 | 0.0287 |
0.1825 | 63.0 | 2898 | 0.1885 | 0.1133 | 0.0306 |
0.1685 | 64.0 | 2944 | 0.1892 | 0.1128 | 0.0303 |
0.1685 | 65.0 | 2990 | 0.1900 | 0.1105 | 0.0304 |
0.166 | 66.0 | 3036 | 0.1883 | 0.1099 | 0.0296 |
0.166 | 67.0 | 3082 | 0.1895 | 0.1093 | 0.0301 |
0.1677 | 68.0 | 3128 | 0.1929 | 0.1070 | 0.0298 |
0.1677 | 69.0 | 3174 | 0.1947 | 0.1093 | 0.0298 |
0.1601 | 70.0 | 3220 | 0.1903 | 0.1053 | 0.0294 |
0.1601 | 71.0 | 3266 | 0.1914 | 0.1053 | 0.0288 |
0.1705 | 72.0 | 3312 | 0.1916 | 0.1099 | 0.0298 |
0.1705 | 73.0 | 3358 | 0.1940 | 0.1110 | 0.0302 |
0.1593 | 74.0 | 3404 | 0.1944 | 0.1099 | 0.0302 |
0.1593 | 75.0 | 3450 | 0.1916 | 0.1122 | 0.0303 |
0.1593 | 76.0 | 3496 | 0.1907 | 0.1122 | 0.0298 |
0.1479 | 77.0 | 3542 | 0.1907 | 0.1105 | 0.0294 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3