# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2-4
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2083
- Wer: 0.0980
- Cer: 0.0308
## Model description
More information needed
## Intended uses & limitations
More information needed
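Pending further detail from the authors, the sketch below shows one plausible way to run inference with this checkpoint, assuming it is published on the Hub under the repository id in this card's title (that id, and the helper name `transcribe`, are illustrative assumptions) and that `transformers` and `torch` are installed. Input audio is expected as a mono 16 kHz waveform, matching the XLSR base model.

```python
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Assumed repository id, taken from this model card's title.
MODEL_ID = "wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2-4"


def transcribe(waveform: torch.Tensor, sampling_rate: int = 16_000) -> str:
    """Greedy CTC decoding of a mono waveform into text."""
    processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
    model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
    inputs = processor(waveform, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    pred_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(pred_ids)[0]
```

For best results, resample input recordings to 16 kHz (e.g. with `torchaudio.functional.resample`) before calling the helper.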
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
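The hyperparameters above correspond roughly to the following `transformers.TrainingArguments` configuration. This is a sketch, not the authors' actual script: `output_dir` is a placeholder, and `fp16` is gated on CUDA availability here, whereas the original run used native AMP on GPU.

```python
import torch
from transformers import TrainingArguments

# Sketch of the reported training configuration; output_dir is a placeholder.
args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2-4",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=torch.cuda.is_available(),  # the original run used native AMP
)
```

The optimizer (Adam with betas=(0.9, 0.999), epsilon=1e-8) matches the `Trainer` default, so it needs no explicit argument.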
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
36.4021 | 0.99 | 47 | 4.1689 | 1.0 | 1.0 |
36.4021 | 2.0 | 95 | 3.3341 | 1.0 | 1.0 |
9.7815 | 2.99 | 142 | 3.0767 | 1.0 | 1.0 |
9.7815 | 4.0 | 190 | 2.9778 | 1.0 | 1.0 |
3.0911 | 4.99 | 237 | 2.9770 | 1.0 | 1.0 |
3.0911 | 6.0 | 285 | 2.9323 | 1.0 | 1.0 |
2.9632 | 6.99 | 332 | 2.9243 | 1.0 | 1.0 |
2.9632 | 8.0 | 380 | 2.9289 | 1.0 | 1.0 |
2.9215 | 8.99 | 427 | 2.9401 | 1.0 | 1.0 |
2.9215 | 10.0 | 475 | 2.9144 | 1.0 | 1.0 |
2.9094 | 10.99 | 522 | 2.8511 | 1.0 | 1.0 |
2.9094 | 12.0 | 570 | 2.5570 | 1.0 | 1.0 |
2.7275 | 12.99 | 617 | 1.7629 | 1.0 | 0.5226 |
2.7275 | 14.0 | 665 | 0.9919 | 0.9607 | 0.2330 |
1.6023 | 14.99 | 712 | 0.6021 | 0.3156 | 0.0820 |
1.6023 | 16.0 | 760 | 0.4921 | 0.2272 | 0.0629 |
0.8405 | 16.99 | 807 | 0.4294 | 0.2019 | 0.0567 |
0.8405 | 18.0 | 855 | 0.3899 | 0.1836 | 0.0508 |
0.6089 | 18.99 | 902 | 0.3706 | 0.1826 | 0.0519 |
0.6089 | 20.0 | 950 | 0.3401 | 0.1675 | 0.0477 |
0.6089 | 20.99 | 997 | 0.3238 | 0.1686 | 0.0483 |
0.5179 | 22.0 | 1045 | 0.3057 | 0.1594 | 0.0465 |
0.5179 | 22.99 | 1092 | 0.2963 | 0.1513 | 0.0445 |
0.459 | 24.0 | 1140 | 0.2879 | 0.1486 | 0.0431 |
0.459 | 24.99 | 1187 | 0.2742 | 0.1379 | 0.0402 |
0.3977 | 26.0 | 1235 | 0.2674 | 0.1346 | 0.0397 |
0.3977 | 26.99 | 1282 | 0.2697 | 0.1357 | 0.0405 |
0.3364 | 28.0 | 1330 | 0.2657 | 0.1319 | 0.0387 |
0.3364 | 28.99 | 1377 | 0.2529 | 0.1287 | 0.0376 |
0.3128 | 30.0 | 1425 | 0.2472 | 0.1239 | 0.0366 |
0.3128 | 30.99 | 1472 | 0.2476 | 0.1244 | 0.0372 |
0.2828 | 32.0 | 1520 | 0.2490 | 0.1265 | 0.0377 |
0.2828 | 32.99 | 1567 | 0.2406 | 0.1206 | 0.0361 |
0.2832 | 34.0 | 1615 | 0.2442 | 0.1212 | 0.0367 |
0.2832 | 34.99 | 1662 | 0.2406 | 0.1249 | 0.0370 |
0.2588 | 36.0 | 1710 | 0.2421 | 0.1174 | 0.0361 |
0.2588 | 36.99 | 1757 | 0.2427 | 0.1136 | 0.0350 |
0.2589 | 38.0 | 1805 | 0.2414 | 0.1136 | 0.0354 |
0.2589 | 38.99 | 1852 | 0.2352 | 0.1131 | 0.0349 |
0.236 | 40.0 | 1900 | 0.2368 | 0.1152 | 0.0353 |
0.236 | 40.99 | 1947 | 0.2370 | 0.1109 | 0.0345 |
0.236 | 42.0 | 1995 | 0.2330 | 0.1109 | 0.0344 |
0.2387 | 42.99 | 2042 | 0.2264 | 0.1082 | 0.0337 |
0.2387 | 44.0 | 2090 | 0.2259 | 0.1120 | 0.0355 |
0.2188 | 44.99 | 2137 | 0.2264 | 0.1109 | 0.0349 |
0.2188 | 46.0 | 2185 | 0.2298 | 0.1120 | 0.0353 |
0.2239 | 46.99 | 2232 | 0.2240 | 0.1088 | 0.0345 |
0.2239 | 48.0 | 2280 | 0.2266 | 0.1061 | 0.0336 |
0.2214 | 48.99 | 2327 | 0.2237 | 0.1104 | 0.0343 |
0.2214 | 50.0 | 2375 | 0.2261 | 0.1099 | 0.0345 |
0.192 | 50.99 | 2422 | 0.2167 | 0.1034 | 0.0325 |
0.192 | 52.0 | 2470 | 0.2246 | 0.1077 | 0.0332 |
0.1946 | 52.99 | 2517 | 0.2212 | 0.1082 | 0.0336 |
0.1946 | 54.0 | 2565 | 0.2206 | 0.1045 | 0.0330 |
0.2046 | 54.99 | 2612 | 0.2178 | 0.1050 | 0.0326 |
0.2046 | 56.0 | 2660 | 0.2244 | 0.1045 | 0.0329 |
0.2051 | 56.99 | 2707 | 0.2196 | 0.1072 | 0.0333 |
0.2051 | 58.0 | 2755 | 0.2200 | 0.1045 | 0.0325 |
0.1786 | 58.99 | 2802 | 0.2256 | 0.1066 | 0.0331 |
0.1786 | 60.0 | 2850 | 0.2198 | 0.1072 | 0.0329 |
0.1786 | 60.99 | 2897 | 0.2179 | 0.1045 | 0.0325 |
0.174 | 62.0 | 2945 | 0.2180 | 0.1050 | 0.0329 |
0.174 | 62.99 | 2992 | 0.2166 | 0.1039 | 0.0330 |
0.1878 | 64.0 | 3040 | 0.2142 | 0.1039 | 0.0335 |
0.1878 | 64.99 | 3087 | 0.2149 | 0.1055 | 0.0330 |
0.1841 | 66.0 | 3135 | 0.2149 | 0.1012 | 0.0332 |
0.1841 | 66.99 | 3182 | 0.2153 | 0.1029 | 0.0326 |
0.1844 | 68.0 | 3230 | 0.2192 | 0.1002 | 0.0320 |
0.1844 | 68.99 | 3277 | 0.2214 | 0.1029 | 0.0327 |
0.1569 | 70.0 | 3325 | 0.2162 | 0.1034 | 0.0326 |
0.1569 | 70.99 | 3372 | 0.2160 | 0.1045 | 0.0325 |
0.1483 | 72.0 | 3420 | 0.2165 | 0.1034 | 0.0326 |
0.1483 | 72.99 | 3467 | 0.2148 | 0.1061 | 0.0328 |
0.1553 | 74.0 | 3515 | 0.2119 | 0.1039 | 0.0317 |
0.1553 | 74.99 | 3562 | 0.2102 | 0.0985 | 0.0307 |
0.1745 | 76.0 | 3610 | 0.2146 | 0.1023 | 0.0314 |
0.1745 | 76.99 | 3657 | 0.2103 | 0.1012 | 0.0312 |
0.1633 | 78.0 | 3705 | 0.2141 | 0.1012 | 0.0307 |
0.1633 | 78.99 | 3752 | 0.2121 | 0.0969 | 0.0303 |
0.1567 | 80.0 | 3800 | 0.2115 | 0.1007 | 0.0312 |
0.1567 | 80.99 | 3847 | 0.2113 | 0.0996 | 0.0312 |
0.1567 | 82.0 | 3895 | 0.2104 | 0.0996 | 0.0313 |
0.1707 | 82.99 | 3942 | 0.2111 | 0.0996 | 0.0312 |
0.1707 | 84.0 | 3990 | 0.2083 | 0.0980 | 0.0308 |
0.1683 | 84.99 | 4037 | 0.2093 | 0.0991 | 0.0314 |
0.1683 | 86.0 | 4085 | 0.2116 | 0.0996 | 0.0317 |
0.1701 | 86.99 | 4132 | 0.2097 | 0.1002 | 0.0310 |
0.1701 | 88.0 | 4180 | 0.2106 | 0.1002 | 0.0311 |
0.1518 | 88.99 | 4227 | 0.2112 | 0.0991 | 0.0309 |
0.1518 | 90.0 | 4275 | 0.2105 | 0.0980 | 0.0307 |
0.1594 | 90.99 | 4322 | 0.2106 | 0.0980 | 0.0308 |
0.1594 | 92.0 | 4370 | 0.2092 | 0.0980 | 0.0305 |
0.1387 | 92.99 | 4417 | 0.2103 | 0.0991 | 0.0313 |
0.1387 | 94.0 | 4465 | 0.2109 | 0.0996 | 0.0311 |
0.1556 | 94.99 | 4512 | 0.2108 | 0.0980 | 0.0310 |
0.1556 | 96.0 | 4560 | 0.2109 | 0.0991 | 0.0311 |
0.1623 | 96.99 | 4607 | 0.2111 | 0.0975 | 0.0310 |
0.1623 | 98.0 | 4655 | 0.2112 | 0.0985 | 0.0312 |
0.1562 | 98.95 | 4700 | 0.2111 | 0.0985 | 0.0312 |
### Framework versions
- Transformers 4.28.0
- PyTorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3