# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-clean-09
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1098
- Wer: 0.0866
- Cer: 0.0228
## Model description
More information needed
## Intended uses & limitations
More information needed
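Although the intended uses are not documented, the checkpoint can be loaded like any other wav2vec2 CTC model for Portuguese speech recognition. The following is a minimal sketch, assuming the repository id matches the model name in the title and that the input audio is 16 kHz mono; adjust both to your setup.

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Hypothetical repository id taken from the model name above; replace with the
# actual Hub id (including the user/organization namespace) if it differs.
MODEL_ID = "wav2vec2-large-xlsr-mecita-coraa-portuguese-all-clean-09"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load and resample the audio to the 16 kHz sampling rate expected by wav2vec2.
speech, _ = librosa.load("example.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse
# repeats and blanks during batch_decode.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```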
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a rough `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
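As a rough sketch, the hyperparameters above map onto a `transformers.TrainingArguments` configuration along the following lines; the output directory and evaluation strategy are assumptions and were not part of the original run log.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-all-clean-09",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size of 32
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                       # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",     # assumption: the results table shows one eval per epoch
)
```

The Adam betas (0.9, 0.999) and epsilon (1e-08) listed above are the `TrainingArguments` defaults, so they do not need to be set explicitly.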
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
---|---|---|---|---|---|
29.0965 | 1.0 | 67 | 8.2084 | 1.0 | 0.9866 |
12.5066 | 2.0 | 134 | 5.8509 | 1.0 | 0.9760 |
5.3711 | 3.0 | 201 | 3.0401 | 1.0 | 1.0 |
5.3711 | 4.0 | 268 | 2.9035 | 1.0 | 1.0 |
2.973 | 5.0 | 335 | 2.8834 | 1.0 | 1.0 |
2.9146 | 6.0 | 402 | 2.8791 | 1.0 | 1.0 |
2.9146 | 7.0 | 469 | 2.8697 | 1.0 | 1.0 |
2.8879 | 8.0 | 536 | 2.7420 | 1.0 | 1.0 |
2.7039 | 9.0 | 603 | 2.0292 | 0.9997 | 0.8203 |
2.7039 | 10.0 | 670 | 0.7833 | 0.9461 | 0.2105 |
1.5273 | 11.0 | 737 | 0.4157 | 0.2626 | 0.0603 |
0.744 | 12.0 | 804 | 0.2995 | 0.1683 | 0.0441 |
0.744 | 13.0 | 871 | 0.2560 | 0.1587 | 0.0422 |
0.5569 | 14.0 | 938 | 0.2261 | 0.1425 | 0.0381 |
0.462 | 15.0 | 1005 | 0.2143 | 0.1366 | 0.0372 |
0.462 | 16.0 | 1072 | 0.1977 | 0.1270 | 0.0339 |
0.3805 | 17.0 | 1139 | 0.1815 | 0.1227 | 0.0324 |
0.3617 | 18.0 | 1206 | 0.1712 | 0.1118 | 0.0300 |
0.3617 | 19.0 | 1273 | 0.1592 | 0.1095 | 0.0288 |
0.3152 | 20.0 | 1340 | 0.1549 | 0.1091 | 0.0277 |
0.2768 | 21.0 | 1407 | 0.1455 | 0.1015 | 0.0266 |
0.2768 | 22.0 | 1474 | 0.1485 | 0.0992 | 0.0262 |
0.2779 | 23.0 | 1541 | 0.1418 | 0.1048 | 0.0271 |
0.2605 | 24.0 | 1608 | 0.1376 | 0.0926 | 0.0249 |
0.2605 | 25.0 | 1675 | 0.1358 | 0.0995 | 0.0253 |
0.259 | 26.0 | 1742 | 0.1360 | 0.0956 | 0.0249 |
0.2669 | 27.0 | 1809 | 0.1362 | 0.0933 | 0.0251 |
0.2669 | 28.0 | 1876 | 0.1304 | 0.0956 | 0.0247 |
0.2454 | 29.0 | 1943 | 0.1289 | 0.0959 | 0.0248 |
0.2177 | 30.0 | 2010 | 0.1325 | 0.0959 | 0.0251 |
0.2177 | 31.0 | 2077 | 0.1260 | 0.0919 | 0.0242 |
0.2292 | 32.0 | 2144 | 0.1239 | 0.0919 | 0.0249 |
0.2058 | 33.0 | 2211 | 0.1259 | 0.0893 | 0.0243 |
0.2058 | 34.0 | 2278 | 0.1217 | 0.0853 | 0.0231 |
0.1865 | 35.0 | 2345 | 0.1214 | 0.0883 | 0.0236 |
0.204 | 36.0 | 2412 | 0.1195 | 0.0843 | 0.0237 |
0.204 | 37.0 | 2479 | 0.1204 | 0.0883 | 0.0243 |
0.1856 | 38.0 | 2546 | 0.1210 | 0.0886 | 0.0244 |
0.2094 | 39.0 | 2613 | 0.1188 | 0.0866 | 0.0239 |
0.2094 | 40.0 | 2680 | 0.1144 | 0.0860 | 0.0231 |
0.1772 | 41.0 | 2747 | 0.1137 | 0.0866 | 0.0235 |
0.1768 | 42.0 | 2814 | 0.1168 | 0.0886 | 0.0240 |
0.1768 | 43.0 | 2881 | 0.1169 | 0.0896 | 0.0240 |
0.1868 | 44.0 | 2948 | 0.1148 | 0.0896 | 0.0239 |
0.163 | 45.0 | 3015 | 0.1146 | 0.0896 | 0.0235 |
0.163 | 46.0 | 3082 | 0.1141 | 0.0896 | 0.0240 |
0.1661 | 47.0 | 3149 | 0.1170 | 0.0903 | 0.0239 |
0.1626 | 48.0 | 3216 | 0.1149 | 0.0856 | 0.0231 |
0.1626 | 49.0 | 3283 | 0.1156 | 0.0843 | 0.0231 |
0.1611 | 50.0 | 3350 | 0.1161 | 0.0863 | 0.0235 |
0.1707 | 51.0 | 3417 | 0.1140 | 0.0840 | 0.0228 |
0.1707 | 52.0 | 3484 | 0.1129 | 0.0863 | 0.0236 |
0.1571 | 53.0 | 3551 | 0.1117 | 0.0856 | 0.0228 |
0.142 | 54.0 | 3618 | 0.1103 | 0.0810 | 0.0222 |
0.142 | 55.0 | 3685 | 0.1098 | 0.0866 | 0.0228 |
0.1486 | 56.0 | 3752 | 0.1102 | 0.0813 | 0.0224 |
0.1464 | 57.0 | 3819 | 0.1112 | 0.0823 | 0.0224 |
0.1464 | 58.0 | 3886 | 0.1149 | 0.0820 | 0.0226 |
0.143 | 59.0 | 3953 | 0.1126 | 0.0823 | 0.0227 |
0.1533 | 60.0 | 4020 | 0.1125 | 0.0823 | 0.0224 |
0.1533 | 61.0 | 4087 | 0.1118 | 0.0800 | 0.0221 |
0.1547 | 62.0 | 4154 | 0.1131 | 0.0790 | 0.0219 |
0.1313 | 63.0 | 4221 | 0.1111 | 0.0800 | 0.0223 |
0.1313 | 64.0 | 4288 | 0.1111 | 0.0810 | 0.0223 |
0.1573 | 65.0 | 4355 | 0.1115 | 0.0807 | 0.0222 |
0.1358 | 66.0 | 4422 | 0.1107 | 0.0823 | 0.0222 |
0.1358 | 67.0 | 4489 | 0.1137 | 0.0820 | 0.0223 |
0.1386 | 68.0 | 4556 | 0.1135 | 0.0820 | 0.0225 |
0.1351 | 69.0 | 4623 | 0.1123 | 0.0810 | 0.0220 |
0.1351 | 70.0 | 4690 | 0.1121 | 0.0830 | 0.0225 |
0.1312 | 71.0 | 4757 | 0.1129 | 0.0837 | 0.0226 |
0.1295 | 72.0 | 4824 | 0.1118 | 0.0804 | 0.0224 |
0.1295 | 73.0 | 4891 | 0.1120 | 0.0797 | 0.0222 |
0.15 | 74.0 | 4958 | 0.1127 | 0.0810 | 0.0224 |
0.1291 | 75.0 | 5025 | 0.1146 | 0.0790 | 0.0220 |
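The Wer and Cer columns above are word and character error rates on the validation set. A minimal sketch of how such metrics can be computed with the `evaluate` library is shown below; the reference and prediction strings are placeholders for illustration only, not data from this model.

```python
import evaluate

# Hypothetical reference transcripts and model predictions.
references = ["o menino leu o livro", "a escola fica perto"]
predictions = ["o menino leu o livro", "a escola fica perto de"]

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Each metric returns a single float: total edit distance divided by the total
# number of reference words (WER) or reference characters (CER).
wer = wer_metric.compute(predictions=predictions, references=references)
cer = cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```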
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3