# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-clean-02
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1158
- Wer: 0.0938
- Cer: 0.0252
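
The card does not yet include a usage example. As a minimal inference sketch, assuming the checkpoint follows the standard Wav2Vec2 CTC layout of its base model and that audio is resampled to 16 kHz (the file name and the exact Hub repo id below are placeholders):

```python
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Assumption: standard Wav2Vec2 CTC interface, as in the base model.
model_id = "wav2vec2-large-xlsr-mecita-coraa-portuguese-all-clean-02"  # replace with the full Hub repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an audio file and resample to the 16 kHz rate expected by XLSR models.
waveform, sample_rate = torchaudio.load("example.wav")  # hypothetical input file
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```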
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
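
For reference, a sketch of how these hyperparameters map onto 🤗 `TrainingArguments`. The `output_dir` is illustrative and the original training script is not part of this card; Adam betas and epsilon match the optimizer defaults listed above.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
# output_dir is an assumption, not taken from the card.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-all-clean-02",
    learning_rate=3e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```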
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:---|:---|:---|:---|:---|:---|
25.9355 | 1.0 | 67 | 3.6433 | 1.0 | 1.0 |
9.9266 | 2.0 | 134 | 3.0004 | 1.0 | 1.0 |
3.0568 | 3.0 | 201 | 2.9226 | 1.0 | 1.0 |
3.0568 | 4.0 | 268 | 2.8955 | 1.0 | 1.0 |
2.9257 | 5.0 | 335 | 2.8838 | 1.0 | 1.0 |
2.8908 | 6.0 | 402 | 2.8047 | 1.0 | 1.0 |
2.8908 | 7.0 | 469 | 1.8146 | 1.0 | 0.5476 |
2.3697 | 8.0 | 536 | 0.6459 | 0.3310 | 0.0821 |
1.0965 | 9.0 | 603 | 0.4066 | 0.2195 | 0.0544 |
1.0965 | 10.0 | 670 | 0.3180 | 0.1903 | 0.0469 |
0.6221 | 11.0 | 737 | 0.2729 | 0.1719 | 0.0440 |
0.5212 | 12.0 | 804 | 0.2349 | 0.1584 | 0.0400 |
0.5212 | 13.0 | 871 | 0.2107 | 0.1389 | 0.0367 |
0.4267 | 14.0 | 938 | 0.2016 | 0.1365 | 0.0365 |
0.371 | 15.0 | 1005 | 0.1938 | 0.1289 | 0.0354 |
0.371 | 16.0 | 1072 | 0.1803 | 0.1205 | 0.0323 |
0.3495 | 17.0 | 1139 | 0.1746 | 0.1167 | 0.0314 |
0.3183 | 18.0 | 1206 | 0.1715 | 0.1105 | 0.0312 |
0.3183 | 19.0 | 1273 | 0.1586 | 0.1063 | 0.0297 |
0.2914 | 20.0 | 1340 | 0.1537 | 0.1014 | 0.0280 |
0.2909 | 21.0 | 1407 | 0.1475 | 0.1004 | 0.0280 |
0.2909 | 22.0 | 1474 | 0.1434 | 0.1056 | 0.0284 |
0.2614 | 23.0 | 1541 | 0.1428 | 0.1073 | 0.0285 |
0.2434 | 24.0 | 1608 | 0.1444 | 0.1025 | 0.0282 |
0.2434 | 25.0 | 1675 | 0.1413 | 0.1000 | 0.0275 |
0.2423 | 26.0 | 1742 | 0.1412 | 0.1000 | 0.0278 |
0.2297 | 27.0 | 1809 | 0.1385 | 0.0948 | 0.0258 |
0.2297 | 28.0 | 1876 | 0.1371 | 0.0966 | 0.0268 |
0.211 | 29.0 | 1943 | 0.1359 | 0.1000 | 0.0274 |
0.2101 | 30.0 | 2010 | 0.1310 | 0.0966 | 0.0260 |
0.2101 | 31.0 | 2077 | 0.1319 | 0.0986 | 0.0277 |
0.2004 | 32.0 | 2144 | 0.1292 | 0.0924 | 0.0256 |
0.2064 | 33.0 | 2211 | 0.1296 | 0.1000 | 0.0268 |
0.2064 | 34.0 | 2278 | 0.1313 | 0.0990 | 0.0266 |
0.1841 | 35.0 | 2345 | 0.1260 | 0.0952 | 0.0256 |
0.1855 | 36.0 | 2412 | 0.1265 | 0.0973 | 0.0263 |
0.1855 | 37.0 | 2479 | 0.1283 | 0.0959 | 0.0257 |
0.1786 | 38.0 | 2546 | 0.1265 | 0.0966 | 0.0265 |
0.1691 | 39.0 | 2613 | 0.1242 | 0.0893 | 0.0246 |
0.1691 | 40.0 | 2680 | 0.1264 | 0.0959 | 0.0262 |
0.1694 | 41.0 | 2747 | 0.1263 | 0.0945 | 0.0262 |
0.1654 | 42.0 | 2814 | 0.1263 | 0.0969 | 0.0261 |
0.1654 | 43.0 | 2881 | 0.1262 | 0.0948 | 0.0257 |
0.1624 | 44.0 | 2948 | 0.1234 | 0.0920 | 0.0247 |
0.1705 | 45.0 | 3015 | 0.1242 | 0.0966 | 0.0257 |
0.1705 | 46.0 | 3082 | 0.1224 | 0.0959 | 0.0252 |
0.1664 | 47.0 | 3149 | 0.1278 | 0.0962 | 0.0261 |
0.1603 | 48.0 | 3216 | 0.1231 | 0.0948 | 0.0258 |
0.1603 | 49.0 | 3283 | 0.1273 | 0.0955 | 0.0257 |
0.1448 | 50.0 | 3350 | 0.1224 | 0.0924 | 0.0248 |
0.1764 | 51.0 | 3417 | 0.1198 | 0.0945 | 0.0247 |
0.1764 | 52.0 | 3484 | 0.1227 | 0.0986 | 0.0260 |
0.1454 | 53.0 | 3551 | 0.1205 | 0.0952 | 0.0249 |
0.1539 | 54.0 | 3618 | 0.1203 | 0.0934 | 0.0245 |
0.1539 | 55.0 | 3685 | 0.1225 | 0.0973 | 0.0252 |
0.1532 | 56.0 | 3752 | 0.1214 | 0.0948 | 0.0255 |
0.142 | 57.0 | 3819 | 0.1194 | 0.0941 | 0.0253 |
0.142 | 58.0 | 3886 | 0.1200 | 0.0969 | 0.0250 |
0.1512 | 59.0 | 3953 | 0.1178 | 0.0917 | 0.0249 |
0.1276 | 60.0 | 4020 | 0.1158 | 0.0938 | 0.0252 |
0.1276 | 61.0 | 4087 | 0.1189 | 0.0896 | 0.0249 |
0.1436 | 62.0 | 4154 | 0.1223 | 0.0931 | 0.0254 |
0.1418 | 63.0 | 4221 | 0.1216 | 0.0938 | 0.0254 |
0.1418 | 64.0 | 4288 | 0.1205 | 0.0917 | 0.0252 |
0.1449 | 65.0 | 4355 | 0.1222 | 0.0900 | 0.0247 |
0.1392 | 66.0 | 4422 | 0.1215 | 0.0955 | 0.0256 |
0.1392 | 67.0 | 4489 | 0.1222 | 0.0955 | 0.0255 |
0.1293 | 68.0 | 4556 | 0.1204 | 0.0934 | 0.0254 |
0.1329 | 69.0 | 4623 | 0.1259 | 0.0924 | 0.0250 |
0.1329 | 70.0 | 4690 | 0.1274 | 0.0952 | 0.0256 |
0.1264 | 71.0 | 4757 | 0.1247 | 0.0903 | 0.0252 |
0.1396 | 72.0 | 4824 | 0.1205 | 0.0924 | 0.0252 |
0.1396 | 73.0 | 4891 | 0.1202 | 0.0945 | 0.0252 |
0.1399 | 74.0 | 4958 | 0.1192 | 0.0938 | 0.0252 |
0.1285 | 75.0 | 5025 | 0.1194 | 0.0962 | 0.0254 |
0.1285 | 76.0 | 5092 | 0.1193 | 0.0945 | 0.0254 |
0.1245 | 77.0 | 5159 | 0.1186 | 0.0910 | 0.0246 |
0.1254 | 78.0 | 5226 | 0.1181 | 0.0917 | 0.0245 |
0.1254 | 79.0 | 5293 | 0.1187 | 0.0903 | 0.0241 |
0.1334 | 80.0 | 5360 | 0.1189 | 0.0917 | 0.0247 |
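
The Wer and Cer columns are word error rate and character error rate on the validation set. A minimal sketch of how such scores can be computed from reference/hypothesis pairs, assuming the 🤗 `evaluate` package (the card does not state which implementation was actually used; the example sentences are illustrative):

```python
import evaluate

# Assumption: WER/CER computed with the `evaluate` package (requires `jiwer`).
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["o menino leu o livro"]    # ground-truth transcripts (illustrative)
predictions = ["o menino leu um livro"]  # model outputs (illustrative)

wer = wer_metric.compute(references=references, predictions=predictions)
cer = cer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```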
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3