# xlsr-syntesized-turkish-8-hour-hlr
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1445
- Wer: 0.1089
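Wer here is word error rate: the word-level edit distance between the reference and the model's transcript, divided by the number of reference words (so 0.1089 means roughly one word in nine is wrong). A minimal sketch of the metric using the standard Levenshtein recurrence — not the exact implementation used during evaluation:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,        # deletion
                dp[i][j - 1] + 1,        # insertion
                dp[i - 1][j - 1] + cost, # substitution or match
            )
    return dp[-1][-1] / len(ref)

# One of three reference words substituted -> 1/3
print(wer("merhaba nasılsın bugün", "merhaba nasilsin bugün"))
```

In practice libraries such as `jiwer` or the `evaluate` package compute this; the sketch only shows what the number means.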
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 20
- mixed_precision_training: Native AMP
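Two of these values follow from the others: the total train batch size is the per-device batch size times the gradient accumulation steps (2 × 4 = 8), and the linear scheduler ramps the learning rate from 0 to 1e-4 over the first 200 optimizer steps, then decays it linearly to 0. A small sketch of that schedule, assuming roughly 7600 total optimizer steps (the last step logged below):

```python
TRAIN_BATCH_SIZE = 2   # per-device batch size
GRAD_ACCUM_STEPS = 4   # gradient_accumulation_steps
# Effective batch size seen by the optimizer per update:
assert TRAIN_BATCH_SIZE * GRAD_ACCUM_STEPS == 8  # total_train_batch_size

BASE_LR = 1e-4
WARMUP_STEPS = 200
TOTAL_STEPS = 7600  # assumption: taken from the last logged step

def linear_lr(step: int) -> float:
    """Linear warmup to BASE_LR, then linear decay to 0
    (the shape of Transformers' 'linear' scheduler)."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / WARMUP_STEPS
    return BASE_LR * max(0, TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

print(linear_lr(100))   # halfway through warmup -> 5e-05
print(linear_lr(200))   # peak learning rate -> 0.0001
print(linear_lr(7600))  # end of training -> 0.0
```

In Transformers this shape comes from `get_linear_schedule_with_warmup`; the function above just makes the warmup/decay arithmetic explicit.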
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 6.1187 | 0.26 | 100 | 3.4197 | 1.0 |
| 3.1586 | 0.52 | 200 | 3.0503 | 1.0 |
| 2.3038 | 0.78 | 300 | 1.0263 | 0.8136 |
| 0.6749 | 1.04 | 400 | 0.3799 | 0.4198 |
| 0.4344 | 1.3 | 500 | 0.2965 | 0.3424 |
| 0.3568 | 1.56 | 600 | 0.2467 | 0.2779 |
| 0.3086 | 1.82 | 700 | 0.2358 | 0.2642 |
| 0.2715 | 2.08 | 800 | 0.1953 | 0.2363 |
| 0.2444 | 2.34 | 900 | 0.1857 | 0.2594 |
| 0.232 | 2.6 | 1000 | 0.1775 | 0.2223 |
| 0.2297 | 2.86 | 1100 | 0.1622 | 0.1979 |
| 0.1795 | 3.12 | 1200 | 0.1593 | 0.2007 |
| 0.1751 | 3.39 | 1300 | 0.1481 | 0.1823 |
| 0.1545 | 3.65 | 1400 | 0.1488 | 0.2003 |
| 0.165 | 3.91 | 1500 | 0.1350 | 0.1825 |
| 0.1438 | 4.17 | 1600 | 0.1293 | 0.1626 |
| 0.1194 | 4.43 | 1700 | 0.1240 | 0.1718 |
| 0.1291 | 4.69 | 1800 | 0.1220 | 0.1653 |
| 0.1184 | 4.95 | 1900 | 0.1217 | 0.1616 |
| 0.115 | 5.21 | 2000 | 0.1215 | 0.1443 |
| 0.0964 | 5.47 | 2100 | 0.1215 | 0.1463 |
| 0.0951 | 5.73 | 2200 | 0.1242 | 0.1478 |
| 0.1002 | 5.99 | 2300 | 0.1188 | 0.1533 |
| 0.0866 | 6.25 | 2400 | 0.1343 | 0.1550 |
| 0.0838 | 6.51 | 2500 | 0.1194 | 0.1428 |
| 0.0772 | 6.77 | 2600 | 0.1228 | 0.1478 |
| 0.0834 | 7.03 | 2700 | 0.1278 | 0.1493 |
| 0.0705 | 7.29 | 2800 | 0.1221 | 0.1578 |
| 0.0758 | 7.55 | 2900 | 0.1240 | 0.1409 |
| 0.0738 | 7.81 | 3000 | 0.1142 | 0.1415 |
| 0.0623 | 8.07 | 3100 | 0.1272 | 0.1307 |
| 0.0615 | 8.33 | 3200 | 0.1263 | 0.1292 |
| 0.0586 | 8.59 | 3300 | 0.1262 | 0.1317 |
| 0.066 | 8.85 | 3400 | 0.1370 | 0.1339 |
| 0.0558 | 9.11 | 3500 | 0.1226 | 0.1326 |
| 0.0546 | 9.38 | 3600 | 0.1308 | 0.1320 |
| 0.0582 | 9.64 | 3700 | 0.1311 | 0.1349 |
| 0.062 | 9.9 | 3800 | 0.1300 | 0.1252 |
| 0.0551 | 10.16 | 3900 | 0.1302 | 0.1261 |
| 0.0499 | 10.42 | 4000 | 0.1344 | 0.1240 |
| 0.046 | 10.68 | 4100 | 0.1219 | 0.1297 |
| 0.0513 | 10.94 | 4200 | 0.1292 | 0.1286 |
| 0.0481 | 11.2 | 4300 | 0.1339 | 0.1253 |
| 0.0458 | 11.46 | 4400 | 0.1243 | 0.1232 |
| 0.0458 | 11.72 | 4500 | 0.1352 | 0.1263 |
| 0.0469 | 11.98 | 4600 | 0.1308 | 0.1191 |
| 0.04 | 12.24 | 4700 | 0.1490 | 0.1229 |
| 0.0409 | 12.5 | 4800 | 0.1376 | 0.1221 |
| 0.0382 | 12.76 | 4900 | 0.1358 | 0.1225 |
| 0.0468 | 13.02 | 5000 | 0.1265 | 0.1224 |
| 0.0387 | 13.28 | 5100 | 0.1443 | 0.1178 |
| 0.0369 | 13.54 | 5200 | 0.1426 | 0.1178 |
| 0.0368 | 13.8 | 5300 | 0.1398 | 0.1209 |
| 0.0372 | 14.06 | 5400 | 0.1431 | 0.1186 |
| 0.0388 | 14.32 | 5500 | 0.1526 | 0.1187 |
| 0.034 | 14.58 | 5600 | 0.1355 | 0.1154 |
| 0.0339 | 14.84 | 5700 | 0.1469 | 0.1157 |
| 0.0332 | 15.1 | 5800 | 0.1411 | 0.1166 |
| 0.0345 | 15.36 | 5900 | 0.1469 | 0.1150 |
| 0.0356 | 15.62 | 6000 | 0.1460 | 0.1144 |
| 0.0341 | 15.89 | 6100 | 0.1362 | 0.1154 |
| 0.0327 | 16.15 | 6200 | 0.1462 | 0.1139 |
| 0.0292 | 16.41 | 6300 | 0.1457 | 0.1138 |
| 0.0296 | 16.67 | 6400 | 0.1450 | 0.1123 |
| 0.0312 | 16.93 | 6500 | 0.1385 | 0.1116 |
| 0.033 | 17.19 | 6600 | 0.1394 | 0.1099 |
| 0.0283 | 17.45 | 6700 | 0.1454 | 0.1105 |
| 0.0291 | 17.71 | 6800 | 0.1400 | 0.1095 |
| 0.0261 | 17.97 | 6900 | 0.1458 | 0.1107 |
| 0.0283 | 18.23 | 7000 | 0.1398 | 0.1089 |
| 0.0285 | 18.49 | 7100 | 0.1376 | 0.1095 |
| 0.0284 | 18.75 | 7200 | 0.1429 | 0.1092 |
| 0.0292 | 19.01 | 7300 | 0.1434 | 0.1076 |
| 0.0282 | 19.27 | 7400 | 0.1437 | 0.1079 |
| 0.0293 | 19.53 | 7500 | 0.1440 | 0.1084 |
| 0.0251 | 19.79 | 7600 | 0.1445 | 0.1089 |
### Framework versions
- Transformers 4.26.0
- Pytorch 2.1.0+cu118
- Datasets 2.9.0
- Tokenizers 0.13.3