# wav2vec2-burak-new-300-v2-4
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.3402
- Wer: 0.2237
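The Wer figure above is the word error rate: the word-level edit distance between the model's transcription and the reference, divided by the number of reference words. During training it is typically computed with a metrics library such as `jiwer` or `evaluate`; as an illustration only (the example strings below are hypothetical, not from the evaluation set), a minimal pure-Python sketch:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # One-row dynamic-programming table over the hypothesis words.
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev = d[0]          # old d[i-1][j-1]
        d[0] = i             # deleting the first i reference words
        for j, h in enumerate(hyp, 1):
            cur = d[j]
            # deletion, insertion, or substitution/match
            d[j] = min(d[j] + 1, d[j - 1] + 1, prev + (r != h))
            prev = cur
    return d[-1] / len(ref)

print(wer("the cat sat", "the cat sat"))   # identical -> 0.0
print(wer("a b c d", "a b x d"))           # one substitution in four words -> 0.25
```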
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 131
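The hyperparameters above map onto a `transformers.TrainingArguments` configuration roughly as follows. This is a sketch, not the script that produced this model: the `output_dir`, evaluation cadence, and any options not listed above are assumptions, and the Adam betas/epsilon shown in the list are the `TrainingArguments` defaults.

```python
from transformers import TrainingArguments

# Sketch of a configuration matching the listed hyperparameters.
# output_dir, evaluation_strategy, and eval_steps are assumptions
# (the results table reports an evaluation every 500 steps).
training_args = TrainingArguments(
    output_dir="wav2vec2-burak-new-300-v2-4",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=131,
    evaluation_strategy="steps",  # assumed
    eval_steps=500,               # assumed
)
```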
### Training results
| Training Loss | Epoch  | Step  | Validation Loss | Wer    |
|:-------------:|:------:|:-----:|:---------------:|:------:|
7.7711 | 2.45 | 500 | 3.1768 | 1.0 |
3.1194 | 4.9 | 1000 | 2.6401 | 1.0 |
1.4593 | 7.35 | 1500 | 0.5243 | 0.5960 |
0.7581 | 9.8 | 2000 | 0.3534 | 0.4432 |
0.5843 | 12.25 | 2500 | 0.3159 | 0.4157 |
0.4703 | 14.71 | 3000 | 0.3003 | 0.3668 |
0.4045 | 17.16 | 3500 | 0.2891 | 0.3414 |
0.3581 | 19.61 | 4000 | 0.2609 | 0.3207 |
0.3268 | 22.06 | 4500 | 0.2622 | 0.3207 |
0.3063 | 24.51 | 5000 | 0.2805 | 0.3193 |
0.2729 | 26.96 | 5500 | 0.2674 | 0.2884 |
0.249 | 29.41 | 6000 | 0.2740 | 0.2953 |
0.2275 | 31.86 | 6500 | 0.2729 | 0.2753 |
0.2295 | 34.31 | 7000 | 0.2801 | 0.2691 |
0.2105 | 36.76 | 7500 | 0.2992 | 0.2801 |
0.1905 | 39.22 | 8000 | 0.2967 | 0.2663 |
0.1884 | 41.67 | 8500 | 0.2911 | 0.2691 |
0.1773 | 44.12 | 9000 | 0.2966 | 0.2753 |
0.1672 | 46.57 | 9500 | 0.3051 | 0.2505 |
0.1632 | 49.02 | 10000 | 0.2872 | 0.2491 |
0.1553 | 51.47 | 10500 | 0.3121 | 0.2629 |
0.1619 | 53.92 | 11000 | 0.3044 | 0.2581 |
0.1444 | 56.37 | 11500 | 0.3135 | 0.2567 |
0.1451 | 58.82 | 12000 | 0.3033 | 0.2519 |
0.1386 | 61.27 | 12500 | 0.3079 | 0.2622 |
0.1261 | 63.73 | 13000 | 0.3037 | 0.2395 |
0.1287 | 66.18 | 13500 | 0.3221 | 0.2409 |
0.1236 | 68.63 | 14000 | 0.3179 | 0.2464 |
0.1215 | 71.08 | 14500 | 0.3521 | 0.2429 |
0.1208 | 73.53 | 15000 | 0.3481 | 0.2540 |
0.1128 | 75.98 | 15500 | 0.3288 | 0.2402 |
0.1108 | 78.43 | 16000 | 0.3238 | 0.2450 |
0.1074 | 80.88 | 16500 | 0.3178 | 0.2416 |
0.1086 | 83.33 | 17000 | 0.3461 | 0.2361 |
0.1059 | 85.78 | 17500 | 0.3342 | 0.2457 |
0.0981 | 88.24 | 18000 | 0.3382 | 0.2354 |
0.0995 | 90.69 | 18500 | 0.3466 | 0.2416 |
0.0995 | 93.14 | 19000 | 0.3326 | 0.2271 |
0.0929 | 95.59 | 19500 | 0.3526 | 0.2237 |
0.0944 | 98.04 | 20000 | 0.3516 | 0.2347 |
0.089 | 100.49 | 20500 | 0.3504 | 0.2271 |
0.0915 | 102.94 | 21000 | 0.3425 | 0.2285 |
0.0845 | 105.39 | 21500 | 0.3309 | 0.2306 |
0.0887 | 107.84 | 22000 | 0.3196 | 0.2264 |
0.0812 | 110.29 | 22500 | 0.3285 | 0.2264 |
0.0856 | 112.75 | 23000 | 0.3347 | 0.2251 |
0.0778 | 115.2 | 23500 | 0.3403 | 0.2271 |
0.0748 | 117.65 | 24000 | 0.3427 | 0.2278 |
0.0803 | 120.1 | 24500 | 0.3380 | 0.2223 |
0.0768 | 122.55 | 25000 | 0.3392 | 0.2189 |
0.0764 | 125.0 | 25500 | 0.3423 | 0.2278 |
0.0786 | 127.45 | 26000 | 0.3423 | 0.2230 |
0.0766 | 129.9 | 26500 | 0.3402 | 0.2237 |
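The `linear` scheduler with 500 warmup steps means the learning rate climbs from 0 to the peak of 1e-4 over the first 500 optimizer steps, then decays linearly back to 0 by the end of training. The total step count below (~26,700) is an estimate extrapolated from the table (step 26,500 at epoch 129.9 of 131), not a recorded value; a minimal sketch of the schedule:

```python
def linear_lr(step: int,
              peak_lr: float = 1e-4,
              warmup_steps: int = 500,
              total_steps: int = 26700) -> float:
    """Linear warmup to peak_lr, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    # Linear decay over the remaining steps, clamped at 0.
    remaining = max(0.0, (total_steps - step) / (total_steps - warmup_steps))
    return peak_lr * remaining

print(linear_lr(0))      # start of warmup -> 0.0
print(linear_lr(500))    # end of warmup -> peak (1e-4)
print(linear_lr(26700))  # end of training -> 0.0
```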
### Framework versions
- Transformers 4.22.2
- Pytorch 1.12.1+cu113
- Datasets 2.5.2
- Tokenizers 0.12.1