# wav2vec2-large-xlsr-53-torgo-demo-f03-nolm

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0220
- Wer: 0.5303
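For context, the Wer above is word error rate: the word-level edit distance between the hypothesis and the reference, divided by the reference length, so 0.5303 means roughly 53% of reference words need a substitution, insertion, or deletion to match the model's output. A minimal self-contained sketch of the metric (libraries such as `jiwer` or `evaluate` compute the same quantity):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # Rolling one-row dynamic-programming edit distance over words.
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            cur = min(d[j] + 1,          # deletion of a reference word
                      d[j - 1] + 1,      # insertion of a hypothesis word
                      prev + (r != h))   # match or substitution
            prev, d[j] = d[j], cur
    return d[-1] / len(ref)

print(wer("the cat sat on the mat", "the cat sat"))  # 3 deletions / 6 words
```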
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
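With `lr_scheduler_type: linear` and 500 warmup steps, the learning rate ramps linearly from 0 to 1e-4 over the first 500 optimizer steps and then decays linearly toward 0 at the end of training. A sketch of that schedule, assuming the usual warmup-then-decay semantics (the 15,000 total steps is taken from the final row of the results table below and is approximate; the full 30-epoch run is slightly longer):

```python
def linear_lr(step: int, base_lr: float = 1e-4,
              warmup_steps: int = 500, total_steps: int = 15000) -> float:
    """Linear warmup to base_lr, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    # After warmup: decay linearly from base_lr down to 0.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_lr(250))    # halfway through warmup: half the peak rate
print(linear_lr(500))    # warmup complete: peak learning rate
print(linear_lr(15000))  # end of schedule: 0.0
```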
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 3.3935        | 0.97  | 500   | 4.2877          | 1.0    |
| 3.0106        | 1.94  | 1000  | 3.1321          | 1.0    |
| 2.9323        | 2.91  | 1500  | 3.2662          | 1.0    |
| 2.7359        | 3.88  | 2000  | 3.2751          | 1.0    |
| 2.4803        | 4.85  | 2500  | 2.5467          | 1.3920 |
| 2.0298        | 5.83  | 3000  | 1.8645          | 1.3701 |
| 1.4024        | 6.8   | 3500  | 1.1290          | 1.3445 |
| 1.1126        | 7.77  | 4000  | 0.7792          | 1.2031 |
| 0.8674        | 8.74  | 4500  | 0.5452          | 1.0921 |
| 0.668         | 9.71  | 5000  | 0.3812          | 0.9847 |
| 0.5972        | 10.68 | 5500  | 0.3118          | 0.9197 |
| 0.5227        | 11.65 | 6000  | 0.2359          | 0.8489 |
| 0.4032        | 12.62 | 6500  | 0.1690          | 0.7781 |
| 0.3923        | 13.59 | 7000  | 0.1490          | 0.7383 |
| 0.3964        | 14.56 | 7500  | 0.1274          | 0.7069 |
| 0.2967        | 15.53 | 8000  | 0.1158          | 0.6881 |
| 0.2989        | 16.5  | 8500  | 0.0884          | 0.6520 |
| 0.274         | 17.48 | 9000  | 0.0875          | 0.6510 |
| 0.2968        | 18.45 | 9500  | 0.0701          | 0.6314 |
| 0.1968        | 19.42 | 10000 | 0.0666          | 0.6117 |
| 0.2212        | 20.39 | 10500 | 0.0556          | 0.5985 |
| 0.2104        | 21.36 | 11000 | 0.0493          | 0.5865 |
| 0.1999        | 22.33 | 11500 | 0.0409          | 0.5701 |
| 0.2085        | 23.3  | 12000 | 0.0412          | 0.5690 |
| 0.1842        | 24.27 | 12500 | 0.0338          | 0.5528 |
| 0.1588        | 25.24 | 13000 | 0.0324          | 0.5540 |
| 0.1558        | 26.21 | 13500 | 0.0284          | 0.5454 |
| 0.1505        | 27.18 | 14000 | 0.0262          | 0.5350 |
| 0.1741        | 28.16 | 14500 | 0.0232          | 0.5291 |
| 0.158         | 29.13 | 15000 | 0.0220          | 0.5303 |
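Wav2vec2 checkpoints fine-tuned this way emit per-frame character logits that are decoded with CTC (the `-nolm` suffix indicates no external language model, i.e. plain greedy decoding). A toy sketch of the greedy CTC step — argmax per frame, collapse consecutive repeats, drop blanks — using a hypothetical vocabulary (a real checkpoint's vocabulary comes from its tokenizer):

```python
def ctc_greedy_decode(frame_ids, id2char, blank_id=0):
    """Collapse repeated frame predictions, then drop CTC blank tokens."""
    out, prev = [], None
    for i in frame_ids:
        if i != prev and i != blank_id:
            out.append(id2char[i])
        prev = i
    return "".join(out)

# Hypothetical 4-symbol vocabulary; "|" is wav2vec2's word delimiter.
vocab = {0: "<pad>", 1: "h", 2: "i", 3: "|"}
# Repeats collapse ("hh" -> "h"); a blank separates the two words' frames.
print(ctc_greedy_decode([1, 1, 0, 2, 2, 0, 0, 3, 1, 2], vocab))  # -> hi|hi
```

Replacing "|" with a space yields the final transcript ("hi hi" here).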
### Framework versions
- Transformers 4.23.1
- Pytorch 1.12.1+cu113
- Datasets 2.0.0
- Tokenizers 0.13.2