# wav2vec2-demo-M04-2
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 2.0168
- Wer: 1.2882
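Note that the reported WER is above 1.0, i.e. more than 100% word error rate. This is not a bug in the metric: WER = (S + D + I) / N, and insertions are unbounded, so the score exceeds 1.0 whenever the model emits substantially more words than the reference contains. A minimal sketch of the standard edit-distance WER computation (plain Python; in practice the metric here likely came from the `evaluate` or `jiwer` packages):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: (substitutions + deletions + insertions) / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Word-level Levenshtein distance via dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, `wer("a b", "x y z w")` is 4/2 = 2.0, i.e. 200%: two substitutions plus two insertions against a two-word reference.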
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
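Whatever the training data was, XLSR-53 checkpoints expect 16 kHz mono audio, so input at any other sample rate must be resampled first. A crude linear-interpolation sketch with NumPy, assuming float mono input (a real pipeline would use `torchaudio.transforms.Resample` or `librosa.resample`, which apply proper anti-aliasing):

```python
import numpy as np

def resample_linear(audio: np.ndarray, orig_sr: int, target_sr: int = 16_000) -> np.ndarray:
    """Resample mono audio by linear interpolation.

    Illustrative only: no anti-aliasing filter is applied, so downsampling
    real audio this way would alias; prefer torchaudio/librosa in practice.
    """
    if orig_sr == target_sr:
        return audio
    duration = len(audio) / orig_sr
    n_target = int(round(duration * target_sr))
    old_t = np.linspace(0.0, duration, num=len(audio), endpoint=False)
    new_t = np.linspace(0.0, duration, num=n_target, endpoint=False)
    return np.interp(new_t, old_t, audio).astype(np.float32)
```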
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
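The linear scheduler with 1000 warmup steps ramps the learning rate from 0 up to 1e-4 over the first 1000 optimizer steps, then decays it linearly back to 0 by the end of training (roughly 17,000 steps total, judging from the step column in the results table). A sketch of the schedule, mirroring the behavior of `get_linear_schedule_with_warmup` in Transformers (the 17,000-step total is an estimate, not a value stated by this card):

```python
def linear_lr(step: int, base_lr: float = 1e-4,
              warmup_steps: int = 1000, total_steps: int = 17_000) -> float:
    """Learning rate at a given step: linear warmup, then linear decay to 0."""
    if step < warmup_steps:
        # Warmup: scale linearly from 0 toward base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Decay: scale linearly from base_lr down to 0 at total_steps.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```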
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 21.8298       | 0.88  | 500   | 3.2643          | 1.0    |
| 3.2319        | 1.75  | 1000  | 2.8027          | 1.0    |
| 2.769         | 2.63  | 1500  | 2.4684          | 1.0    |
| 2.0823        | 3.5   | 2000  | 1.9137          | 1.6482 |
| 1.3094        | 4.38  | 2500  | 1.7267          | 1.6094 |
| 0.9654        | 5.25  | 3000  | 1.7523          | 1.4882 |
| 0.7505        | 6.13  | 3500  | 1.5588          | 1.5353 |
| 0.6364        | 7.01  | 4000  | 1.5428          | 1.4706 |
| 0.5307        | 7.88  | 4500  | 1.6277          | 1.4765 |
| 0.4664        | 8.76  | 5000  | 1.6817          | 1.3718 |
| 0.4243        | 9.63  | 5500  | 1.7682          | 1.4541 |
| 0.3911        | 10.51 | 6000  | 1.8567          | 1.4094 |
| 0.3555        | 11.38 | 6500  | 1.7248          | 1.3694 |
| 0.3252        | 12.26 | 7000  | 1.8712          | 1.4012 |
| 0.3072        | 13.13 | 7500  | 2.0088          | 1.4424 |
| 0.2956        | 14.01 | 8000  | 1.8649          | 1.3576 |
| 0.283         | 14.89 | 8500  | 1.8951          | 1.4035 |
| 0.2682        | 15.76 | 9000  | 1.8762          | 1.3976 |
| 0.2465        | 16.64 | 9500  | 1.8406          | 1.34   |
| 0.2344        | 17.51 | 10000 | 1.9975          | 1.3294 |
| 0.2269        | 18.39 | 10500 | 1.9207          | 1.3176 |
| 0.2053        | 19.26 | 11000 | 2.0406          | 1.3412 |
| 0.1934        | 20.14 | 11500 | 1.9039          | 1.2859 |
| 0.2018        | 21.02 | 12000 | 1.8337          | 1.3212 |
| 0.169         | 21.89 | 12500 | 1.9120          | 1.3071 |
| 0.1742        | 22.77 | 13000 | 2.0650          | 1.3153 |
| 0.1571        | 23.64 | 13500 | 2.0369          | 1.3165 |
| 0.1403        | 24.52 | 14000 | 2.0420          | 1.2894 |
| 0.1474        | 25.39 | 14500 | 1.9529          | 1.2847 |
| 0.1373        | 26.27 | 15000 | 2.0818          | 1.3129 |
| 0.1222        | 27.15 | 15500 | 1.9551          | 1.2753 |
| 0.1182        | 28.02 | 16000 | 2.0138          | 1.2659 |
| 0.1357        | 28.9  | 16500 | 1.9976          | 1.2859 |
| 0.1158        | 29.77 | 17000 | 2.0168          | 1.2882 |
### Framework versions
- Transformers 4.23.1
- Pytorch 1.12.1+cu113
- Datasets 1.18.3
- Tokenizers 0.13.2