# my_awesome_asr_mind_model_m
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 6.1679
- Wer: 1.1256
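
Since the card does not yet include a usage snippet, here is a minimal inference sketch. It assumes the checkpoint was saved to (or pushed to the Hub as) `my_awesome_asr_mind_model_m` and that input audio is 16 kHz mono float32, which is what `facebook/wav2vec2-base` expects; the placeholder audio below stands in for a real recording.

```python
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Assumed checkpoint path: substitute the actual output directory or Hub repo id.
checkpoint = "my_awesome_asr_mind_model_m"
processor = Wav2Vec2Processor.from_pretrained(checkpoint)
model = Wav2Vec2ForCTC.from_pretrained(checkpoint)
model.eval()

# Placeholder audio: one second of 16 kHz mono silence, for illustration only.
speech = np.zeros(16_000, dtype=np.float32)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```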
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- training_steps: 26000
- mixed_precision_training: Native AMP
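
For reference, here is a sketch of the `TrainingArguments` implied by the list above (argument names per Transformers 4.28; the output directory and the rest of the training script are assumptions). No explicit optimizer argument is needed because Adam with betas=(0.9,0.999) and epsilon=1e-08 matches the Trainer's default AdamW configuration.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="my_awesome_asr_mind_model_m",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=2000,
    max_steps=26000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```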
### Training results
| Training Loss | Epoch  | Step  | Validation Loss | Wer    |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 3.4698        | 200.0  | 1000  | 3.4251          | 1.0    |
| 2.8391        | 400.0  | 2000  | 2.9903          | 1.0    |
| 2.5123        | 600.0  | 3000  | 2.9246          | 1.0    |
| 2.0536        | 800.0  | 4000  | 2.9389          | 0.9955 |
| 1.4949        | 1000.0 | 5000  | 3.1124          | 0.9776 |
| 1.0579        | 1200.0 | 6000  | 3.6074          | 1.0179 |
| 0.7578        | 1400.0 | 7000  | 3.8384          | 1.0314 |
| 0.5393        | 1600.0 | 8000  | 4.2573          | 1.0448 |
| 0.3935        | 1800.0 | 9000  | 4.4847          | 1.0448 |
| 0.3049        | 2000.0 | 10000 | 5.0460          | 1.0852 |
| 0.2482        | 2200.0 | 11000 | 4.9277          | 1.0807 |
| 0.2027        | 2400.0 | 12000 | 5.3025          | 1.1121 |
| 0.1653        | 2600.0 | 13000 | 5.5421          | 1.1031 |
| 0.1496        | 2800.0 | 14000 | 5.6074          | 1.0762 |
| 0.1221        | 3000.0 | 15000 | 5.4166          | 1.1031 |
| 0.1121        | 3200.0 | 16000 | 5.8223          | 1.1300 |
| 0.097         | 3400.0 | 17000 | 5.8823          | 1.1525 |
| 0.0908        | 3600.0 | 18000 | 5.8555          | 1.1211 |
| 0.0853        | 3800.0 | 19000 | 6.0204          | 1.1256 |
| 0.0743        | 4000.0 | 20000 | 5.8911          | 1.1076 |
| 0.0735        | 4200.0 | 21000 | 6.1458          | 1.1076 |
| 0.0701        | 4400.0 | 22000 | 6.1057          | 1.1166 |
| 0.0657        | 4600.0 | 23000 | 6.0016          | 1.1031 |
| 0.0659        | 4800.0 | 24000 | 6.0810          | 1.1256 |
| 0.0621        | 5000.0 | 25000 | 6.1174          | 1.1256 |
| 0.0602        | 5200.0 | 26000 | 6.1679          | 1.1256 |
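
The Wer column can be reproduced with the `evaluate` library's `wer` metric; the prediction/reference strings below are illustrative, not drawn from the actual evaluation set. Note that a WER above 1.0, as in the later rows, is possible because insertion errors are counted in the numerator while the denominator is the reference word count.

```python
import evaluate

wer_metric = evaluate.load("wer")

# Hypothetical prediction/reference pair, for illustration only.
predictions = ["the cat sat on on the mat"]
references = ["the cat sat on the mat"]

# WER = (substitutions + deletions + insertions) / reference word count,
# so insertion-heavy output can push the score above 1.0.
print(wer_metric.compute(predictions=predictions, references=references))  # ≈ 0.167
```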
### Framework versions
- Transformers 4.28.0
- PyTorch 2.0.1
- Datasets 2.14.5
- Tokenizers 0.13.3