# wav2vec2-base-timit-demo-colab
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the TIMIT dataset. It achieves the following results on the evaluation set:
- Loss: 0.4772
- Wer: 0.2821
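WER (word error rate) is the word-level Levenshtein distance between the model's transcription and the reference, divided by the number of reference words, so 0.2821 means roughly 28 words in error per 100 reference words. A minimal pure-Python sketch of the metric (evaluation toolkits such as `jiwer` or `evaluate` compute the same quantity):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = minimum edits to turn ref[:i] into hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            deletion = d[i - 1][j] + 1
            insertion = d[i][j - 1] + 1
            d[i][j] = min(substitution, deletion, insertion)
    return d[len(ref)][len(hyp)] / len(ref)
```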
## Model description
More information needed
## Intended uses & limitations
More information needed
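A hedged inference sketch for this checkpoint, assuming it is published under the card's name (substitute your own path or Hub id). Wav2Vec2-base expects 16 kHz mono audio; the heavy imports live inside the function so the constants can be inspected without loading the model:

```python
# Sketch only: the model id below is assumed from this card's title.
TARGET_SAMPLE_RATE = 16_000  # wav2vec2-base was pretrained on 16 kHz audio

def transcribe(wav_path: str, model_id: str = "wav2vec2-base-timit-demo-colab") -> str:
    """Transcribe one 16 kHz audio file with the fine-tuned CTC model."""
    import torch
    import soundfile as sf
    from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

    processor = Wav2Vec2Processor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)

    speech, sample_rate = sf.read(wav_path)
    assert sample_rate == TARGET_SAMPLE_RATE, "resample to 16 kHz first"

    inputs = processor(speech, sampling_rate=sample_rate, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```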
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
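The `linear` scheduler ramps the learning rate from 0 to 1e-4 over the first 1000 optimizer steps, then decays it linearly back to 0 at the end of training. A sketch of that schedule; `TOTAL_STEPS` is an assumption read off the results table below (~578 steps per epoch over 30 epochs), not a value stated in this card:

```python
BASE_LR = 1e-4       # learning_rate above
WARMUP_STEPS = 1_000  # lr_scheduler_warmup_steps above
TOTAL_STEPS = 17_340  # assumed: ~578 optimizer steps/epoch * 30 epochs

def linear_lr(step: int) -> float:
    """Learning rate at a given optimizer step under linear warmup + decay."""
    if step < WARMUP_STEPS:
        # linear warmup from 0 to BASE_LR
        return BASE_LR * step / WARMUP_STEPS
    # linear decay from BASE_LR down to 0 at TOTAL_STEPS
    return BASE_LR * max(0, TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)
```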
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 3.6949        | 0.87  | 500   | 2.4599          | 0.9999 |
| 0.9858        | 1.73  | 1000  | 0.5249          | 0.4674 |
| 0.4645        | 2.6   | 1500  | 0.4604          | 0.3900 |
| 0.3273        | 3.46  | 2000  | 0.3939          | 0.3612 |
| 0.2474        | 4.33  | 2500  | 0.4150          | 0.3560 |
| 0.2191        | 5.19  | 3000  | 0.3855          | 0.3344 |
| 0.1662        | 6.06  | 3500  | 0.3779          | 0.3258 |
| 0.1669        | 6.92  | 4000  | 0.4841          | 0.3286 |
| 0.151         | 7.79  | 4500  | 0.4182          | 0.3219 |
| 0.1175        | 8.65  | 5000  | 0.4194          | 0.3107 |
| 0.1103        | 9.52  | 5500  | 0.4256          | 0.3129 |
| 0.1           | 10.38 | 6000  | 0.4352          | 0.3089 |
| 0.0949        | 11.25 | 6500  | 0.4649          | 0.3160 |
| 0.0899        | 12.11 | 7000  | 0.4472          | 0.3065 |
| 0.0787        | 12.98 | 7500  | 0.4763          | 0.3128 |
| 0.0742        | 13.84 | 8000  | 0.4321          | 0.3034 |
| 0.067         | 14.71 | 8500  | 0.4562          | 0.3076 |
| 0.063         | 15.57 | 9000  | 0.4541          | 0.3102 |
| 0.0624        | 16.44 | 9500  | 0.5113          | 0.3040 |
| 0.0519        | 17.3  | 10000 | 0.4925          | 0.3008 |
| 0.0525        | 18.17 | 10500 | 0.4710          | 0.2987 |
| 0.046         | 19.03 | 11000 | 0.4781          | 0.2977 |
| 0.0455        | 19.9  | 11500 | 0.4572          | 0.2969 |
| 0.0394        | 20.76 | 12000 | 0.5256          | 0.2966 |
| 0.0373        | 21.63 | 12500 | 0.4723          | 0.2921 |
| 0.0375        | 22.49 | 13000 | 0.4640          | 0.2847 |
| 0.0334        | 23.36 | 13500 | 0.4740          | 0.2917 |
| 0.0304        | 24.22 | 14000 | 0.4817          | 0.2874 |
| 0.0291        | 25.09 | 14500 | 0.4722          | 0.2896 |
| 0.0247        | 25.95 | 15000 | 0.4765          | 0.2870 |
| 0.0223        | 26.82 | 15500 | 0.4728          | 0.2821 |
| 0.0223        | 27.68 | 16000 | 0.4690          | 0.2834 |
| 0.0207        | 28.55 | 16500 | 0.4706          | 0.2825 |
| 0.0186        | 29.41 | 17000 | 0.4772          | 0.2821 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.12.0+cu113
- Datasets 1.18.3
- Tokenizers 0.12.1