# wav2vec2-base-timit-demo-google-colab
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the TIMIT dataset. It achieves the following results on the evaluation set:
- Loss: 0.5816
- Wer: 0.3533
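
For reference, a minimal inference sketch using the `transformers` CTC classes is shown below. The repo id and audio path are placeholders, `torchaudio` (not listed under the framework versions) is assumed for audio I/O, and the input is assumed to be 16 kHz mono audio, the rate `wav2vec2-base` was pretrained on:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical repo id -- substitute the actual location of this checkpoint.
MODEL_ID = "your-username/wav2vec2-base-timit-demo-google-colab"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load a mono waveform; wav2vec2-base expects 16 kHz input.
waveform, sample_rate = torchaudio.load("sample.wav")  # placeholder path
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary at every frame;
# batch_decode then collapses repeated tokens and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```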
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reconstruction sketch in code follows the list):
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
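
As a rough reconstruction, the list above maps onto a `TrainingArguments` configuration along the following lines. The output directory and the evaluation/save cadence are assumptions, not values recorded in this card:

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameter list above; output_dir and the
# eval/save cadence are assumed (the 500-step cadence is inferred from
# the results table).
training_args = TrainingArguments(
    output_dir="wav2vec2-base-timit-demo-google-colab",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    fp16=True,  # Native AMP mixed-precision training
    evaluation_strategy="steps",
    eval_steps=500,
    save_steps=500,
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's
# default optimizer settings, so no extra arguments are needed for it.
```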
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 2.243 | 0.5 | 500 | 1.0798 | 0.7752 |
| 0.834 | 1.01 | 1000 | 0.6206 | 0.5955 |
| 0.5503 | 1.51 | 1500 | 0.5387 | 0.5155 |
| 0.4548 | 2.01 | 2000 | 0.4660 | 0.4763 |
| 0.3412 | 2.51 | 2500 | 0.8381 | 0.4836 |
| 0.3128 | 3.02 | 3000 | 0.4818 | 0.4519 |
| 0.2547 | 3.52 | 3500 | 0.4415 | 0.4230 |
| 0.2529 | 4.02 | 4000 | 0.4624 | 0.4219 |
| 0.2103 | 4.52 | 4500 | 0.4714 | 0.4096 |
| 0.2102 | 5.03 | 5000 | 0.4968 | 0.4087 |
| 0.1838 | 5.53 | 5500 | 0.4643 | 0.4131 |
| 0.1721 | 6.03 | 6000 | 0.4676 | 0.3979 |
| 0.1548 | 6.53 | 6500 | 0.4765 | 0.4085 |
| 0.1595 | 7.04 | 7000 | 0.4797 | 0.3941 |
| 0.1399 | 7.54 | 7500 | 0.4753 | 0.3902 |
| 0.1368 | 8.04 | 8000 | 0.4697 | 0.3945 |
| 0.1276 | 8.54 | 8500 | 0.5438 | 0.3869 |
| 0.1255 | 9.05 | 9000 | 0.5660 | 0.3841 |
| 0.1077 | 9.55 | 9500 | 0.4964 | 0.3947 |
| 0.1197 | 10.05 | 10000 | 0.5349 | 0.3849 |
| 0.1014 | 10.55 | 10500 | 0.5558 | 0.3883 |
| 0.0949 | 11.06 | 11000 | 0.5673 | 0.3785 |
| 0.0882 | 11.56 | 11500 | 0.5589 | 0.3955 |
| 0.0906 | 12.06 | 12000 | 0.5752 | 0.4120 |
| 0.1064 | 12.56 | 12500 | 0.5080 | 0.3727 |
| 0.0854 | 13.07 | 13000 | 0.5398 | 0.3798 |
| 0.0754 | 13.57 | 13500 | 0.5237 | 0.3816 |
| 0.0791 | 14.07 | 14000 | 0.4967 | 0.3725 |
| 0.0731 | 14.57 | 14500 | 0.5287 | 0.3744 |
| 0.0719 | 15.08 | 15000 | 0.5633 | 0.3596 |
| 0.062 | 15.58 | 15500 | 0.5399 | 0.3752 |
| 0.0681 | 16.08 | 16000 | 0.5151 | 0.3759 |
| 0.0559 | 16.58 | 16500 | 0.5564 | 0.3709 |
| 0.0533 | 17.09 | 17000 | 0.5933 | 0.3743 |
| 0.0563 | 17.59 | 17500 | 0.5381 | 0.3670 |
| 0.0527 | 18.09 | 18000 | 0.5685 | 0.3731 |
| 0.0492 | 18.59 | 18500 | 0.5728 | 0.3725 |
| 0.0509 | 19.1 | 19000 | 0.6074 | 0.3807 |
| 0.0436 | 19.6 | 19500 | 0.5762 | 0.3628 |
| 0.0434 | 20.1 | 20000 | 0.6721 | 0.3729 |
| 0.0416 | 20.6 | 20500 | 0.5842 | 0.3700 |
| 0.0431 | 21.11 | 21000 | 0.5374 | 0.3607 |
| 0.037 | 21.61 | 21500 | 0.5556 | 0.3667 |
| 0.036 | 22.11 | 22000 | 0.5608 | 0.3592 |
| 0.04 | 22.61 | 22500 | 0.5272 | 0.3637 |
| 0.047 | 23.12 | 23000 | 0.5234 | 0.3625 |
| 0.0506 | 23.62 | 23500 | 0.5427 | 0.3629 |
| 0.0418 | 24.12 | 24000 | 0.5590 | 0.3626 |
| 0.037 | 24.62 | 24500 | 0.5615 | 0.3555 |
| 0.0429 | 25.13 | 25000 | 0.5806 | 0.3616 |
| 0.045 | 25.63 | 25500 | 0.5777 | 0.3639 |
| 0.0283 | 26.13 | 26000 | 0.5987 | 0.3617 |
| 0.0253 | 26.63 | 26500 | 0.5671 | 0.3551 |
| 0.032 | 27.14 | 27000 | 0.5464 | 0.3582 |
| 0.0321 | 27.64 | 27500 | 0.5634 | 0.3573 |
| 0.0274 | 28.14 | 28000 | 0.5513 | 0.3575 |
| 0.0245 | 28.64 | 28500 | 0.5745 | 0.3537 |
| 0.0251 | 29.15 | 29000 | 0.5759 | 0.3547 |
| 0.0222 | 29.65 | 29500 | 0.5816 | 0.3533 |
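
The Wer column is word error rate on the validation set. A minimal sketch of computing it with the `datasets` metric API follows (the version listed below ships a `wer` metric backed by `jiwer`); the strings are illustrative only:

```python
from datasets import load_metric

# The "wer" metric script requires the jiwer package at runtime.
wer_metric = load_metric("wer")

# Illustrative strings only (the first is TIMIT's well-known SA1 prompt).
references = ["she had your dark suit in greasy wash water all year"]
predictions = ["she had your dark suit in greasy wash water all year"]

# WER = (substitutions + insertions + deletions) / reference word count.
print(wer_metric.compute(predictions=predictions, references=references))  # 0.0
```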
### Framework versions
- Transformers 4.17.0
- PyTorch 1.11.0+cu113
- Datasets 1.18.3
- Tokenizers 0.12.1