# wav2vec2-large-xls-r-romansh-colab
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice_13_0 dataset. It achieves the following results on the evaluation set:
- Loss: 0.3954
- WER: 0.3372
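
A minimal inference sketch is shown below for context. The repo id is a placeholder (it depends on where this checkpoint is published), and the audio is assumed to be resampled to the 16 kHz rate XLS-R expects.

```python
# Minimal CTC inference sketch. Assumptions: the repo id below is a placeholder,
# and the input clip is mono audio resampled to 16 kHz.
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "your-username/wav2vec2-large-xls-r-romansh-colab"  # placeholder repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load an example clip and resample it to the 16 kHz rate the model was trained on.
speech, _ = librosa.load("example.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the argmax over the vocabulary at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```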
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
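
The card does not state which Common Voice configuration was used. For context, a Romansh portion of Common Voice 13.0 could be loaded roughly as sketched below; the `rm-sursilv` config is an assumption (Common Voice also provides `rm-vallader`), and accessing the dataset on the Hub requires accepting its terms of use.

```python
# Hedged sketch: loading a Romansh Common Voice 13.0 config with the datasets library.
# The exact config used to train this model is not stated; "rm-sursilv" is an assumption.
from datasets import load_dataset, Audio

common_voice = load_dataset("mozilla-foundation/common_voice_13_0", "rm-sursilv")

# Cast the audio column to 16 kHz so it matches wav2vec2's expected sampling rate.
common_voice = common_voice.cast_column("audio", Audio(sampling_rate=16_000))

print(common_voice)
```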
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 30
- mixed_precision_training: Native AMP
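
As a rough reconstruction rather than the exact training script, these settings map onto Hugging Face `TrainingArguments` roughly as follows; the output directory is an assumption.

```python
# Hedged sketch: the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-romansh-colab",  # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size: 4 * 2 = 8
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_steps=50,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```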
### Training results
Training Loss | Epoch | Step | Validation Loss | WER |
---|---|---|---|---|
13.0876 | 0.38 | 50 | 4.9608 | 1.0 |
4.0129 | 0.76 | 100 | 3.1470 | 1.0 |
3.0629 | 1.14 | 150 | 3.0178 | 1.0 |
3.0276 | 1.52 | 200 | 3.0792 | 1.0 |
3.0054 | 1.9 | 250 | 2.9988 | 1.0 |
3.0259 | 2.29 | 300 | 3.0606 | 1.0 |
2.9827 | 2.67 | 350 | 3.0001 | 1.0 |
3.0394 | 3.05 | 400 | 3.0010 | 1.0 |
2.97 | 3.43 | 450 | 2.9920 | 1.0 |
3.0238 | 3.81 | 500 | 2.9967 | 1.0 |
2.9976 | 4.2 | 550 | 2.9906 | 1.0 |
3.0268 | 4.58 | 600 | 2.9893 | 1.0 |
2.9899 | 4.96 | 650 | 2.9904 | 1.0 |
3.0395 | 5.34 | 700 | 2.9889 | 1.0 |
2.9797 | 5.72 | 750 | 2.9912 | 1.0 |
3.002 | 6.11 | 800 | 2.9816 | 1.0 |
3.0043 | 6.49 | 850 | 2.9881 | 1.0 |
2.9599 | 6.87 | 900 | 2.9814 | 1.0 |
3.0148 | 7.25 | 950 | 2.9962 | 1.0 |
2.9611 | 7.63 | 1000 | 2.9739 | 1.0 |
3.0657 | 8.02 | 1050 | 2.9980 | 1.0 |
2.9598 | 8.4 | 1100 | 2.9725 | 1.0 |
2.9749 | 8.78 | 1150 | 2.9997 | 1.0 |
2.9801 | 9.16 | 1200 | 2.9577 | 1.0 |
2.9597 | 9.54 | 1250 | 2.9457 | 1.0 |
2.9335 | 9.92 | 1300 | 2.9349 | 1.0 |
2.9576 | 10.3 | 1350 | 2.9185 | 1.0 |
2.8992 | 10.68 | 1400 | 2.7701 | 1.0 |
2.7045 | 11.07 | 1450 | 2.2958 | 1.0 |
2.0933 | 11.45 | 1500 | 1.4031 | 0.9998 |
1.5523 | 11.83 | 1550 | 1.0655 | 0.9029 |
1.3192 | 12.21 | 1600 | 0.9047 | 0.7736 |
1.1209 | 12.59 | 1650 | 0.7879 | 0.6763 |
1.0312 | 12.97 | 1700 | 0.7086 | 0.6616 |
0.9216 | 13.36 | 1750 | 0.6601 | 0.6118 |
0.8778 | 13.74 | 1800 | 0.6042 | 0.5971 |
0.7868 | 14.12 | 1850 | 0.5748 | 0.5675 |
0.7491 | 14.5 | 1900 | 0.5503 | 0.5484 |
0.7181 | 14.88 | 1950 | 0.5365 | 0.5240 |
0.7099 | 15.27 | 2000 | 0.5032 | 0.4984 |
0.6294 | 15.65 | 2050 | 0.4871 | 0.4900 |
0.6283 | 16.03 | 2100 | 0.4667 | 0.4853 |
0.5798 | 16.41 | 2150 | 0.4702 | 0.4690 |
0.5826 | 16.79 | 2200 | 0.4708 | 0.4555 |
0.5622 | 17.17 | 2250 | 0.4682 | 0.4537 |
0.5244 | 17.56 | 2300 | 0.4468 | 0.4353 |
0.5177 | 17.94 | 2350 | 0.4523 | 0.4432 |
0.469 | 18.32 | 2400 | 0.4368 | 0.4152 |
0.4963 | 18.7 | 2450 | 0.4260 | 0.4094 |
0.4644 | 19.08 | 2500 | 0.4250 | 0.3905 |
0.4588 | 19.46 | 2550 | 0.4265 | 0.3952 |
0.4273 | 19.84 | 2600 | 0.4357 | 0.3961 |
0.4519 | 20.23 | 2650 | 0.4196 | 0.3819 |
0.4161 | 20.61 | 2700 | 0.4192 | 0.3880 |
0.4205 | 20.99 | 2750 | 0.4137 | 0.3761 |
0.3993 | 21.37 | 2800 | 0.4216 | 0.3843 |
0.3937 | 21.75 | 2850 | 0.4189 | 0.3798 |
0.3767 | 22.14 | 2900 | 0.4130 | 0.3719 |
0.3879 | 22.52 | 2950 | 0.4004 | 0.3619 |
0.385 | 22.9 | 3000 | 0.4112 | 0.3605 |
0.3859 | 23.28 | 3050 | 0.4042 | 0.3591 |
0.3743 | 23.66 | 3100 | 0.4197 | 0.3703 |
0.3385 | 24.05 | 3150 | 0.3952 | 0.3510 |
0.3405 | 24.43 | 3200 | 0.3935 | 0.3537 |
0.363 | 24.81 | 3250 | 0.3908 | 0.3463 |
0.3257 | 25.19 | 3300 | 0.3972 | 0.3421 |
0.3487 | 25.57 | 3350 | 0.3991 | 0.3433 |
0.3478 | 25.95 | 3400 | 0.4119 | 0.3563 |
0.3435 | 26.33 | 3450 | 0.3948 | 0.3435 |
0.3346 | 26.71 | 3500 | 0.4106 | 0.3449 |
0.3093 | 27.1 | 3550 | 0.4008 | 0.3405 |
0.3304 | 27.48 | 3600 | 0.4025 | 0.3416 |
0.3335 | 27.86 | 3650 | 0.3950 | 0.3386 |
0.3179 | 28.24 | 3700 | 0.3924 | 0.3374 |
0.3141 | 28.62 | 3750 | 0.3928 | 0.3370 |
0.3335 | 29.01 | 3800 | 0.3965 | 0.3379 |
0.3198 | 29.39 | 3850 | 0.3949 | 0.3370 |
0.3201 | 29.77 | 3900 | 0.3954 | 0.3372 |
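
The WER column reports word error rate on the validation set. The exact tooling is not stated in the card; a minimal sketch of how it is typically computed with the `evaluate` library follows, using illustrative placeholder strings.

```python
# Hedged sketch of a typical WER computation (exact tooling used for this card is an assumption).
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["hello world"]  # illustrative model output only
references = ["hello word"]    # illustrative reference transcript only

# WER = (substitutions + insertions + deletions) / number of reference words.
wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```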
### Framework versions
- Transformers 4.26.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3