# wav2vec2-large-xls-r-300m-Arabic-phoneme-undicretics-native-non-native
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2776
- Per: 0.3216
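Per here is the phoneme error rate: the edit (Levenshtein) distance between the predicted and reference phoneme sequences, normalized by the total number of reference phonemes. A minimal sketch of that metric (how Per was scored for this run is an assumption; the function names are illustrative):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences (1-D DP rolling row)."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,                              # deletion
                        dp[j - 1] + 1,                          # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))      # substitution
            prev = cur
    return dp[n]

def per(references, hypotheses):
    """Corpus-level phoneme error rate: total edits / total reference phonemes."""
    edits = sum(edit_distance(r, h) for r, h in zip(references, hypotheses))
    total = sum(len(r) for r in references)
    return edits / total
```

For example, `per([["b", "a", "b"]], [["b", "a"]])` is 1/3: one deletion against three reference phonemes.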
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 250
- num_epochs: 30
- mixed_precision_training: Native AMP
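With `train_batch_size: 8` and `gradient_accumulation_steps: 4`, the effective batch size is 8 × 4 = 32, matching `total_train_batch_size` above. The linear scheduler ramps the learning rate from 0 to 5e-4 over the first 250 optimizer steps, then decays it linearly to 0 over the remaining steps (1890 in total, per the training log). A minimal sketch of that schedule, mirroring the shape of transformers' linear warmup schedule with the values listed above:

```python
def linear_warmup_lr(step, base_lr=5e-4, warmup_steps=250, total_steps=1890):
    """Linear warmup from 0 to base_lr, then linear decay back to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0, total_steps - step) / (total_steps - warmup_steps)

# Effective batch size seen by the optimizer per update:
effective_batch = 8 * 4  # train_batch_size * gradient_accumulation_steps -> 32
```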
### Training results

| Training Loss | Epoch | Step | Validation Loss | Per    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 6.4994        | 0.99  | 63   | 2.7517          | 1.0    |
| 2.3232        | 1.99  | 127  | 2.1722          | 1.0    |
| 2.0242        | 3.0   | 191  | 2.1322          | 1.0    |
| 2.0153        | 4.0   | 255  | 2.1381          | 0.9942 |
| 2.2035        | 4.99  | 318  | 2.1159          | 0.9964 |
| 1.9905        | 5.99  | 382  | 2.1363          | 0.9942 |
| 1.9629        | 7.0   | 446  | 2.0447          | 0.9828 |
| 1.9475        | 8.0   | 510  | 2.1098          | 0.9942 |
| 1.8434        | 8.99  | 573  | 1.7443          | 0.9942 |
| 1.7291        | 9.99  | 637  | 1.7329          | 0.9909 |
| 1.7045        | 11.0  | 701  | 1.7493          | 0.9942 |
| 1.6932        | 12.0  | 765  | 1.7182          | 0.9942 |
| 1.6725        | 12.99 | 828  | 1.6675          | 0.9942 |
| 1.6567        | 13.99 | 892  | 1.6564          | 0.9939 |
| 1.6316        | 15.0  | 956  | 1.6156          | 0.9913 |
| 1.6037        | 16.0  | 1020 | 1.5993          | 0.9924 |
| 1.5907        | 16.99 | 1083 | 1.5591          | 0.9952 |
| 1.5663        | 17.99 | 1147 | 1.5204          | 0.9909 |
| 1.502         | 19.0  | 1211 | 1.4375          | 0.9895 |
| 1.4358        | 20.0  | 1275 | 1.3073          | 0.9883 |
| 1.3647        | 20.99 | 1338 | 1.2088          | 0.9768 |
| 1.1395        | 21.99 | 1402 | 1.0489          | 0.9623 |
| 1.0518        | 23.0  | 1466 | 0.8761          | 0.9380 |
| 0.9232        | 24.0  | 1530 | 0.7407          | 0.8915 |
| 0.8209        | 24.99 | 1593 | 0.6368          | 0.8007 |
| 0.6394        | 25.99 | 1657 | 0.5692          | 0.6804 |
| 0.5786        | 27.0  | 1721 | 0.4750          | 0.5382 |
| 0.5173        | 28.0  | 1785 | 0.4252          | 0.4132 |
| 0.4707        | 28.99 | 1848 | 0.3854          | 0.3962 |
| 0.4516        | 29.65 | 1890 | 0.3751          | 0.3847 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3