# wav2vec2-large-xls-r-300m-Arabic-phoneme-undicretics
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified Arabic phoneme dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2356
- Per (phoneme error rate): 0.2966
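
The following is a minimal, hedged inference sketch using the `transformers` library. The repository id and audio file path are placeholders, and the exact processor/tokenizer classes may differ for a phoneme-level vocabulary; consult the checkpoint's config before relying on this.

```python
# Minimal inference sketch (assumed repo id and audio path; the checkpoint is
# assumed to expose a CTC head plus a compatible processor configuration).
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-large-xls-r-300m-Arabic-phoneme-undicretics"  # assumed repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load a 16 kHz mono waveform (path is a placeholder).
speech, sr = librosa.load("example.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))  # predicted phoneme sequence
```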
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 250
- num_epochs: 30
- mixed_precision_training: Native AMP
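
As a sketch only, the hyperparameters above could be expressed with `TrainingArguments` roughly as follows; the output directory is an assumption, and dataset, model, and data-collator setup are omitted.

```python
# Sketch of a Trainer configuration mirroring the listed hyperparameters
# (output_dir is assumed; model, dataset, and collator setup are not shown).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-Arabic-phoneme-undicretics",
    learning_rate=5e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,   # effective train batch size: 8 * 4 = 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=250,
    num_train_epochs=30,
    fp16=True,                       # native AMP mixed-precision training
)
```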
### Training results
| Training Loss | Epoch | Step | Validation Loss | Per    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 2.4047        | 1.0   | 102  | 2.1371          | 1.0    |
| 2.0311        | 2.0   | 204  | 2.0476          | 1.0    |
| 2.0117        | 2.99  | 306  | 2.0150          | 0.9943 |
| 1.9689        | 4.0   | 409  | 1.9943          | 0.9943 |
| 2.0006        | 5.0   | 511  | 2.0440          | 0.9943 |
| 1.9945        | 6.0   | 613  | 2.0736          | 0.9943 |
| 1.9959        | 6.99  | 715  | 2.0254          | 0.9942 |
| 1.9911        | 8.0   | 818  | 2.0101          | 0.9827 |
| 2.0827        | 9.0   | 920  | 2.0403          | 0.9943 |
| 1.986         | 10.0  | 1022 | 1.9916          | 0.9980 |
| 1.9595        | 10.99 | 1124 | 1.9558          | 0.9828 |
| 1.9381        | 12.0  | 1227 | 1.8902          | 0.9966 |
| 1.7657        | 13.0  | 1329 | 1.6801          | 0.9946 |
| 1.6872        | 14.0  | 1431 | 1.6852          | 0.9943 |
| 1.6566        | 14.99 | 1533 | 1.6286          | 0.9942 |
| 1.6135        | 16.0  | 1636 | 1.6017          | 0.9927 |
| 1.5798        | 17.0  | 1738 | 1.5355          | 0.9852 |
| 1.5259        | 18.0  | 1840 | 1.4729          | 0.9856 |
| 1.4441        | 18.99 | 1942 | 1.3440          | 0.9807 |
| 1.3541        | 20.0  | 2045 | 1.2208          | 0.9755 |
| 1.2349        | 21.0  | 2147 | 1.0838          | 0.9706 |
| 1.1087        | 22.0  | 2249 | 0.8575          | 0.9623 |
| 0.8635        | 22.99 | 2351 | 0.7175          | 0.9015 |
| 0.7591        | 24.0  | 2454 | 0.6055          | 0.8330 |
| 0.6543        | 25.0  | 2556 | 0.4737          | 0.7052 |
| 0.5586        | 26.0  | 2658 | 0.3902          | 0.5189 |
| 0.4762        | 26.99 | 2760 | 0.3190          | 0.3832 |
| 0.4417        | 28.0  | 2863 | 0.2760          | 0.3200 |
| 0.3994        | 29.0  | 2965 | 0.2448          | 0.2687 |
| 0.3836        | 29.93 | 3060 | 0.2356          | 0.2966 |
### Framework versions
- Transformers 4.29.2
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3