# wav2vec2-xls-r-300m-arabic_speech_commands_10s_one_speaker_5_classes_unknown
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset (the repository name suggests an Arabic speech-commands corpus of 10-second clips from one speaker, with five classes plus an unknown class). It achieves the following results on the evaluation set:
- Loss: 0.7951
- Accuracy: 0.8278
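The base model consumes 16 kHz mono waveforms, and the "10s" in the model name suggests fixed-length 10-second clips. A minimal preprocessing sketch under those assumptions (the function name and zero-padding strategy are illustrative, not taken from the actual training code):

```python
import numpy as np

SAMPLE_RATE = 16_000   # wav2vec2-xls-r-300m expects 16 kHz mono audio
CLIP_SECONDS = 10      # clip length suggested by the "10s" in the model name
TARGET_LEN = SAMPLE_RATE * CLIP_SECONDS  # 160,000 samples

def pad_or_truncate(waveform: np.ndarray) -> np.ndarray:
    """Zero-pad or truncate a 1-D waveform to exactly 10 s at 16 kHz."""
    if len(waveform) >= TARGET_LEN:
        return waveform[:TARGET_LEN]
    return np.pad(waveform, (0, TARGET_LEN - len(waveform)))

# Example: a 3-second clip is padded up to 160,000 samples.
short = np.random.randn(SAMPLE_RATE * 3).astype(np.float32)
fixed = pad_or_truncate(short)
```

The fixed array can then be passed to the model's feature extractor with `sampling_rate=16_000`.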
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 80
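The `total_train_batch_size` above is not an independent setting: it is the per-device batch size multiplied by the gradient-accumulation steps, which determines how many examples contribute to each optimizer update. A one-line check using the values listed above:

```python
# Effective batch size per optimizer step = per-device batch size
# multiplied by the number of gradient-accumulation steps
# (values taken from the hyperparameter list above).
train_batch_size = 4
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
```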
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
No log | 0.8 | 3 | 1.7933 | 0.2306 |
No log | 1.8 | 6 | 1.7927 | 0.1667 |
No log | 2.8 | 9 | 1.7908 | 0.1667 |
2.1874 | 3.8 | 12 | 1.7836 | 0.225 |
2.1874 | 4.8 | 15 | 1.7728 | 0.3083 |
2.1874 | 5.8 | 18 | 1.6857 | 0.3889 |
2.0742 | 6.8 | 21 | 1.4554 | 0.7333 |
2.0742 | 7.8 | 24 | 1.2621 | 0.6861 |
2.0742 | 8.8 | 27 | 1.0360 | 0.7528 |
1.5429 | 9.8 | 30 | 1.0220 | 0.6472 |
1.5429 | 10.8 | 33 | 0.7951 | 0.8278 |
1.5429 | 11.8 | 36 | 0.7954 | 0.8111 |
1.5429 | 12.8 | 39 | 0.6698 | 0.8167 |
0.927 | 13.8 | 42 | 0.8400 | 0.6694 |
0.927 | 14.8 | 45 | 0.7026 | 0.7194 |
0.927 | 15.8 | 48 | 0.7232 | 0.6944 |
0.429 | 16.8 | 51 | 0.6640 | 0.7333 |
0.429 | 17.8 | 54 | 1.1750 | 0.6 |
0.429 | 18.8 | 57 | 0.9270 | 0.6722 |
0.2583 | 19.8 | 60 | 1.4541 | 0.5417 |
0.2583 | 20.8 | 63 | 1.8917 | 0.4472 |
0.2583 | 21.8 | 66 | 1.3213 | 0.6472 |
0.2583 | 22.8 | 69 | 1.3114 | 0.7 |
0.1754 | 23.8 | 72 | 0.8079 | 0.7389 |
0.1754 | 24.8 | 75 | 1.6070 | 0.4861 |
0.1754 | 25.8 | 78 | 1.6949 | 0.5083 |
0.1348 | 26.8 | 81 | 1.4364 | 0.6472 |
0.1348 | 27.8 | 84 | 0.9045 | 0.7889 |
0.1348 | 28.8 | 87 | 1.1878 | 0.7111 |
0.0634 | 29.8 | 90 | 0.9678 | 0.7667 |
0.0634 | 30.8 | 93 | 0.9572 | 0.7889 |
0.0634 | 31.8 | 96 | 0.8931 | 0.8139 |
0.0634 | 32.8 | 99 | 1.4805 | 0.6583 |
0.1267 | 33.8 | 102 | 2.6092 | 0.4778 |
0.1267 | 34.8 | 105 | 2.2933 | 0.5306 |
0.1267 | 35.8 | 108 | 1.9648 | 0.6083 |
0.0261 | 36.8 | 111 | 1.8385 | 0.65 |
0.0261 | 37.8 | 114 | 2.0328 | 0.6028 |
0.0261 | 38.8 | 117 | 2.3722 | 0.55 |
0.041 | 39.8 | 120 | 2.7606 | 0.4917 |
0.041 | 40.8 | 123 | 2.5793 | 0.5056 |
0.041 | 41.8 | 126 | 2.0967 | 0.5917 |
0.041 | 42.8 | 129 | 1.7498 | 0.6611 |
0.1004 | 43.8 | 132 | 1.6564 | 0.6722 |
0.1004 | 44.8 | 135 | 1.7533 | 0.6583 |
0.1004 | 45.8 | 138 | 2.3335 | 0.5806 |
0.0688 | 46.8 | 141 | 2.9578 | 0.4778 |
0.0688 | 47.8 | 144 | 3.2396 | 0.4472 |
0.0688 | 48.8 | 147 | 3.2100 | 0.4528 |
0.0082 | 49.8 | 150 | 3.2018 | 0.4472 |
0.0082 | 50.8 | 153 | 3.1985 | 0.45 |
0.0082 | 51.8 | 156 | 2.6950 | 0.525 |
0.0082 | 52.8 | 159 | 2.2335 | 0.6056 |
0.0159 | 53.8 | 162 | 2.0467 | 0.6306 |
0.0159 | 54.8 | 165 | 1.8858 | 0.6583 |
0.0159 | 55.8 | 168 | 1.8239 | 0.6694 |
0.0083 | 56.8 | 171 | 1.7927 | 0.675 |
0.0083 | 57.8 | 174 | 1.7636 | 0.6861 |
0.0083 | 58.8 | 177 | 1.7792 | 0.675 |
0.0645 | 59.8 | 180 | 1.9165 | 0.6611 |
0.0645 | 60.8 | 183 | 2.0780 | 0.6361 |
0.0645 | 61.8 | 186 | 2.2058 | 0.6028 |
0.0645 | 62.8 | 189 | 2.3011 | 0.5944 |
0.01 | 63.8 | 192 | 2.4047 | 0.5722 |
0.01 | 64.8 | 195 | 2.4870 | 0.5639 |
0.01 | 65.8 | 198 | 2.5513 | 0.5417 |
0.008 | 66.8 | 201 | 2.5512 | 0.5333 |
0.008 | 67.8 | 204 | 2.3419 | 0.5778 |
0.008 | 68.8 | 207 | 2.2424 | 0.5944 |
0.0404 | 69.8 | 210 | 2.2009 | 0.6167 |
0.0404 | 70.8 | 213 | 2.1788 | 0.6278 |
0.0404 | 71.8 | 216 | 2.1633 | 0.6306 |
0.0404 | 72.8 | 219 | 2.1525 | 0.6306 |
0.0106 | 73.8 | 222 | 2.1435 | 0.6389 |
0.0106 | 74.8 | 225 | 2.1391 | 0.6389 |
0.0106 | 75.8 | 228 | 2.1327 | 0.6472 |
0.0076 | 76.8 | 231 | 2.1287 | 0.6444 |
0.0076 | 77.8 | 234 | 2.1267 | 0.65 |
0.0076 | 78.8 | 237 | 2.1195 | 0.6472 |
0.0097 | 79.8 | 240 | 2.1182 | 0.6417 |
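The reported evaluation results (loss 0.7951, accuracy 0.8278) match the epoch-10.8 row rather than the final epoch, which suggests the best checkpoint was kept by validation loss (e.g. via the Trainer's `load_best_model_at_end`; this is an inference from the numbers, not confirmed by the training code). A sketch of that selection over a few rows copied from the table:

```python
# (epoch, validation_loss, accuracy) triples copied from the table above.
rows = [
    (9.8, 1.0220, 0.6472),
    (10.8, 0.7951, 0.8278),  # lowest validation loss in the full table
    (11.8, 0.7954, 0.8111),
    (79.8, 2.1182, 0.6417),  # final epoch: loss has risen sharply
]

# Keep the checkpoint with the lowest validation loss.
best = min(rows, key=lambda r: r[1])
```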
### Framework versions
- Transformers 4.21.1
- Pytorch 1.12.1
- Datasets 2.4.0
- Tokenizers 0.12.1