---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# wav2vec2-xls-r-300m-arabic_speech_commands_10s

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset (recorded as "None" by the Trainer; the model name suggests an Arabic speech-commands dataset with 10-second clips). Per-epoch evaluation results are listed in the Training results table below.

## Model description

More information needed

## Intended uses & limitations

More information needed
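
As a rough guide, the checkpoint can be loaded for audio classification with the `transformers` pipeline. This is a generic sketch, not an official usage example: the Hub repo id and the audio file path are placeholders, and the label set and expected clip length (10 s, per the model name) are assumptions.

```python
from transformers import pipeline

# Hypothetical repo id: replace with the actual Hub location of this checkpoint.
MODEL_ID = "wav2vec2-xls-r-300m-arabic_speech_commands_10s"

classifier = pipeline("audio-classification", model=MODEL_ID)

# Classify a short spoken-command clip (path is a placeholder; requires ffmpeg
# for decoding audio files from disk).
predictions = classifier("command_example.wav")
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```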

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The specific hyperparameter values used during training are not listed in this card (more information needed).
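
For orientation only, the sketch below shows a representative `TrainingArguments` configuration. The epoch count and per-epoch evaluation are consistent with the results table (80 epochs, 25 optimization steps per epoch); every other value is a hypothetical placeholder, not the actual setting used for this model.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-arabic_speech_commands_10s",
    num_train_epochs=80,            # consistent with the results table below
    evaluation_strategy="epoch",    # one evaluation per epoch, as in the table
    save_strategy="epoch",
    learning_rate=3e-5,             # hypothetical
    per_device_train_batch_size=8,  # hypothetical
    load_best_model_at_end=True,
    metric_for_best_model="accuracy",
)
```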

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 3.6904 | 1.0 | 25 | 3.6889 | 0.0292 |
| 3.6892 | 2.0 | 50 | 3.6876 | 0.025 |
| 3.6902 | 3.0 | 75 | 3.6707 | 0.0792 |
| 3.6133 | 4.0 | 100 | 3.5175 | 0.0917 |
| 3.4989 | 5.0 | 125 | 3.3043 | 0.0917 |
| 3.1971 | 6.0 | 150 | 3.1385 | 0.125 |
| 3.1348 | 7.0 | 175 | 2.8781 | 0.2 |
| 2.8271 | 8.0 | 200 | 2.6197 | 0.2417 |
| 2.7187 | 9.0 | 225 | 2.4002 | 0.2417 |
| 2.3602 | 10.0 | 250 | 2.1351 | 0.4167 |
| 2.1853 | 11.0 | 275 | 1.9951 | 0.4042 |
| 2.0971 | 12.0 | 300 | 2.0668 | 0.3667 |
| 1.8549 | 13.0 | 325 | 1.7583 | 0.4792 |
| 1.6929 | 14.0 | 350 | 1.5585 | 0.5542 |
| 1.4449 | 15.0 | 375 | 1.5602 | 0.5292 |
| 1.5174 | 16.0 | 400 | 1.4584 | 0.6 |
| 1.4283 | 17.0 | 425 | 1.3407 | 0.65 |
| 1.2341 | 18.0 | 450 | 1.1690 | 0.6708 |
| 1.2353 | 19.0 | 475 | 1.0808 | 0.7542 |
| 1.0083 | 20.0 | 500 | 0.8947 | 0.7917 |
| 0.8907 | 21.0 | 525 | 0.9383 | 0.7958 |
| 0.9161 | 22.0 | 550 | 0.7948 | 0.7917 |
| 0.781 | 23.0 | 575 | 0.7135 | 0.8375 |
| 0.7709 | 24.0 | 600 | 0.8259 | 0.8208 |
| 0.5534 | 25.0 | 625 | 0.6845 | 0.7958 |
| 0.6089 | 26.0 | 650 | 0.8161 | 0.7875 |
| 0.5412 | 27.0 | 675 | 0.5962 | 0.8292 |
| 0.4843 | 28.0 | 700 | 0.5888 | 0.8792 |
| 0.5755 | 29.0 | 725 | 0.5389 | 0.8417 |
| 0.4687 | 30.0 | 750 | 0.5176 | 0.8917 |
| 0.4191 | 31.0 | 775 | 0.4904 | 0.8542 |
| 0.4361 | 32.0 | 800 | 0.5360 | 0.8875 |
| 0.264 | 33.0 | 825 | 0.4501 | 0.8958 |
| 0.3062 | 34.0 | 850 | 0.5384 | 0.8917 |
| 0.2992 | 35.0 | 875 | 0.4840 | 0.9167 |
| 0.3904 | 36.0 | 900 | 0.4934 | 0.8792 |
| 0.2689 | 37.0 | 925 | 0.3123 | 0.925 |
| 0.1963 | 38.0 | 950 | 0.4691 | 0.8875 |
| 0.2402 | 39.0 | 975 | 0.4508 | 0.8792 |
| 0.1912 | 40.0 | 1000 | 0.3873 | 0.8917 |
| 0.1512 | 41.0 | 1025 | 0.3153 | 0.9208 |
| 0.1673 | 42.0 | 1050 | 0.2599 | 0.9417 |
| 0.1981 | 43.0 | 1075 | 0.5351 | 0.8917 |
| 0.1977 | 44.0 | 1100 | 0.4595 | 0.9042 |
| 0.2621 | 45.0 | 1125 | 0.2737 | 0.9375 |
| 0.1239 | 46.0 | 1150 | 0.1870 | 0.95 |
| 0.2602 | 47.0 | 1175 | 0.2738 | 0.9167 |
| 0.0681 | 48.0 | 1200 | 0.2707 | 0.9375 |
| 0.1257 | 49.0 | 1225 | 0.2281 | 0.95 |
| 0.1242 | 50.0 | 1250 | 0.2846 | 0.925 |
| 0.1169 | 51.0 | 1275 | 0.2766 | 0.9167 |
| 0.1701 | 52.0 | 1300 | 0.1858 | 0.9417 |
| 0.185 | 53.0 | 1325 | 0.3373 | 0.9292 |
| 0.0383 | 54.0 | 1350 | 0.3524 | 0.9208 |
| 0.0808 | 55.0 | 1375 | 0.3378 | 0.925 |
| 0.1444 | 56.0 | 1400 | 0.2609 | 0.9292 |
| 0.0798 | 57.0 | 1425 | 0.2635 | 0.9417 |
| 0.0324 | 58.0 | 1450 | 0.2550 | 0.9375 |
| 0.0669 | 59.0 | 1475 | 0.2466 | 0.9458 |
| 0.1389 | 60.0 | 1500 | 0.1992 | 0.95 |
| 0.0432 | 61.0 | 1525 | 0.2165 | 0.95 |
| 0.2076 | 62.0 | 1550 | 0.2718 | 0.9417 |
| 0.015 | 63.0 | 1575 | 0.2631 | 0.9458 |
| 0.0565 | 64.0 | 1600 | 0.2481 | 0.9417 |
| 0.0261 | 65.0 | 1625 | 0.2125 | 0.95 |
| 0.0136 | 66.0 | 1650 | 0.2464 | 0.95 |
| 0.0129 | 67.0 | 1675 | 0.2028 | 0.9542 |
| 0.1424 | 68.0 | 1700 | 0.1805 | 0.9542 |
| 0.0894 | 69.0 | 1725 | 0.2104 | 0.9458 |
| 0.04 | 70.0 | 1750 | 0.1842 | 0.9542 |
| 0.0424 | 71.0 | 1775 | 0.2014 | 0.9583 |
| 0.0364 | 72.0 | 1800 | 0.2029 | 0.9625 |
| 0.0112 | 73.0 | 1825 | 0.1695 | 0.9583 |
| 0.034 | 74.0 | 1850 | 0.1917 | 0.95 |
| 0.0253 | 75.0 | 1875 | 0.2004 | 0.9542 |
| 0.0089 | 76.0 | 1900 | 0.1981 | 0.9542 |
| 0.014 | 77.0 | 1925 | 0.1930 | 0.9542 |
| 0.0043 | 78.0 | 1950 | 0.1968 | 0.9417 |
| 0.0237 | 79.0 | 1975 | 0.1906 | 0.95 |
| 0.0031 | 80.0 | 2000 | 0.1904 | 0.95 |
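
The Accuracy column is top-1 classification accuracy on the evaluation split, as reported by a `compute_metrics` callback passed to the Trainer. The exact callback used for this run is not preserved in the card; a typical sketch looks like the following.

```python
import numpy as np

def compute_metrics(eval_pred):
    """Top-1 accuracy, as typically passed via Trainer(compute_metrics=...)."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {"accuracy": float((predictions == labels).mean())}
```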

### Framework versions