---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# wav2vec2_100k_gtzan_30s_model

This model is a fine-tuned version of [facebook/wav2vec2-base-100k-voxpopuli](https://huggingface.co/facebook/wav2vec2-base-100k-voxpopuli) on the GTZAN dataset. It achieves the following results on the evaluation set (final logged epoch; see the training results table below):
- Loss: 0.7521
- Accuracy: 0.85
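
As a quick usage sketch (the checkpoint id below is an assumption based on this card's title; substitute the actual Hub id or a local path), the model can be loaded for audio classification with the Transformers pipeline:

```python
# Minimal inference sketch; the model id is an assumption based on this
# card's title, so replace it with the real Hub id or a local checkpoint path.
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="wav2vec2_100k_gtzan_30s_model",  # assumed id / local path
)

# GTZAN clips are 30-second excerpts; any mono waveform works and is
# resampled to 16 kHz by the pipeline's feature extractor.
predictions = classifier("example_clip.wav")
print(predictions)  # e.g. [{"label": "rock", "score": 0.93}, ...]
```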

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
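
In lieu of that information, the sketch below shows one plausible way the GTZAN data could be prepared for this model. The dataset id (`marsyas/gtzan`), the column names, the 90/10 split, and the 30-second padding/truncation are assumptions inferred from the model name, not documented settings of this run.

```python
# Hypothetical GTZAN preparation sketch; dataset id, column names, split size,
# and the 30 s truncation are assumptions, not the documented settings.
from datasets import load_dataset, Audio
from transformers import AutoFeatureExtractor

gtzan = load_dataset("marsyas/gtzan", "all", split="train")      # assumed dataset id/config
gtzan = gtzan.cast_column("audio", Audio(sampling_rate=16_000))   # resample to 16 kHz
gtzan = gtzan.rename_column("genre", "label")                     # assumed label column name
gtzan = gtzan.train_test_split(test_size=0.1, seed=42)            # assumed eval split

feature_extractor = AutoFeatureExtractor.from_pretrained(
    "facebook/wav2vec2-base-100k-voxpopuli"
)

def preprocess(batch):
    audio_arrays = [a["array"] for a in batch["audio"]]
    # Pad/truncate every clip to 30 s at 16 kHz, matching the "30s" in the model name.
    return feature_extractor(
        audio_arrays,
        sampling_rate=16_000,
        max_length=30 * 16_000,
        truncation=True,
        padding="max_length",
    )

encoded = gtzan.map(preprocess, batched=True, remove_columns=["audio", "file"])
```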

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
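
The concrete values did not survive into this card. Purely as orientation, a Trainer setup of roughly the following shape is commonly used for this kind of fine-tune; every value below is a placeholder assumption, not the configuration actually used for this model.

```python
# Placeholder Trainer sketch; none of these values are confirmed
# hyperparameters of this model, they only illustrate the usual setup.
from transformers import (
    AutoModelForAudioClassification,
    Trainer,
    TrainingArguments,
)

model = AutoModelForAudioClassification.from_pretrained(
    "facebook/wav2vec2-base-100k-voxpopuli",
    num_labels=10,  # GTZAN has 10 genres
)

training_args = TrainingArguments(
    output_dir="wav2vec2_100k_gtzan_30s_model",
    evaluation_strategy="epoch",    # evaluate roughly once per epoch, as in the table below
    learning_rate=3e-5,             # placeholder
    per_device_train_batch_size=4,  # placeholder
    num_train_epochs=100,           # placeholder
    logging_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=encoded["train"],  # from the data-preparation sketch above
    eval_dataset=encoded["test"],
)
trainer.train()
```

A `compute_metrics` callback would additionally be needed to reproduce the accuracy column reported in the results table.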

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.3029 | 0.96 | 18 | 2.2991 | 0.13 |
| 2.3007 | 1.97 | 37 | 2.2918 | 0.19 |
| 2.291 | 2.99 | 56 | 2.2724 | 0.205 |
| 2.2699 | 4.0 | 75 | 2.2258 | 0.5 |
| 2.2279 | 4.96 | 93 | 2.1607 | 0.495 |
| 2.1577 | 5.97 | 112 | 2.0832 | 0.56 |
| 2.1112 | 6.99 | 131 | 2.0094 | 0.63 |
| 2.0496 | 8.0 | 150 | 1.9826 | 0.585 |
| 2.0083 | 8.96 | 168 | 1.9011 | 0.64 |
| 1.9334 | 9.97 | 187 | 1.8251 | 0.665 |
| 1.8594 | 10.99 | 206 | 1.7503 | 0.725 |
| 1.7628 | 12.0 | 225 | 1.6975 | 0.705 |
| 1.7181 | 12.96 | 243 | 1.6342 | 0.725 |
| 1.6974 | 13.97 | 262 | 1.6195 | 0.71 |
| 1.5742 | 14.99 | 281 | 1.5393 | 0.705 |
| 1.5287 | 16.0 | 300 | 1.4877 | 0.755 |
| 1.4624 | 16.96 | 318 | 1.4462 | 0.775 |
| 1.4166 | 17.97 | 337 | 1.4250 | 0.765 |
| 1.3848 | 18.99 | 356 | 1.3846 | 0.765 |
| 1.3146 | 20.0 | 375 | 1.3571 | 0.7 |
| 1.2487 | 20.96 | 393 | 1.3178 | 0.715 |
| 1.2577 | 21.97 | 412 | 1.2669 | 0.785 |
| 1.1904 | 22.99 | 431 | 1.3072 | 0.675 |
| 1.1622 | 24.0 | 450 | 1.1917 | 0.8 |
| 1.0907 | 24.96 | 468 | 1.2082 | 0.785 |
| 1.0616 | 25.97 | 487 | 1.1552 | 0.77 |
| 1.0685 | 26.99 | 506 | 1.1241 | 0.77 |
| 1.0347 | 28.0 | 525 | 1.0956 | 0.78 |
| 0.9509 | 28.96 | 543 | 1.1258 | 0.675 |
| 0.9214 | 29.97 | 562 | 1.0752 | 0.77 |
| 0.8702 | 30.99 | 581 | 0.9911 | 0.795 |
| 0.8051 | 32.0 | 600 | 0.9489 | 0.835 |
| 0.7605 | 32.96 | 618 | 0.9337 | 0.845 |
| 0.7375 | 33.97 | 637 | 0.9252 | 0.84 |
| 0.7216 | 34.99 | 656 | 0.9157 | 0.81 |
| 0.6805 | 36.0 | 675 | 0.9085 | 0.825 |
| 0.6951 | 36.96 | 693 | 0.9061 | 0.805 |
| 0.6449 | 37.97 | 712 | 0.8635 | 0.82 |
| 0.5744 | 38.99 | 731 | 0.9587 | 0.785 |
| 0.5572 | 40.0 | 750 | 0.8449 | 0.81 |
| 0.5612 | 40.96 | 768 | 0.8369 | 0.815 |
| 0.5587 | 41.97 | 787 | 0.8803 | 0.805 |
| 0.4815 | 42.99 | 806 | 0.8362 | 0.82 |
| 0.4959 | 44.0 | 825 | 0.8096 | 0.82 |
| 0.4814 | 44.96 | 843 | 0.8324 | 0.795 |
| 0.4919 | 45.97 | 862 | 0.8260 | 0.81 |
| 0.4346 | 46.99 | 881 | 0.7959 | 0.83 |
| 0.4054 | 48.0 | 900 | 0.8164 | 0.815 |
| 0.412 | 48.96 | 918 | 0.8323 | 0.805 |
| 0.3606 | 49.97 | 937 | 0.8643 | 0.79 |
| 0.397 | 50.99 | 956 | 0.7615 | 0.815 |
| 0.3617 | 52.0 | 975 | 0.6882 | 0.845 |
| 0.3149 | 52.96 | 993 | 0.6932 | 0.855 |
| 0.3533 | 53.97 | 1012 | 0.7074 | 0.85 |
| 0.3571 | 54.99 | 1031 | 0.7530 | 0.82 |
| 0.2958 | 56.0 | 1050 | 0.7798 | 0.835 |
| 0.3252 | 56.96 | 1068 | 0.7529 | 0.84 |
| 0.2765 | 57.97 | 1087 | 0.6861 | 0.87 |
| 0.2507 | 58.99 | 1106 | 0.7312 | 0.84 |
| 0.2244 | 60.0 | 1125 | 0.7683 | 0.82 |
| 0.235 | 60.96 | 1143 | 0.7951 | 0.82 |
| 0.253 | 61.97 | 1162 | 0.7510 | 0.835 |
| 0.2315 | 62.99 | 1181 | 0.6601 | 0.865 |
| 0.1917 | 64.0 | 1200 | 0.7012 | 0.84 |
| 0.2324 | 64.96 | 1218 | 0.8034 | 0.835 |
| 0.1872 | 65.97 | 1237 | 0.7210 | 0.845 |
| 0.1637 | 66.99 | 1256 | 0.7287 | 0.835 |
| 0.204 | 68.0 | 1275 | 0.8114 | 0.82 |
| 0.1542 | 68.96 | 1293 | 0.7838 | 0.825 |
| 0.1917 | 69.97 | 1312 | 0.6852 | 0.86 |
| 0.1574 | 70.99 | 1331 | 0.7114 | 0.85 |
| 0.1504 | 72.0 | 1350 | 0.7699 | 0.84 |
| 0.1462 | 72.96 | 1368 | 0.7432 | 0.835 |
| 0.1429 | 73.97 | 1387 | 0.7172 | 0.855 |
| 0.1063 | 74.99 | 1406 | 0.7108 | 0.855 |
| 0.17 | 76.0 | 1425 | 0.6909 | 0.855 |
| 0.1329 | 76.96 | 1443 | 0.7127 | 0.85 |
| 0.1316 | 77.97 | 1462 | 0.7241 | 0.845 |
| 0.1106 | 78.99 | 1481 | 0.7457 | 0.84 |
| 0.1317 | 80.0 | 1500 | 0.6769 | 0.85 |
| 0.1245 | 80.96 | 1518 | 0.7100 | 0.84 |
| 0.1123 | 81.97 | 1537 | 0.7295 | 0.85 |
| 0.1343 | 82.99 | 1556 | 0.7960 | 0.83 |
| 0.1038 | 84.0 | 1575 | 0.7629 | 0.835 |
| 0.1732 | 84.96 | 1593 | 0.7197 | 0.845 |
| 0.0968 | 85.97 | 1612 | 0.7066 | 0.855 |
| 0.116 | 86.99 | 1631 | 0.7322 | 0.845 |
| 0.1127 | 88.0 | 1650 | 0.7619 | 0.85 |
| 0.1043 | 88.96 | 1668 | 0.7250 | 0.85 |
| 0.0946 | 89.97 | 1687 | 0.7809 | 0.83 |
| 0.1108 | 90.99 | 1706 | 0.7694 | 0.835 |
| 0.1031 | 92.0 | 1725 | 0.7746 | 0.83 |
| 0.0821 | 92.96 | 1743 | 0.8138 | 0.825 |
| 0.0986 | 93.97 | 1762 | 0.8004 | 0.84 |
| 0.1078 | 94.99 | 1781 | 0.7557 | 0.85 |
| 0.0944 | 96.0 | 1800 | 0.7521 | 0.85 |

### Framework versions