# mtg_1000_subset_wav2vec2_100k_gtzan_model

This model is a fine-tuned version of [facebook/wav2vec2-base-100k-voxpopuli](https://huggingface.co/facebook/wav2vec2-base-100k-voxpopuli) on the GTZAN dataset. It achieves the following results on the evaluation set:
- Loss: 0.9871
- Accuracy: 0.815
## Model description
More information needed
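The card does not document the classification head, but fine-tuning a wav2vec2 backbone for genre classification typically mean-pools the encoder's frame-level hidden states and projects them to logits over the 10 GTZAN genres. A minimal sketch of such a head (the 768-dimensional hidden size matches wav2vec2-base; mean pooling is an assumption, not something stated in this card):

```python
import torch
import torch.nn as nn

GTZAN_GENRES = ["blues", "classical", "country", "disco", "hiphop",
                "jazz", "metal", "pop", "reggae", "rock"]

class GenreClassificationHead(nn.Module):
    """Sketch of a sequence-classification head on top of a wav2vec2
    encoder: mean-pool frame features, then project to 10 genre logits.
    (Mean pooling is an assumption; the card does not specify the head.)"""

    def __init__(self, hidden_size: int = 768, num_labels: int = len(GTZAN_GENRES)):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, frames, hidden_size) from the encoder
        pooled = hidden_states.mean(dim=1)  # (batch, hidden_size)
        return self.classifier(pooled)      # (batch, num_labels)

head = GenreClassificationHead()
logits = head(torch.randn(2, 499, 768))  # wav2vec2 emits ~50 frames/s
print(logits.shape)  # torch.Size([2, 10])
```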
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
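One preprocessing detail worth noting: GTZAN ships as 30-second clips sampled at 22,050 Hz, while the wav2vec2-base-100k-voxpopuli backbone expects 16 kHz mono input, so clips must be resampled before feature extraction. A small sketch of that step (using `scipy`; the exact resampler used for this run is not documented in the card):

```python
import numpy as np
from scipy.signal import resample_poly

GTZAN_SR = 22050   # GTZAN clips are 22.05 kHz mono
MODEL_SR = 16000   # wav2vec2 expects 16 kHz input

def resample_for_wav2vec2(waveform: np.ndarray) -> np.ndarray:
    """Resample a 22.05 kHz GTZAN clip to 16 kHz.
    16000 / 22050 reduces to 320 / 441, so polyphase resampling is exact."""
    return resample_poly(waveform, up=320, down=441)

one_second = np.random.randn(GTZAN_SR).astype(np.float32)
print(resample_for_wav2vec2(one_second).shape)  # (16000,)
```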
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
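The derived values above follow directly from the listed settings: a per-device batch of 1 with 4 gradient-accumulation steps gives the stated total train batch size of 4, and a warmup ratio of 0.1 over the 19,900 optimizer steps reached in the results table corresponds to roughly 1,990 warmup steps. A quick check:

```python
# Effective batch size from the hyperparameters above
train_batch_size = 1
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 4

# Warmup length implied by the warmup ratio and the final step count
total_steps = 19900  # last step reported in the training-results table
warmup_ratio = 0.1
warmup_steps = int(total_steps * warmup_ratio)
print(warmup_steps)  # 1990
```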
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
2.2999 | 1.0 | 199 | 2.2945 | 0.135 |
2.2959 | 2.0 | 399 | 2.2945 | 0.135 |
2.2969 | 3.0 | 599 | 2.2945 | 0.135 |
2.2985 | 4.0 | 799 | 2.2938 | 0.12 |
2.1449 | 5.0 | 998 | 2.0673 | 0.32 |
2.1164 | 6.0 | 1198 | 2.0673 | 0.32 |
2.1542 | 7.0 | 1398 | 2.0673 | 0.32 |
2.0971 | 8.0 | 1598 | 2.0631 | 0.315 |
2.0703 | 9.0 | 1797 | 1.9101 | 0.375 |
1.939 | 10.0 | 1997 | 1.9101 | 0.375 |
1.9502 | 11.0 | 2197 | 1.9101 | 0.375 |
1.9831 | 12.0 | 2397 | 1.9083 | 0.365 |
1.7969 | 13.0 | 2596 | 1.7547 | 0.42 |
1.817 | 14.0 | 2796 | 1.7547 | 0.42 |
1.9083 | 15.0 | 2996 | 1.7547 | 0.42 |
1.7962 | 16.0 | 3196 | 1.7654 | 0.395 |
1.5644 | 17.0 | 3395 | 1.5647 | 0.47 |
1.5798 | 18.0 | 3595 | 1.5647 | 0.47 |
1.583 | 19.0 | 3795 | 1.5647 | 0.47 |
1.6343 | 20.0 | 3995 | 1.5708 | 0.455 |
1.4268 | 21.0 | 4194 | 1.6466 | 0.405 |
1.8691 | 22.0 | 4394 | 1.6466 | 0.405 |
1.5165 | 23.0 | 4594 | 1.6466 | 0.405 |
1.5306 | 24.0 | 4794 | 1.6108 | 0.43 |
1.4245 | 25.0 | 4993 | 1.3500 | 0.54 |
1.2792 | 26.0 | 5193 | 1.3500 | 0.54 |
1.3472 | 27.0 | 5393 | 1.3500 | 0.54 |
1.2189 | 28.0 | 5593 | 1.3610 | 0.525 |
1.2861 | 29.0 | 5792 | 1.2593 | 0.595 |
1.2574 | 30.0 | 5992 | 1.2593 | 0.595 |
1.342 | 31.0 | 6192 | 1.2593 | 0.595 |
1.1464 | 32.0 | 6392 | 1.2563 | 0.585 |
0.9877 | 33.0 | 6591 | 1.1095 | 0.69 |
0.9459 | 34.0 | 6791 | 1.1095 | 0.69 |
1.2336 | 35.0 | 6991 | 1.1095 | 0.69 |
1.1025 | 36.0 | 7191 | 1.1060 | 0.7 |
0.8704 | 37.0 | 7390 | 0.9817 | 0.715 |
0.8831 | 38.0 | 7590 | 0.9817 | 0.715 |
0.9367 | 39.0 | 7790 | 0.9817 | 0.715 |
0.899 | 40.0 | 7990 | 0.9881 | 0.705 |
0.7893 | 41.0 | 8189 | 0.9434 | 0.755 |
0.8003 | 42.0 | 8389 | 0.9434 | 0.755 |
0.715 | 43.0 | 8589 | 0.9434 | 0.755 |
0.8367 | 44.0 | 8789 | 0.9532 | 0.75 |
0.6512 | 45.0 | 8988 | 0.9307 | 0.77 |
0.6109 | 46.0 | 9188 | 0.9307 | 0.77 |
0.6644 | 47.0 | 9388 | 0.9307 | 0.77 |
0.562 | 48.0 | 9588 | 0.9356 | 0.765 |
0.5606 | 49.0 | 9787 | 0.8004 | 0.805 |
0.6194 | 50.0 | 9987 | 0.8004 | 0.805 |
0.5977 | 51.0 | 10187 | 0.8004 | 0.805 |
0.5307 | 52.0 | 10387 | 0.8003 | 0.81 |
0.7827 | 53.0 | 10586 | 0.8727 | 0.765 |
0.5433 | 54.0 | 10786 | 0.8727 | 0.765 |
0.3081 | 55.0 | 10986 | 0.8727 | 0.765 |
0.5239 | 56.0 | 11186 | 0.8528 | 0.765 |
0.284 | 57.0 | 11385 | 0.7730 | 0.8 |
0.2692 | 58.0 | 11585 | 0.7730 | 0.8 |
0.2465 | 59.0 | 11785 | 0.7730 | 0.8 |
0.3708 | 60.0 | 11985 | 0.7825 | 0.795 |
0.1754 | 61.0 | 12184 | 0.8742 | 0.785 |
0.2436 | 62.0 | 12384 | 0.8742 | 0.785 |
0.2889 | 63.0 | 12584 | 0.8742 | 0.785 |
0.2809 | 64.0 | 12784 | 0.8855 | 0.78 |
0.1971 | 65.0 | 12983 | 0.8661 | 0.8 |
0.2701 | 66.0 | 13183 | 0.8661 | 0.8 |
0.1 | 67.0 | 13383 | 0.8661 | 0.8 |
0.1668 | 68.0 | 13583 | 0.8615 | 0.8 |
0.1784 | 69.0 | 13782 | 0.8111 | 0.825 |
0.3054 | 70.0 | 13982 | 0.8111 | 0.825 |
0.0945 | 71.0 | 14182 | 0.8111 | 0.825 |
0.0784 | 72.0 | 14382 | 0.8075 | 0.83 |
0.2458 | 73.0 | 14581 | 0.9653 | 0.81 |
0.1893 | 74.0 | 14781 | 0.9653 | 0.81 |
0.0977 | 75.0 | 14981 | 0.9653 | 0.81 |
0.1417 | 76.0 | 15181 | 0.9600 | 0.81 |
0.0785 | 77.0 | 15380 | 0.8951 | 0.82 |
0.0768 | 78.0 | 15580 | 0.8951 | 0.82 |
0.1524 | 79.0 | 15780 | 0.8951 | 0.82 |
0.0456 | 80.0 | 15980 | 0.8952 | 0.82 |
0.1463 | 81.0 | 16179 | 1.0768 | 0.79 |
0.1568 | 82.0 | 16379 | 1.0768 | 0.79 |
0.0494 | 83.0 | 16579 | 1.0768 | 0.79 |
0.1716 | 84.0 | 16779 | 1.0751 | 0.79 |
0.0351 | 85.0 | 16978 | 0.9556 | 0.825 |
0.0329 | 86.0 | 17178 | 0.9556 | 0.825 |
0.3197 | 87.0 | 17378 | 0.9556 | 0.825 |
0.0317 | 88.0 | 17578 | 0.9561 | 0.825 |
0.0272 | 89.0 | 17777 | 1.0150 | 0.82 |
0.1779 | 90.0 | 17977 | 1.0150 | 0.82 |
0.1416 | 91.0 | 18177 | 1.0150 | 0.82 |
0.2224 | 92.0 | 18377 | 1.0154 | 0.82 |
0.1861 | 93.0 | 18576 | 1.0021 | 0.81 |
0.0288 | 94.0 | 18776 | 1.0021 | 0.81 |
0.1544 | 95.0 | 18976 | 1.0021 | 0.81 |
0.0629 | 96.0 | 19176 | 0.9993 | 0.815 |
0.1571 | 97.0 | 19375 | 0.9871 | 0.815 |
0.139 | 98.0 | 19575 | 0.9871 | 0.815 |
0.028 | 99.0 | 19775 | 0.9871 | 0.815 |
0.102 | 99.62 | 19900 | 0.9871 | 0.815 |
### Framework versions
- Transformers 4.31.0.dev0
- PyTorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.13.3