# my_MFCC_VITmodelBB
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the results):
- Loss: 0.5033
- Accuracy: 0.7905
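As a quick smoke test, the checkpoint can be exercised with the `image-classification` pipeline. This is a minimal sketch, not part of the original card: the model id below is a placeholder for wherever the fine-tuned weights live (a local output directory or a Hub repo id), and `mfcc_example.png` is a hypothetical input file.

```python
# Minimal inference sketch. "my_MFCC_VITmodelBB" is assumed to be the
# local output directory (or Hub repo id) holding the fine-tuned weights.
from transformers import pipeline

classifier = pipeline("image-classification", model="my_MFCC_VITmodelBB")

# Hypothetical input image; returns a list of {"label": ..., "score": ...} dicts.
print(classifier("mfcc_example.png"))
```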
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
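The card only records that the `imagefolder` builder was used. For orientation, below is a hedged sketch of the usual loading and preprocessing recipe for that builder; the `data_dir` path and its one-subfolder-per-class layout are assumptions, not details from this training run.

```python
# Hedged sketch: typical `imagefolder` loading plus ViT preprocessing.
# "path/to/images" (one subfolder per class label) is an assumed layout.
from datasets import load_dataset
from transformers import ViTImageProcessor

ds = load_dataset("imagefolder", data_dir="path/to/images")
processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224-in21k")

def transform(batch):
    # Resize and normalize to the 224x224 RGB inputs ViT expects.
    batch["pixel_values"] = processor(
        [img.convert("RGB") for img in batch["image"]], return_tensors="pt"
    )["pixel_values"]
    return batch

ds = ds.with_transform(transform)  # applied lazily when rows are accessed
```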
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
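For reference, these settings map onto `TrainingArguments` as sketched below. The `output_dir`, the two-label head (a loose inference from the initial ~0.69 eval loss, close to ln 2), and the dataset wiring are assumptions; the Adam betas and epsilon listed above are the Trainer defaults, so they need no explicit arguments.

```python
# Hedged sketch mapping the hyperparameters above onto TrainingArguments.
from transformers import Trainer, TrainingArguments, ViTForImageClassification

model = ViTForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=2,  # assumption: initial eval loss ~0.69 (~ln 2) suggests two classes
)

args = TrainingArguments(
    output_dir="my_MFCC_VITmodelBB",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,    # 16 * 4 = total train batch size of 64
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    evaluation_strategy="epoch",      # matches the per-epoch log below
)

# Dataset wiring elided; see the loading sketch above.
# trainer = Trainer(model=model, args=args,
#                   train_dataset=ds["train"], eval_dataset=ds["validation"])
# trainer.train()
```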
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.89 | 6 | 0.6914 | 0.5429 |
| 0.6961 | 1.93 | 13 | 0.7038 | 0.4476 |
| 0.6856 | 2.96 | 20 | 0.6889 | 0.4762 |
| 0.6856 | 4.0 | 27 | 0.6870 | 0.5619 |
| 0.6847 | 4.89 | 33 | 0.6678 | 0.6286 |
| 0.6677 | 5.93 | 40 | 0.6830 | 0.5714 |
| 0.6677 | 6.96 | 47 | 0.7347 | 0.5238 |
| 0.6603 | 8.0 | 54 | 0.6718 | 0.5810 |
| 0.6351 | 8.89 | 60 | 0.6247 | 0.6952 |
| 0.6351 | 9.93 | 67 | 0.6285 | 0.6667 |
| 0.5796 | 10.96 | 74 | 0.6368 | 0.6571 |
| 0.5964 | 12.0 | 81 | 0.7703 | 0.5429 |
| 0.5964 | 12.89 | 87 | 0.6178 | 0.6476 |
| 0.6025 | 13.93 | 94 | 0.6289 | 0.6952 |
| 0.5447 | 14.96 | 101 | 0.6291 | 0.6095 |
| 0.5447 | 16.0 | 108 | 0.6182 | 0.6571 |
| 0.5074 | 16.89 | 114 | 0.5630 | 0.7143 |
| 0.5535 | 17.93 | 121 | 0.5091 | 0.7429 |
| 0.5535 | 18.96 | 128 | 0.5557 | 0.7238 |
| 0.5308 | 20.0 | 135 | 0.5940 | 0.7143 |
| 0.4703 | 20.89 | 141 | 0.4881 | 0.7619 |
| 0.4703 | 21.93 | 148 | 0.5166 | 0.7333 |
| 0.4839 | 22.96 | 155 | 0.5384 | 0.7238 |
| 0.4693 | 24.0 | 162 | 0.5434 | 0.6762 |
| 0.4693 | 24.89 | 168 | 0.5765 | 0.7048 |
| 0.3921 | 25.93 | 175 | 0.5052 | 0.7619 |
| 0.4024 | 26.96 | 182 | 0.5032 | 0.7429 |
| 0.4024 | 28.0 | 189 | 0.5031 | 0.7524 |
| 0.4538 | 28.89 | 195 | 0.5370 | 0.7810 |
| 0.4034 | 29.93 | 202 | 0.4996 | 0.7238 |
| 0.4034 | 30.96 | 209 | 0.4727 | 0.7619 |
| 0.3707 | 32.0 | 216 | 0.6724 | 0.6857 |
| 0.4529 | 32.89 | 222 | 0.4654 | 0.8286 |
| 0.4529 | 33.93 | 229 | 0.5904 | 0.7333 |
| 0.3811 | 34.96 | 236 | 0.4626 | 0.8 |
| 0.3047 | 36.0 | 243 | 0.4681 | 0.8 |
| 0.3047 | 36.89 | 249 | 0.5447 | 0.7429 |
| 0.2965 | 37.93 | 256 | 0.5742 | 0.7619 |
| 0.3204 | 38.96 | 263 | 0.4925 | 0.8095 |
| 0.2999 | 40.0 | 270 | 0.4528 | 0.7619 |
| 0.2999 | 40.89 | 276 | 0.5151 | 0.7905 |
| 0.2857 | 41.93 | 283 | 0.4967 | 0.7810 |
| 0.3288 | 42.96 | 290 | 0.4591 | 0.7714 |
| 0.3288 | 44.0 | 297 | 0.6068 | 0.7429 |
| 0.2911 | 44.89 | 303 | 0.4261 | 0.8286 |
| 0.25 | 45.93 | 310 | 0.3688 | 0.8857 |
| 0.25 | 46.96 | 317 | 0.5787 | 0.7524 |
| 0.2223 | 48.0 | 324 | 0.4535 | 0.8190 |
| 0.2646 | 48.89 | 330 | 0.4728 | 0.8286 |
| 0.2646 | 49.93 | 337 | 0.4388 | 0.8190 |
| 0.2345 | 50.96 | 344 | 0.4570 | 0.8476 |
| 0.2049 | 52.0 | 351 | 0.4859 | 0.8095 |
| 0.2049 | 52.89 | 357 | 0.5517 | 0.7714 |
| 0.2301 | 53.93 | 364 | 0.5581 | 0.7905 |
| 0.2333 | 54.96 | 371 | 0.5555 | 0.7714 |
| 0.2333 | 56.0 | 378 | 0.5128 | 0.7524 |
| 0.2336 | 56.89 | 384 | 0.5706 | 0.7905 |
| 0.2267 | 57.93 | 391 | 0.5424 | 0.7905 |
| 0.2267 | 58.96 | 398 | 0.6782 | 0.7333 |
| 0.1859 | 60.0 | 405 | 0.5134 | 0.7905 |
| 0.2234 | 60.89 | 411 | 0.4915 | 0.8286 |
| 0.2234 | 61.93 | 418 | 0.4518 | 0.8095 |
| 0.2071 | 62.96 | 425 | 0.5469 | 0.8 |
| 0.2149 | 64.0 | 432 | 0.5735 | 0.7619 |
| 0.2149 | 64.89 | 438 | 0.4874 | 0.8 |
| 0.1873 | 65.93 | 445 | 0.6370 | 0.7143 |
| 0.1623 | 66.96 | 452 | 0.6216 | 0.7524 |
| 0.1623 | 68.0 | 459 | 0.6875 | 0.7524 |
| 0.1815 | 68.89 | 465 | 0.5455 | 0.8 |
| 0.1798 | 69.93 | 472 | 0.6675 | 0.6762 |
| 0.1798 | 70.96 | 479 | 0.4702 | 0.8190 |
| 0.1784 | 72.0 | 486 | 0.5872 | 0.7333 |
| 0.1352 | 72.89 | 492 | 0.5369 | 0.7905 |
| 0.1352 | 73.93 | 499 | 0.5192 | 0.8 |
| 0.2019 | 74.96 | 506 | 0.5167 | 0.7810 |
| 0.1382 | 76.0 | 513 | 0.5502 | 0.8 |
| 0.1382 | 76.89 | 519 | 0.5208 | 0.8381 |
| 0.137 | 77.93 | 526 | 0.5899 | 0.8 |
| 0.1866 | 78.96 | 533 | 0.4837 | 0.8 |
| 0.1726 | 80.0 | 540 | 0.6844 | 0.7143 |
| 0.1726 | 80.89 | 546 | 0.6237 | 0.7905 |
| 0.1598 | 81.93 | 553 | 0.3875 | 0.8571 |
| 0.1616 | 82.96 | 560 | 0.4712 | 0.8 |
| 0.1616 | 84.0 | 567 | 0.6599 | 0.7333 |
| 0.1659 | 84.89 | 573 | 0.4907 | 0.8 |
| 0.1392 | 85.93 | 580 | 0.5150 | 0.7714 |
| 0.1392 | 86.96 | 587 | 0.6279 | 0.7905 |
| 0.1505 | 88.0 | 594 | 0.6183 | 0.7714 |
| 0.1373 | 88.89 | 600 | 0.5033 | 0.7905 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3