
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# my_MFCC_VITmodelBB1

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. Its results on the evaluation set over the course of training are reported in the training results table below.
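The card does not yet include a usage example. As a minimal sketch, a fine-tuned ViT image classifier like this one can usually be loaded through the `transformers` image-classification pipeline; the model id below is an assumption standing in for wherever this checkpoint is stored (a local output directory or a Hub repository), and the input is assumed to be an MFCC spectrogram rendered as an image.

```python
# Minimal inference sketch (assumed usage, not an official snippet from this card).
# "my_MFCC_VITmodelBB1" is a placeholder for the actual checkpoint path or Hub id.
from PIL import Image
from transformers import pipeline

classifier = pipeline("image-classification", model="my_MFCC_VITmodelBB1")

image = Image.open("example_mfcc.png")  # an MFCC spectrogram saved as an image
for prediction in classifier(image):
    print(prediction["label"], round(prediction["score"], 4))
```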

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter values used during training are not recorded in this card.
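Since the hyperparameter list did not survive in the card, the following is only a sketch of a typical `Trainer`-based fine-tuning setup for this kind of run; every concrete value (learning rate, batch size, epoch count, paths) is an illustrative placeholder, not the configuration that produced the results below.

```python
# Sketch of a typical ViT fine-tuning setup with the Hugging Face Trainer.
# All hyperparameter values are illustrative placeholders, NOT the ones used
# to produce the results reported in this card.
import numpy as np
import torch
import evaluate
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

# Assumes an image-folder layout with train/validation splits of MFCC images.
dataset = load_dataset("imagefolder", data_dir="path/to/mfcc_images")
processor = AutoImageProcessor.from_pretrained("google/vit-base-patch16-224-in21k")

def transform(batch):
    # Convert PIL images to the pixel values expected by ViT.
    inputs = processor([img.convert("RGB") for img in batch["image"]], return_tensors="pt")
    inputs["labels"] = batch["label"]
    return inputs

dataset = dataset.with_transform(transform)

def collate_fn(examples):
    # Stack per-example tensors into a batch.
    return {
        "pixel_values": torch.stack([ex["pixel_values"] for ex in examples]),
        "labels": torch.tensor([ex["labels"] for ex in examples]),
    }

labels = dataset["train"].features["label"].names
model = AutoModelForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=len(labels),
)

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # Accuracy as reported in the training results table below.
    predictions = np.argmax(eval_pred.predictions, axis=1)
    return accuracy.compute(predictions=predictions, references=eval_pred.label_ids)

training_args = TrainingArguments(
    output_dir="my_MFCC_VITmodelBB1",
    learning_rate=5e-5,              # placeholder
    per_device_train_batch_size=16,  # placeholder
    num_train_epochs=3,              # placeholder
    eval_strategy="epoch",           # "evaluation_strategy" on older transformers versions
    remove_unused_columns=False,     # keep the raw "image" column for the on-the-fly transform
)

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=collate_fn,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    compute_metrics=compute_metrics,
)
trainer.train()
```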

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.89 | 6 | 0.6992 | 0.4286 |
| 0.6972 | 1.93 | 13 | 0.6895 | 0.5238 |
| 0.6871 | 2.96 | 20 | 0.6864 | 0.5333 |
| 0.6871 | 4.0 | 27 | 0.6861 | 0.4952 |
| 0.6773 | 4.89 | 33 | 0.6688 | 0.5905 |
| 0.6566 | 5.93 | 40 | 0.6430 | 0.6476 |
| 0.6566 | 6.96 | 47 | 0.6324 | 0.7048 |
| 0.645 | 8.0 | 54 | 0.6366 | 0.6286 |
| 0.6301 | 8.89 | 60 | 0.6606 | 0.6095 |
| 0.6301 | 9.93 | 67 | 0.6978 | 0.5524 |
| 0.6506 | 10.96 | 74 | 0.6490 | 0.5905 |
| 0.6829 | 12.0 | 81 | 0.7852 | 0.4286 |
| 0.6829 | 12.89 | 87 | 0.6921 | 0.5810 |
| 0.7127 | 13.93 | 94 | 0.6762 | 0.5238 |
| 0.6481 | 14.96 | 101 | 0.6040 | 0.7048 |
| 0.6481 | 16.0 | 108 | 0.6528 | 0.6381 |
| 0.5894 | 16.89 | 114 | 0.5585 | 0.7810 |
| 0.6039 | 17.93 | 121 | 0.5628 | 0.7048 |
| 0.6039 | 18.96 | 128 | 0.5348 | 0.7333 |
| 0.5583 | 20.0 | 135 | 0.6059 | 0.6952 |
| 0.5176 | 20.89 | 141 | 0.5214 | 0.7810 |
| 0.5176 | 21.93 | 148 | 0.4461 | 0.7905 |
| 0.4893 | 22.96 | 155 | 0.4907 | 0.7810 |
| 0.4869 | 24.0 | 162 | 0.5851 | 0.7143 |
| 0.4869 | 24.89 | 168 | 0.5113 | 0.7429 |
| 0.4905 | 25.93 | 175 | 0.4503 | 0.7810 |
| 0.4197 | 26.96 | 182 | 0.4710 | 0.7905 |
| 0.4197 | 28.0 | 189 | 0.4741 | 0.7905 |
| 0.4006 | 28.89 | 195 | 0.6672 | 0.6857 |
| 0.4254 | 29.93 | 202 | 0.4445 | 0.8 |
| 0.4254 | 30.96 | 209 | 0.4773 | 0.7905 |
| 0.3684 | 32.0 | 216 | 0.6279 | 0.7238 |
| 0.3869 | 32.89 | 222 | 0.5426 | 0.7524 |
| 0.3869 | 33.93 | 229 | 0.5735 | 0.7143 |
| 0.3498 | 34.96 | 236 | 0.4384 | 0.8095 |
| 0.3473 | 36.0 | 243 | 0.3578 | 0.8476 |
| 0.3473 | 36.89 | 249 | 0.4701 | 0.8381 |
| 0.2938 | 37.93 | 256 | 0.4497 | 0.8 |
| 0.295 | 38.96 | 263 | 0.5193 | 0.8 |
| 0.34 | 40.0 | 270 | 0.4324 | 0.8095 |
| 0.34 | 40.89 | 276 | 0.4218 | 0.8286 |
| 0.3022 | 41.93 | 283 | 0.4222 | 0.8 |
| 0.2974 | 42.96 | 290 | 0.4326 | 0.8095 |
| 0.2974 | 44.0 | 297 | 0.5216 | 0.7524 |
| 0.3254 | 44.89 | 303 | 0.4243 | 0.8 |
| 0.2762 | 45.93 | 310 | 0.4856 | 0.8 |
| 0.2762 | 46.96 | 317 | 0.6328 | 0.7619 |
| 0.2528 | 48.0 | 324 | 0.5752 | 0.7619 |
| 0.2852 | 48.89 | 330 | 0.4740 | 0.7619 |
| 0.2852 | 49.93 | 337 | 0.4864 | 0.7810 |
| 0.2873 | 50.96 | 344 | 0.3578 | 0.8476 |
| 0.2691 | 52.0 | 351 | 0.5690 | 0.7905 |
| 0.2691 | 52.89 | 357 | 0.5617 | 0.7524 |
| 0.2544 | 53.93 | 364 | 0.5176 | 0.7619 |
| 0.2246 | 54.96 | 371 | 0.5027 | 0.7810 |
| 0.2246 | 56.0 | 378 | 0.5262 | 0.8 |
| 0.1834 | 56.89 | 384 | 0.6567 | 0.7143 |
| 0.2051 | 57.93 | 391 | 0.4150 | 0.8286 |
| 0.2051 | 58.96 | 398 | 0.4880 | 0.8 |
| 0.1694 | 60.0 | 405 | 0.5683 | 0.8 |
| 0.2246 | 60.89 | 411 | 0.4442 | 0.8 |
| 0.2246 | 61.93 | 418 | 0.4901 | 0.8 |
| 0.2184 | 62.96 | 425 | 0.6485 | 0.7714 |
| 0.1869 | 64.0 | 432 | 0.3877 | 0.8381 |
| 0.1869 | 64.89 | 438 | 0.5256 | 0.7619 |
| 0.1963 | 65.93 | 445 | 0.5285 | 0.8190 |
| 0.1792 | 66.96 | 452 | 0.6391 | 0.7714 |
| 0.1792 | 68.0 | 459 | 0.5738 | 0.7810 |
| 0.1853 | 68.89 | 465 | 0.5518 | 0.7905 |
| 0.1735 | 69.93 | 472 | 0.5239 | 0.7905 |
| 0.1735 | 70.96 | 479 | 0.5718 | 0.7619 |
| 0.2244 | 72.0 | 486 | 0.6423 | 0.7238 |
| 0.1863 | 72.89 | 492 | 0.4858 | 0.8190 |
| 0.1863 | 73.93 | 499 | 0.5777 | 0.7714 |
| 0.1704 | 74.96 | 506 | 0.7484 | 0.7238 |
| 0.1602 | 76.0 | 513 | 0.4144 | 0.8286 |
| 0.1602 | 76.89 | 519 | 0.5055 | 0.7905 |
| 0.213 | 77.93 | 526 | 0.4514 | 0.8286 |
| 0.1649 | 78.96 | 533 | 0.5630 | 0.7810 |
| 0.143 | 80.0 | 540 | 0.4911 | 0.8 |
| 0.143 | 80.89 | 546 | 0.5678 | 0.8 |
| 0.146 | 81.93 | 553 | 0.5183 | 0.8095 |
| 0.1437 | 82.96 | 560 | 0.4870 | 0.8190 |
| 0.1437 | 84.0 | 567 | 0.5785 | 0.7905 |
| 0.1341 | 84.89 | 573 | 0.4781 | 0.8286 |
| 0.1338 | 85.93 | 580 | 0.5996 | 0.7714 |
| 0.1338 | 86.96 | 587 | 0.4562 | 0.8190 |
| 0.151 | 88.0 | 594 | 0.5412 | 0.8 |
| 0.1563 | 88.89 | 600 | 0.5578 | 0.8 |
| 0.1563 | 89.93 | 607 | 0.4887 | 0.8095 |
| 0.1675 | 90.96 | 614 | 0.5019 | 0.8286 |
| 0.143 | 92.0 | 621 | 0.5886 | 0.8286 |
| 0.143 | 92.89 | 627 | 0.6617 | 0.7714 |
| 0.1297 | 93.93 | 634 | 0.5459 | 0.8 |
| 0.1324 | 94.96 | 641 | 0.4964 | 0.8476 |
| 0.1324 | 96.0 | 648 | 0.5943 | 0.7905 |
| 0.1184 | 96.89 | 654 | 0.5569 | 0.8190 |
| 0.1353 | 97.93 | 661 | 0.5658 | 0.8 |
| 0.1353 | 98.96 | 668 | 0.4988 | 0.8286 |
| 0.1453 | 100.0 | 675 | 0.5139 | 0.8381 |
| 0.1199 | 100.89 | 681 | 0.3940 | 0.8571 |
| 0.1199 | 101.93 | 688 | 0.7182 | 0.7905 |
| 0.1146 | 102.96 | 695 | 0.5160 | 0.8571 |
| 0.1443 | 104.0 | 702 | 0.5322 | 0.8286 |
| 0.1443 | 104.89 | 708 | 0.5253 | 0.7714 |
| 0.114 | 105.93 | 715 | 0.4885 | 0.8286 |
| 0.1127 | 106.96 | 722 | 0.4731 | 0.8286 |
| 0.1127 | 108.0 | 729 | 0.5328 | 0.8286 |
| 0.1246 | 108.89 | 735 | 0.4581 | 0.8286 |
| 0.1475 | 109.93 | 742 | 0.4775 | 0.8476 |
| 0.1475 | 110.96 | 749 | 0.5842 | 0.7905 |
| 0.1323 | 112.0 | 756 | 0.5865 | 0.8286 |
| 0.088 | 112.89 | 762 | 0.4749 | 0.8476 |
| 0.088 | 113.93 | 769 | 0.4144 | 0.8381 |
| 0.1012 | 114.96 | 776 | 0.3921 | 0.8476 |
| 0.1363 | 116.0 | 783 | 0.4973 | 0.8190 |
| 0.1363 | 116.89 | 789 | 0.5272 | 0.7810 |
| 0.0992 | 117.93 | 796 | 0.5764 | 0.8095 |
| 0.1008 | 118.96 | 803 | 0.6119 | 0.8190 |
| 0.1315 | 120.0 | 810 | 0.4981 | 0.8381 |
| 0.1315 | 120.89 | 816 | 0.6413 | 0.7714 |
| 0.1161 | 121.93 | 823 | 0.5388 | 0.8095 |
| 0.0904 | 122.96 | 830 | 0.4144 | 0.8857 |
| 0.0904 | 124.0 | 837 | 0.4444 | 0.8381 |
| 0.075 | 124.89 | 843 | 0.3987 | 0.8667 |
| 0.0848 | 125.93 | 850 | 0.5431 | 0.8476 |
| 0.0848 | 126.96 | 857 | 0.6031 | 0.8095 |
| 0.0902 | 128.0 | 864 | 0.3599 | 0.8476 |
| 0.0925 | 128.89 | 870 | 0.5004 | 0.8381 |
| 0.0925 | 129.93 | 877 | 0.4638 | 0.8476 |
| 0.1178 | 130.96 | 884 | 0.4881 | 0.8667 |
| 0.1254 | 132.0 | 891 | 0.6026 | 0.7905 |
| 0.1254 | 132.89 | 897 | 0.4601 | 0.8476 |
| 0.1052 | 133.33 | 900 | 0.6219 | 0.7905 |

### Framework versions