
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# my_MFCC_VITmodelBIT1

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3329
- Accuracy: 0.8909
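
As a minimal usage sketch, the fine-tuned checkpoint can be loaded with the Transformers image-classification pipeline. The repo id below is a placeholder, and the idea that inputs are MFCC spectrograms rendered as images is only inferred from the model name:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual location of this checkpoint.
classifier = pipeline("image-classification", model="username/my_MFCC_VITmodelBIT1")

# Assumption: inputs are MFCC spectrograms saved as image files.
predictions = classifier("example_mfcc_spectrogram.png")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```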

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
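
Although no details are given here, the card states that training used an `imagefolder` dataset. A minimal sketch of how such a dataset is typically loaded with 🤗 Datasets follows; the directory path and layout are assumptions:

```python
from datasets import load_dataset

# Assumed layout: path/to/mfcc_images/<class_name>/<image files>.
# The imagefolder builder infers class labels from the subfolder names.
dataset = load_dataset("imagefolder", data_dir="path/to/mfcc_images")

print(dataset)                    # DatasetDict with at least a "train" split
print(dataset["train"].features)  # includes an "image" column and a ClassLabel "label"
```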

## Training procedure

### Training hyperparameters

The hyperparameters used during training are not listed in this card.
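
For orientation only, a generic fine-tuning setup for this base model with the 🤗 Trainer API is sketched below. Every hyperparameter value, path, and split name is an assumption, not the configuration that produced the results in the next section:

```python
import torch
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

BASE = "google/vit-base-patch16-224-in21k"  # base checkpoint named in this card

# Assumed imagefolder layout, as in the loading sketch above.
dataset = load_dataset("imagefolder", data_dir="path/to/mfcc_images")
labels = dataset["train"].features["label"].names

processor = AutoImageProcessor.from_pretrained(BASE)
model = AutoModelForImageClassification.from_pretrained(
    BASE,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={name: i for i, name in enumerate(labels)},
)

def transform(batch):
    # Convert PIL images into the pixel_values tensor ViT expects.
    inputs = processor([img.convert("RGB") for img in batch["image"]], return_tensors="pt")
    inputs["labels"] = batch["label"]
    return inputs

dataset = dataset.with_transform(transform)

def collate_fn(examples):
    return {
        "pixel_values": torch.stack([ex["pixel_values"] for ex in examples]),
        "labels": torch.tensor([ex["labels"] for ex in examples]),
    }

# Illustrative values only -- the actual hyperparameters were not reported.
args = TrainingArguments(
    output_dir="my_MFCC_VITmodelBIT1",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    num_train_epochs=96,          # the results table below runs to roughly epoch 95
    remove_unused_columns=False,  # keep the "image" column for the transform above
)

trainer = Trainer(
    model=model,
    args=args,
    data_collator=collate_fn,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],  # assumed split name
)
trainer.train()
```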

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6858 | 0.95  | 10   | 0.6256 | 0.8    |
| 0.5939 | 2.0   | 21   | 0.5227 | 0.8    |
| 0.561  | 2.95  | 31   | 0.4967 | 0.8    |
| 0.5175 | 4.0   | 42   | 0.4687 | 0.8    |
| 0.5334 | 4.95  | 52   | 0.4554 | 0.8    |
| 0.5074 | 6.0   | 63   | 0.4505 | 0.8    |
| 0.4852 | 6.95  | 73   | 0.4422 | 0.8    |
| 0.4711 | 8.0   | 84   | 0.4033 | 0.8    |
| 0.4636 | 8.95  | 94   | 0.4193 | 0.8242 |
| 0.5    | 10.0  | 105  | 0.3682 | 0.8485 |
| 0.4255 | 10.95 | 115  | 0.4124 | 0.7879 |
| 0.4258 | 12.0  | 126  | 0.4144 | 0.8364 |
| 0.4542 | 12.95 | 136  | 0.3729 | 0.8303 |
| 0.3631 | 14.0  | 147  | 0.4177 | 0.8303 |
| 0.4919 | 14.95 | 157  | 0.3634 | 0.8303 |
| 0.405  | 16.0  | 168  | 0.3081 | 0.8970 |
| 0.3908 | 16.95 | 178  | 0.3965 | 0.8424 |
| 0.4064 | 18.0  | 189  | 0.3502 | 0.8364 |
| 0.345  | 18.95 | 199  | 0.3427 | 0.8303 |
| 0.363  | 20.0  | 210  | 0.2901 | 0.8909 |
| 0.3278 | 20.95 | 220  | 0.3289 | 0.8667 |
| 0.3074 | 22.0  | 231  | 0.3593 | 0.8121 |
| 0.3469 | 22.95 | 241  | 0.2968 | 0.8727 |
| 0.3545 | 24.0  | 252  | 0.4895 | 0.7394 |
| 0.3457 | 24.95 | 262  | 0.3278 | 0.8788 |
| 0.339  | 26.0  | 273  | 0.3363 | 0.8424 |
| 0.3023 | 26.95 | 283  | 0.3420 | 0.8667 |
| 0.3462 | 28.0  | 294  | 0.3377 | 0.8364 |
| 0.2999 | 28.95 | 304  | 0.3599 | 0.8606 |
| 0.2713 | 30.0  | 315  | 0.3054 | 0.8727 |
| 0.2805 | 30.95 | 325  | 0.3414 | 0.8424 |
| 0.294  | 32.0  | 336  | 0.2949 | 0.8788 |
| 0.2884 | 32.95 | 346  | 0.2989 | 0.8545 |
| 0.2936 | 34.0  | 357  | 0.3898 | 0.8424 |
| 0.3077 | 34.95 | 367  | 0.3450 | 0.8545 |
| 0.3316 | 36.0  | 378  | 0.2584 | 0.9152 |
| 0.2769 | 36.95 | 388  | 0.2774 | 0.8788 |
| 0.2555 | 38.0  | 399  | 0.3349 | 0.8303 |
| 0.2512 | 38.95 | 409  | 0.3747 | 0.8545 |
| 0.2707 | 40.0  | 420  | 0.3558 | 0.8303 |
| 0.2638 | 40.95 | 430  | 0.3931 | 0.7939 |
| 0.2746 | 42.0  | 441  | 0.3997 | 0.8242 |
| 0.307  | 42.95 | 451  | 0.3194 | 0.8485 |
| 0.2269 | 44.0  | 462  | 0.4378 | 0.8182 |
| 0.2142 | 44.95 | 472  | 0.3499 | 0.8424 |
| 0.2102 | 46.0  | 483  | 0.3766 | 0.8303 |
| 0.247  | 46.95 | 493  | 0.3521 | 0.8242 |
| 0.2347 | 48.0  | 504  | 0.3583 | 0.8667 |
| 0.2081 | 48.95 | 514  | 0.3162 | 0.8545 |
| 0.2371 | 50.0  | 525  | 0.3307 | 0.8727 |
| 0.2298 | 50.95 | 535  | 0.2449 | 0.9152 |
| 0.235  | 52.0  | 546  | 0.3831 | 0.8545 |
| 0.1972 | 52.95 | 556  | 0.3087 | 0.8424 |
| 0.1993 | 54.0  | 567  | 0.2912 | 0.8848 |
| 0.2183 | 54.95 | 577  | 0.3253 | 0.8545 |
| 0.2222 | 56.0  | 588  | 0.3338 | 0.8727 |
| 0.1984 | 56.95 | 598  | 0.3510 | 0.8364 |
| 0.174  | 58.0  | 609  | 0.3521 | 0.8667 |
| 0.2194 | 58.95 | 619  | 0.2718 | 0.8667 |
| 0.1734 | 60.0  | 630  | 0.3758 | 0.8667 |
| 0.1841 | 60.95 | 640  | 0.3342 | 0.8727 |
| 0.1747 | 62.0  | 651  | 0.3858 | 0.8485 |
| 0.2196 | 62.95 | 661  | 0.4457 | 0.8121 |
| 0.1899 | 64.0  | 672  | 0.3924 | 0.8545 |
| 0.2504 | 64.95 | 682  | 0.3071 | 0.8667 |
| 0.2099 | 66.0  | 693  | 0.4383 | 0.7879 |
| 0.1707 | 66.95 | 703  | 0.3140 | 0.8788 |
| 0.2126 | 68.0  | 714  | 0.3500 | 0.8667 |
| 0.1703 | 68.95 | 724  | 0.3411 | 0.8606 |
| 0.1602 | 70.0  | 735  | 0.3394 | 0.8606 |
| 0.1404 | 70.95 | 745  | 0.3308 | 0.8727 |
| 0.156  | 72.0  | 756  | 0.3535 | 0.8606 |
| 0.1305 | 72.95 | 766  | 0.3296 | 0.8606 |
| 0.1516 | 74.0  | 777  | 0.3859 | 0.8485 |
| 0.1536 | 74.95 | 787  | 0.3857 | 0.8545 |
| 0.1434 | 76.0  | 798  | 0.3344 | 0.8667 |
| 0.1499 | 76.95 | 808  | 0.2926 | 0.8788 |
| 0.1623 | 78.0  | 819  | 0.3481 | 0.8606 |
| 0.146  | 78.95 | 829  | 0.3499 | 0.8727 |
| 0.1457 | 80.0  | 840  | 0.3536 | 0.8909 |
| 0.1779 | 80.95 | 850  | 0.3358 | 0.8848 |
| 0.153  | 82.0  | 861  | 0.4687 | 0.8242 |
| 0.1558 | 82.95 | 871  | 0.3269 | 0.8606 |
| 0.1594 | 84.0  | 882  | 0.4053 | 0.8545 |
| 0.1455 | 84.95 | 892  | 0.3744 | 0.8545 |
| 0.1409 | 86.0  | 903  | 0.2758 | 0.8788 |
| 0.1364 | 86.95 | 913  | 0.3159 | 0.8788 |
| 0.1233 | 88.0  | 924  | 0.3728 | 0.8606 |
| 0.1266 | 88.95 | 934  | 0.4164 | 0.8424 |
| 0.1239 | 90.0  | 945  | 0.3519 | 0.8848 |
| 0.1617 | 90.95 | 955  | 0.2978 | 0.8848 |
| 0.1487 | 92.0  | 966  | 0.2711 | 0.8970 |
| 0.1045 | 92.95 | 976  | 0.3045 | 0.8788 |
| 0.1319 | 94.0  | 987  | 0.3578 | 0.8667 |
| 0.1349 | 94.95 | 997  | 0.2984 | 0.8848 |
| 0.1053 | 95.24 | 1000 | 0.3329 | 0.8909 |

### Framework versions