vit-small_tobacco3482_kd_CEKD_t5.0_a0.5

This model is a fine-tuned version of WinKawaks/vit-small-patch16-224 on an unnamed dataset (recorded as "None" by the Trainer; the model name suggests Tobacco3482). It achieves the following results on the evaluation set, matching the final epoch in the training results table below:

Loss: 0.3966
Accuracy: 0.85
Brier Loss: 0.2593
NLL: 0.9223
F1 Micro: 0.85
F1 Macro: 0.8392
ECE: 0.1994
AURC: 0.0457

Model description

More information needed

Intended uses & limitations

More information needed
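
As a minimal, unofficial sketch of how a checkpoint like this is typically loaded for document-image classification with transformers: the repository id and input file name below are placeholder assumptions, not taken from this card.

```python
# Hedged usage sketch: assumes the fine-tuned checkpoint is published under a
# Hugging Face repo id like "<user>/vit-small_tobacco3482_kd_CEKD_t5.0_a0.5"
# (placeholder) and exposes the standard image-classification head.
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "<user>/vit-small_tobacco3482_kd_CEKD_t5.0_a0.5"  # placeholder, not verified
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("document_page.png").convert("RGB")  # any RGB document scan
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits                          # shape: (1, num_labels)
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```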

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

More information needed
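
The suffix kd_CEKD_t5.0_a0.5 in the model name suggests a knowledge-distillation objective combining cross-entropy with a KD term at temperature 5.0 and mixing weight 0.5. A minimal sketch of such a loss follows; the weighting convention, teacher model, and reduction are assumptions, not taken from this card.

```python
# Hedged sketch of a CE + KD objective implied by the name "CEKD_t5.0_a0.5".
# Assumptions (not confirmed by this card): alpha weights the terms as
# alpha * CE + (1 - alpha) * KD, and the KD term is the KL divergence between
# temperature-softened student and teacher distributions, scaled by T**2.
import torch
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, temperature=5.0, alpha=0.5):
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    return alpha * ce + (1.0 - alpha) * kd
```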

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC
No log 1.0 7 1.5608 0.225 0.8774 5.2159 0.225 0.1397 0.2725 0.7038
No log 2.0 14 1.2539 0.415 0.7531 3.2673 0.415 0.2434 0.3070 0.4078
No log 3.0 21 0.9055 0.585 0.5971 1.9093 0.585 0.5086 0.3232 0.2172
No log 4.0 28 0.7122 0.72 0.4403 1.7693 0.72 0.6805 0.3073 0.1228
No log 5.0 35 0.6584 0.74 0.3938 1.5810 0.74 0.7214 0.2661 0.1075
No log 6.0 42 0.5711 0.8 0.3462 1.4146 0.8000 0.7524 0.2347 0.0843
No log 7.0 49 0.5521 0.8 0.3199 1.2631 0.8000 0.7867 0.2542 0.0634
No log 8.0 56 0.5603 0.77 0.3381 1.1808 0.7700 0.7680 0.2316 0.0858
No log 9.0 63 0.5209 0.82 0.3062 1.2891 0.82 0.7972 0.2405 0.0792
No log 10.0 70 0.5705 0.78 0.3343 1.5183 0.78 0.7743 0.2264 0.0874
No log 11.0 77 0.5137 0.82 0.3047 1.2987 0.82 0.8096 0.2420 0.0592
No log 12.0 84 0.4664 0.835 0.2929 1.1529 0.835 0.8101 0.2291 0.0753
No log 13.0 91 0.4772 0.82 0.2915 1.2078 0.82 0.8029 0.2131 0.0620
No log 14.0 98 0.4553 0.825 0.2843 1.1312 0.825 0.8112 0.2196 0.0453
No log 15.0 105 0.4574 0.825 0.2821 1.1234 0.825 0.8163 0.2241 0.0554
No log 16.0 112 0.4873 0.8 0.3111 1.2248 0.8000 0.8007 0.1992 0.0657
No log 17.0 119 0.4224 0.855 0.2620 1.1871 0.855 0.8218 0.2337 0.0479
No log 18.0 126 0.4414 0.825 0.2857 1.0723 0.825 0.8227 0.2500 0.0517
No log 19.0 133 0.4232 0.845 0.2737 0.9360 0.845 0.8219 0.2053 0.0543
No log 20.0 140 0.4114 0.845 0.2637 1.0046 0.845 0.8233 0.2144 0.0460
No log 21.0 147 0.4110 0.835 0.2640 0.9853 0.835 0.8160 0.2278 0.0466
No log 22.0 154 0.4163 0.845 0.2678 1.1494 0.845 0.8291 0.2156 0.0458
No log 23.0 161 0.4243 0.835 0.2779 0.9475 0.835 0.8269 0.2420 0.0554
No log 24.0 168 0.4079 0.835 0.2683 0.9249 0.835 0.8044 0.2091 0.0532
No log 25.0 175 0.4027 0.85 0.2621 0.9433 0.85 0.8361 0.2138 0.0530
No log 26.0 182 0.3975 0.855 0.2590 0.9310 0.855 0.8457 0.1932 0.0487
No log 27.0 189 0.4032 0.85 0.2650 0.9823 0.85 0.8425 0.2088 0.0528
No log 28.0 196 0.4037 0.845 0.2650 1.0692 0.845 0.8361 0.2157 0.0496
No log 29.0 203 0.4027 0.845 0.2652 1.0423 0.845 0.8295 0.1917 0.0502
No log 30.0 210 0.3989 0.85 0.2610 1.0633 0.85 0.8392 0.2214 0.0482
No log 31.0 217 0.3985 0.855 0.2609 1.0374 0.855 0.8424 0.2074 0.0472
No log 32.0 224 0.3986 0.85 0.2596 1.0403 0.85 0.8392 0.2184 0.0462
No log 33.0 231 0.3990 0.85 0.2603 1.0369 0.85 0.8392 0.2079 0.0470
No log 34.0 238 0.3982 0.85 0.2600 0.9765 0.85 0.8392 0.2160 0.0467
No log 35.0 245 0.3977 0.85 0.2601 0.9762 0.85 0.8392 0.2108 0.0465
No log 36.0 252 0.3977 0.85 0.2600 1.0372 0.85 0.8392 0.2075 0.0466
No log 37.0 259 0.3972 0.85 0.2597 1.0383 0.85 0.8392 0.2091 0.0465
No log 38.0 266 0.3967 0.85 0.2590 0.9796 0.85 0.8392 0.1987 0.0461
No log 39.0 273 0.3979 0.85 0.2601 1.0390 0.85 0.8392 0.1991 0.0467
No log 40.0 280 0.3976 0.85 0.2601 0.9775 0.85 0.8392 0.2175 0.0465
No log 41.0 287 0.3979 0.85 0.2603 0.9796 0.85 0.8392 0.1930 0.0467
No log 42.0 294 0.3973 0.85 0.2598 0.9746 0.85 0.8392 0.2175 0.0468
No log 43.0 301 0.3972 0.85 0.2598 0.9798 0.85 0.8392 0.1931 0.0466
No log 44.0 308 0.3969 0.85 0.2594 0.9784 0.85 0.8392 0.2094 0.0465
No log 45.0 315 0.3971 0.85 0.2596 0.9847 0.85 0.8392 0.2033 0.0464
No log 46.0 322 0.3969 0.85 0.2597 0.9768 0.85 0.8392 0.2100 0.0465
No log 47.0 329 0.3974 0.85 0.2599 0.9788 0.85 0.8392 0.2090 0.0467
No log 48.0 336 0.3971 0.85 0.2596 0.9797 0.85 0.8392 0.1977 0.0463
No log 49.0 343 0.3972 0.85 0.2597 0.9391 0.85 0.8392 0.1903 0.0465
No log 50.0 350 0.3969 0.85 0.2596 0.9802 0.85 0.8392 0.1985 0.0464
No log 51.0 357 0.3970 0.85 0.2596 0.9795 0.85 0.8392 0.2161 0.0463
No log 52.0 364 0.3973 0.85 0.2597 0.9333 0.85 0.8392 0.1983 0.0462
No log 53.0 371 0.3971 0.85 0.2597 0.9408 0.85 0.8392 0.2022 0.0467
No log 54.0 378 0.3970 0.85 0.2595 0.9371 0.85 0.8392 0.1992 0.0460
No log 55.0 385 0.3970 0.85 0.2596 0.9262 0.85 0.8392 0.1917 0.0464
No log 56.0 392 0.3971 0.85 0.2595 0.9195 0.85 0.8392 0.1927 0.0461
No log 57.0 399 0.3970 0.85 0.2596 0.9789 0.85 0.8392 0.1992 0.0462
No log 58.0 406 0.3968 0.85 0.2594 0.9255 0.85 0.8392 0.1929 0.0462
No log 59.0 413 0.3967 0.85 0.2593 0.9795 0.85 0.8392 0.1996 0.0459
No log 60.0 420 0.3970 0.85 0.2596 0.9787 0.85 0.8392 0.1994 0.0461
No log 61.0 427 0.3967 0.85 0.2594 0.9803 0.85 0.8392 0.2073 0.0461
No log 62.0 434 0.3968 0.85 0.2594 0.9325 0.85 0.8392 0.1996 0.0460
No log 63.0 441 0.3968 0.85 0.2595 0.9276 0.85 0.8392 0.2063 0.0459
No log 64.0 448 0.3968 0.85 0.2595 0.9247 0.85 0.8392 0.1991 0.0461
No log 65.0 455 0.3968 0.85 0.2595 0.9301 0.85 0.8392 0.1989 0.0459
No log 66.0 462 0.3968 0.85 0.2595 0.9310 0.85 0.8392 0.1922 0.0459
No log 67.0 469 0.3968 0.85 0.2595 0.9250 0.85 0.8392 0.2061 0.0459
No log 68.0 476 0.3968 0.85 0.2594 0.9234 0.85 0.8392 0.1994 0.0461
No log 69.0 483 0.3967 0.85 0.2594 0.9257 0.85 0.8392 0.2065 0.0459
No log 70.0 490 0.3967 0.85 0.2594 0.9205 0.85 0.8392 0.1840 0.0459
No log 71.0 497 0.3967 0.85 0.2594 0.9258 0.85 0.8392 0.2017 0.0458
0.1666 72.0 504 0.3969 0.85 0.2594 0.9297 0.85 0.8392 0.2017 0.0458
0.1666 73.0 511 0.3966 0.85 0.2593 0.9223 0.85 0.8392 0.1920 0.0457
0.1666 74.0 518 0.3967 0.85 0.2594 0.9228 0.85 0.8392 0.1920 0.0459
0.1666 75.0 525 0.3967 0.85 0.2594 0.9257 0.85 0.8392 0.1919 0.0459
0.1666 76.0 532 0.3966 0.85 0.2593 0.9232 0.85 0.8392 0.1994 0.0458
0.1666 77.0 539 0.3968 0.85 0.2594 0.9224 0.85 0.8392 0.1920 0.0459
0.1666 78.0 546 0.3966 0.85 0.2593 0.9242 0.85 0.8392 0.1918 0.0458
0.1666 79.0 553 0.3967 0.85 0.2594 0.9233 0.85 0.8392 0.1920 0.0459
0.1666 80.0 560 0.3968 0.85 0.2594 0.9241 0.85 0.8392 0.1919 0.0458
0.1666 81.0 567 0.3967 0.85 0.2594 0.9225 0.85 0.8392 0.1918 0.0459
0.1666 82.0 574 0.3967 0.85 0.2594 0.9233 0.85 0.8392 0.1919 0.0459
0.1666 83.0 581 0.3967 0.85 0.2593 0.9246 0.85 0.8392 0.1919 0.0458
0.1666 84.0 588 0.3966 0.85 0.2593 0.9229 0.85 0.8392 0.2017 0.0458
0.1666 85.0 595 0.3966 0.85 0.2593 0.9232 0.85 0.8392 0.2017 0.0458
0.1666 86.0 602 0.3967 0.85 0.2593 0.9225 0.85 0.8392 0.1920 0.0458
0.1666 87.0 609 0.3966 0.85 0.2593 0.9214 0.85 0.8392 0.1999 0.0458
0.1666 88.0 616 0.3967 0.85 0.2593 0.9214 0.85 0.8392 0.1920 0.0458
0.1666 89.0 623 0.3966 0.85 0.2593 0.9227 0.85 0.8392 0.2097 0.0458
0.1666 90.0 630 0.3967 0.85 0.2594 0.9219 0.85 0.8392 0.1919 0.0458
0.1666 91.0 637 0.3966 0.85 0.2593 0.9212 0.85 0.8392 0.1994 0.0458
0.1666 92.0 644 0.3966 0.85 0.2593 0.9227 0.85 0.8392 0.1919 0.0458
0.1666 93.0 651 0.3966 0.85 0.2593 0.9231 0.85 0.8392 0.2017 0.0458
0.1666 94.0 658 0.3967 0.85 0.2593 0.9220 0.85 0.8392 0.1919 0.0458
0.1666 95.0 665 0.3966 0.85 0.2593 0.9217 0.85 0.8392 0.1920 0.0457
0.1666 96.0 672 0.3966 0.85 0.2593 0.9218 0.85 0.8392 0.1920 0.0458
0.1666 97.0 679 0.3966 0.85 0.2593 0.9221 0.85 0.8392 0.1920 0.0458
0.1666 98.0 686 0.3966 0.85 0.2593 0.9224 0.85 0.8392 0.1920 0.0457
0.1666 99.0 693 0.3966 0.85 0.2593 0.9224 0.85 0.8392 0.1994 0.0457
0.1666 100.0 700 0.3966 0.85 0.2593 0.9223 0.85 0.8392 0.1994 0.0457
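
The table reports calibration-oriented metrics (Brier Loss, NLL, ECE, AURC) alongside accuracy and F1. As a hedged illustration only, the sketch below shows one common way to compute the Brier score and ECE from softmax probabilities; the 15-bin equal-width ECE scheme is an assumption, not necessarily the evaluation code used here.

```python
# Hedged sketch of Brier score and ECE from softmax probabilities.
# Assumptions (not from this card): multi-class Brier = mean squared error
# against one-hot targets, and ECE uses 15 equal-width confidence bins.
import numpy as np

def brier_score(probs, labels):
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=15):
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    ece = 0.0
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return ece
```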

Framework versions