
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

39-tiny_tobacco3482_kd_CEKD_t2.5_a0.5

This model is a fine-tuned version of WinKawaks/vit-tiny-patch16-224. The Trainer did not record the training dataset (it appears as "None" in the auto-generated metadata); the model name suggests the Tobacco3482 document-image dataset and a knowledge-distillation setup (CEKD, temperature 2.5, alpha 0.5). The evaluation-set summary that normally follows this sentence was not captured; the final rows of the training results table below give the last recorded validation metrics.
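
Usage instructions are not included in this card. Since the base model is a ViT image classifier, the checkpoint can most likely be loaded with the standard transformers image-classification classes. The sketch below is illustrative only: the repository id is a placeholder and the input file name is hypothetical.

```python
# Minimal inference sketch, assuming this checkpoint is a standard ViT
# image-classification model saved in the transformers format.
# The repository id is a placeholder -- substitute the actual namespace/name.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "<namespace>/39-tiny_tobacco3482_kd_CEKD_t2.5_a0.5"  # placeholder

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("document.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = model.config.id2label[logits.argmax(-1).item()]
print(predicted_class)
```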

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The hyperparameter values were not captured when this card was generated. Based on the model name (kd_CEKD_t2.5_a0.5), training appears to have used knowledge distillation with a combined cross-entropy/KD objective at temperature 2.5 and mixing weight 0.5; a sketch of that loss follows.
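
A minimal sketch of the cross-entropy knowledge-distillation (CEKD) objective implied by the model name, assuming the "t2.5" and "a0.5" suffixes denote distillation temperature and mixing weight. The actual training script and exact loss formulation are not documented here, so treat this purely as an illustration.

```python
# Illustrative CE + KD loss; a generic formulation, not the code used
# to train this model. Temperature and alpha are assumptions taken from
# the model name.
import torch
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, temperature=2.5, alpha=0.5):
    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # KL divergence between temperature-softened student and teacher
    # distributions, scaled by T^2 as in Hinton et al. (2015).
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Blend the two terms with the mixing weight alpha.
    return alpha * ce + (1.0 - alpha) * kd

# Example call with random tensors, just to show the expected shapes.
loss = cekd_loss(torch.randn(4, 10), torch.randn(4, 10), torch.randint(0, 10, (4,)))
```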

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.8936 | 0.11 | 1.0097 | 8.5078 | 0.11 | 0.0902 | 0.3251 | 0.8953 |
| No log | 2.0 | 14 | 1.2597 | 0.16 | 0.8753 | 5.5353 | 0.16 | 0.1308 | 0.2539 | 0.7902 |
| No log | 3.0 | 21 | 1.0230 | 0.355 | 0.7833 | 5.1396 | 0.3550 | 0.2872 | 0.2810 | 0.4521 |
| No log | 4.0 | 28 | 0.8743 | 0.545 | 0.6497 | 3.0412 | 0.545 | 0.4328 | 0.3224 | 0.2775 |
| No log | 5.0 | 35 | 0.8020 | 0.625 | 0.5958 | 2.6099 | 0.625 | 0.5465 | 0.3186 | 0.2136 |
| No log | 6.0 | 42 | 0.7221 | 0.675 | 0.5300 | 2.3085 | 0.675 | 0.5632 | 0.3257 | 0.1562 |
| No log | 7.0 | 49 | 0.6964 | 0.68 | 0.4843 | 1.9033 | 0.68 | 0.5761 | 0.3039 | 0.1453 |
| No log | 8.0 | 56 | 0.6729 | 0.72 | 0.4598 | 1.8200 | 0.72 | 0.6195 | 0.3089 | 0.1170 |
| No log | 9.0 | 63 | 0.6470 | 0.77 | 0.4318 | 1.5607 | 0.7700 | 0.7058 | 0.3518 | 0.0897 |
| No log | 10.0 | 70 | 0.5889 | 0.795 | 0.4019 | 1.1238 | 0.795 | 0.7546 | 0.3324 | 0.0675 |
| No log | 11.0 | 77 | 0.5829 | 0.795 | 0.4013 | 1.0267 | 0.795 | 0.7667 | 0.3142 | 0.0728 |
| No log | 12.0 | 84 | 0.5763 | 0.785 | 0.3923 | 1.2697 | 0.785 | 0.7655 | 0.3286 | 0.0751 |
| No log | 13.0 | 91 | 0.5854 | 0.765 | 0.3934 | 1.4915 | 0.765 | 0.7291 | 0.2936 | 0.0806 |
| No log | 14.0 | 98 | 0.5779 | 0.795 | 0.3983 | 1.2207 | 0.795 | 0.7409 | 0.3141 | 0.0681 |
| No log | 15.0 | 105 | 0.5564 | 0.795 | 0.3752 | 1.1974 | 0.795 | 0.7687 | 0.3201 | 0.0626 |
| No log | 16.0 | 112 | 0.5599 | 0.815 | 0.3945 | 1.0987 | 0.815 | 0.7827 | 0.3233 | 0.0618 |
| No log | 17.0 | 119 | 0.5748 | 0.77 | 0.4001 | 1.2395 | 0.7700 | 0.7497 | 0.3136 | 0.0866 |
| No log | 18.0 | 126 | 0.5611 | 0.79 | 0.4028 | 1.3279 | 0.79 | 0.7738 | 0.3127 | 0.0680 |
| No log | 19.0 | 133 | 0.5514 | 0.805 | 0.4063 | 0.8598 | 0.805 | 0.7873 | 0.3656 | 0.0575 |
| No log | 20.0 | 140 | 0.5566 | 0.81 | 0.4028 | 0.9944 | 0.81 | 0.7943 | 0.3449 | 0.0676 |
| No log | 21.0 | 147 | 0.5489 | 0.81 | 0.3879 | 1.1351 | 0.81 | 0.7966 | 0.3432 | 0.0682 |
| No log | 22.0 | 154 | 0.5586 | 0.82 | 0.4091 | 1.1107 | 0.82 | 0.7894 | 0.3526 | 0.0580 |
| No log | 23.0 | 161 | 0.5593 | 0.795 | 0.4131 | 1.1693 | 0.795 | 0.7765 | 0.3483 | 0.0641 |
| No log | 24.0 | 168 | 0.5493 | 0.79 | 0.3962 | 1.2363 | 0.79 | 0.7740 | 0.3494 | 0.0646 |
| No log | 25.0 | 175 | 0.5489 | 0.8 | 0.3930 | 1.0310 | 0.8000 | 0.7638 | 0.3342 | 0.0614 |
| No log | 26.0 | 182 | 0.5492 | 0.79 | 0.3944 | 1.3201 | 0.79 | 0.7670 | 0.3096 | 0.0667 |
| No log | 27.0 | 189 | 0.5441 | 0.805 | 0.4002 | 1.1304 | 0.805 | 0.7886 | 0.3528 | 0.0600 |
| No log | 28.0 | 196 | 0.5397 | 0.815 | 0.3960 | 1.1210 | 0.815 | 0.7902 | 0.3630 | 0.0544 |
| No log | 29.0 | 203 | 0.5418 | 0.785 | 0.3977 | 0.9580 | 0.785 | 0.7575 | 0.3536 | 0.0646 |
| No log | 30.0 | 210 | 0.5374 | 0.815 | 0.3931 | 1.0186 | 0.815 | 0.7855 | 0.3422 | 0.0604 |
| No log | 31.0 | 217 | 0.5405 | 0.815 | 0.3983 | 0.8948 | 0.815 | 0.7980 | 0.3671 | 0.0531 |
| No log | 32.0 | 224 | 0.5394 | 0.805 | 0.3998 | 1.0680 | 0.805 | 0.7841 | 0.3695 | 0.0568 |
| No log | 33.0 | 231 | 0.5296 | 0.81 | 0.3868 | 1.1222 | 0.81 | 0.7891 | 0.3530 | 0.0545 |
| No log | 34.0 | 238 | 0.5338 | 0.81 | 0.3952 | 1.1333 | 0.81 | 0.7825 | 0.3453 | 0.0559 |
| No log | 35.0 | 245 | 0.5339 | 0.805 | 0.3941 | 0.8600 | 0.805 | 0.7905 | 0.3552 | 0.0554 |
| No log | 36.0 | 252 | 0.5332 | 0.81 | 0.3918 | 0.9018 | 0.81 | 0.7996 | 0.3669 | 0.0527 |
| No log | 37.0 | 259 | 0.5336 | 0.79 | 0.3907 | 0.7768 | 0.79 | 0.7612 | 0.3374 | 0.0611 |
| No log | 38.0 | 266 | 0.5327 | 0.805 | 0.3906 | 0.9987 | 0.805 | 0.7750 | 0.3430 | 0.0564 |
| No log | 39.0 | 273 | 0.5342 | 0.805 | 0.3898 | 1.1024 | 0.805 | 0.7837 | 0.3295 | 0.0563 |
| No log | 40.0 | 280 | 0.5310 | 0.81 | 0.3906 | 0.8426 | 0.81 | 0.7820 | 0.3513 | 0.0556 |
| No log | 41.0 | 287 | 0.5327 | 0.81 | 0.3950 | 1.0952 | 0.81 | 0.7927 | 0.3418 | 0.0570 |
| No log | 42.0 | 294 | 0.5305 | 0.82 | 0.3961 | 0.7830 | 0.82 | 0.8011 | 0.3501 | 0.0545 |
| No log | 43.0 | 301 | 0.5308 | 0.81 | 0.3926 | 0.9752 | 0.81 | 0.7907 | 0.3534 | 0.0573 |
| No log | 44.0 | 308 | 0.5287 | 0.81 | 0.3898 | 0.9838 | 0.81 | 0.7904 | 0.3454 | 0.0570 |
| No log | 45.0 | 315 | 0.5270 | 0.815 | 0.3890 | 0.8682 | 0.815 | 0.8004 | 0.3499 | 0.0543 |
| No log | 46.0 | 322 | 0.5272 | 0.81 | 0.3884 | 0.9784 | 0.81 | 0.7827 | 0.3415 | 0.0541 |
| No log | 47.0 | 329 | 0.5306 | 0.805 | 0.3900 | 1.1153 | 0.805 | 0.7800 | 0.3388 | 0.0571 |
| No log | 48.0 | 336 | 0.5288 | 0.82 | 0.3915 | 0.9916 | 0.82 | 0.7912 | 0.3519 | 0.0527 |
| No log | 49.0 | 343 | 0.5274 | 0.81 | 0.3886 | 0.8415 | 0.81 | 0.7855 | 0.3524 | 0.0550 |
| No log | 50.0 | 350 | 0.5264 | 0.81 | 0.3868 | 0.9713 | 0.81 | 0.7907 | 0.3408 | 0.0559 |
| No log | 51.0 | 357 | 0.5295 | 0.815 | 0.3916 | 1.0340 | 0.815 | 0.7933 | 0.3683 | 0.0536 |
| No log | 52.0 | 364 | 0.5294 | 0.81 | 0.3920 | 0.9178 | 0.81 | 0.7854 | 0.3499 | 0.0563 |
| No log | 53.0 | 371 | 0.5283 | 0.81 | 0.3912 | 0.8517 | 0.81 | 0.7907 | 0.3648 | 0.0540 |
| No log | 54.0 | 378 | 0.5301 | 0.815 | 0.3927 | 0.9279 | 0.815 | 0.7933 | 0.3579 | 0.0558 |
| No log | 55.0 | 385 | 0.5275 | 0.805 | 0.3888 | 0.9225 | 0.805 | 0.7800 | 0.3406 | 0.0553 |
| No log | 56.0 | 392 | 0.5284 | 0.815 | 0.3903 | 0.9064 | 0.815 | 0.7933 | 0.3463 | 0.0551 |
| No log | 57.0 | 399 | 0.5261 | 0.81 | 0.3872 | 0.9072 | 0.81 | 0.7907 | 0.3527 | 0.0551 |
| No log | 58.0 | 406 | 0.5278 | 0.815 | 0.3900 | 0.8469 | 0.815 | 0.7966 | 0.3622 | 0.0526 |
| No log | 59.0 | 413 | 0.5280 | 0.81 | 0.3900 | 0.9220 | 0.81 | 0.7907 | 0.3467 | 0.0551 |
| No log | 60.0 | 420 | 0.5296 | 0.81 | 0.3932 | 0.9166 | 0.81 | 0.7907 | 0.3620 | 0.0555 |
| No log | 61.0 | 427 | 0.5288 | 0.815 | 0.3925 | 0.8647 | 0.815 | 0.7966 | 0.3491 | 0.0529 |
| No log | 62.0 | 434 | 0.5288 | 0.81 | 0.3909 | 0.9205 | 0.81 | 0.7907 | 0.3482 | 0.0552 |
| No log | 63.0 | 441 | 0.5274 | 0.81 | 0.3889 | 0.9143 | 0.81 | 0.7907 | 0.3457 | 0.0541 |
| No log | 64.0 | 448 | 0.5283 | 0.81 | 0.3905 | 0.9141 | 0.81 | 0.7907 | 0.3578 | 0.0549 |
| No log | 65.0 | 455 | 0.5283 | 0.81 | 0.3907 | 0.9177 | 0.81 | 0.7907 | 0.3536 | 0.0548 |
| No log | 66.0 | 462 | 0.5289 | 0.81 | 0.3912 | 0.9179 | 0.81 | 0.7907 | 0.3502 | 0.0550 |
| No log | 67.0 | 469 | 0.5282 | 0.81 | 0.3903 | 0.9134 | 0.81 | 0.7907 | 0.3511 | 0.0547 |
| No log | 68.0 | 476 | 0.5279 | 0.81 | 0.3901 | 0.9105 | 0.81 | 0.7907 | 0.3473 | 0.0541 |
| No log | 69.0 | 483 | 0.5283 | 0.81 | 0.3907 | 0.9128 | 0.81 | 0.7907 | 0.3558 | 0.0539 |
| No log | 70.0 | 490 | 0.5283 | 0.81 | 0.3904 | 0.9191 | 0.81 | 0.7907 | 0.3414 | 0.0543 |
| No log | 71.0 | 497 | 0.5284 | 0.81 | 0.3905 | 0.9183 | 0.81 | 0.7907 | 0.3478 | 0.0546 |
| 0.3962 | 72.0 | 504 | 0.5285 | 0.81 | 0.3909 | 0.9151 | 0.81 | 0.7907 | 0.3415 | 0.0545 |
| 0.3962 | 73.0 | 511 | 0.5283 | 0.81 | 0.3906 | 0.9144 | 0.81 | 0.7907 | 0.3499 | 0.0542 |
| 0.3962 | 74.0 | 518 | 0.5282 | 0.81 | 0.3903 | 0.9146 | 0.81 | 0.7907 | 0.3411 | 0.0541 |
| 0.3962 | 75.0 | 525 | 0.5284 | 0.81 | 0.3909 | 0.9159 | 0.81 | 0.7907 | 0.3571 | 0.0542 |
| 0.3962 | 76.0 | 532 | 0.5284 | 0.81 | 0.3906 | 0.9155 | 0.81 | 0.7907 | 0.3361 | 0.0543 |
| 0.3962 | 77.0 | 539 | 0.5283 | 0.81 | 0.3906 | 0.9159 | 0.81 | 0.7907 | 0.3480 | 0.0541 |
| 0.3962 | 78.0 | 546 | 0.5282 | 0.81 | 0.3905 | 0.9120 | 0.81 | 0.7907 | 0.3413 | 0.0540 |
| 0.3962 | 79.0 | 553 | 0.5283 | 0.81 | 0.3905 | 0.9162 | 0.81 | 0.7907 | 0.3412 | 0.0542 |
| 0.3962 | 80.0 | 560 | 0.5285 | 0.81 | 0.3907 | 0.9189 | 0.81 | 0.7907 | 0.3361 | 0.0543 |
| 0.3962 | 81.0 | 567 | 0.5285 | 0.81 | 0.3907 | 0.9162 | 0.81 | 0.7907 | 0.3470 | 0.0541 |
| 0.3962 | 82.0 | 574 | 0.5283 | 0.81 | 0.3904 | 0.9144 | 0.81 | 0.7907 | 0.3411 | 0.0540 |
| 0.3962 | 83.0 | 581 | 0.5284 | 0.81 | 0.3906 | 0.9153 | 0.81 | 0.7907 | 0.3361 | 0.0542 |
| 0.3962 | 84.0 | 588 | 0.5284 | 0.81 | 0.3907 | 0.9151 | 0.81 | 0.7907 | 0.3419 | 0.0542 |
| 0.3962 | 85.0 | 595 | 0.5283 | 0.81 | 0.3905 | 0.9143 | 0.81 | 0.7907 | 0.3362 | 0.0541 |
| 0.3962 | 86.0 | 602 | 0.5285 | 0.81 | 0.3908 | 0.9152 | 0.81 | 0.7907 | 0.3418 | 0.0540 |
| 0.3962 | 87.0 | 609 | 0.5284 | 0.81 | 0.3907 | 0.9156 | 0.81 | 0.7907 | 0.3365 | 0.0543 |
| 0.3962 | 88.0 | 616 | 0.5285 | 0.81 | 0.3907 | 0.9155 | 0.81 | 0.7907 | 0.3419 | 0.0541 |
| 0.3962 | 89.0 | 623 | 0.5284 | 0.81 | 0.3906 | 0.9154 | 0.81 | 0.7907 | 0.3360 | 0.0541 |
| 0.3962 | 90.0 | 630 | 0.5285 | 0.81 | 0.3907 | 0.9168 | 0.81 | 0.7907 | 0.3418 | 0.0543 |
| 0.3962 | 91.0 | 637 | 0.5285 | 0.81 | 0.3907 | 0.9160 | 0.81 | 0.7907 | 0.3420 | 0.0543 |
| 0.3962 | 92.0 | 644 | 0.5285 | 0.81 | 0.3908 | 0.9164 | 0.81 | 0.7907 | 0.3421 | 0.0541 |
| 0.3962 | 93.0 | 651 | 0.5285 | 0.81 | 0.3907 | 0.9164 | 0.81 | 0.7907 | 0.3473 | 0.0542 |
| 0.3962 | 94.0 | 658 | 0.5285 | 0.81 | 0.3907 | 0.9164 | 0.81 | 0.7907 | 0.3420 | 0.0542 |
| 0.3962 | 95.0 | 665 | 0.5285 | 0.81 | 0.3907 | 0.9161 | 0.81 | 0.7907 | 0.3473 | 0.0541 |
| 0.3962 | 96.0 | 672 | 0.5285 | 0.81 | 0.3907 | 0.9157 | 0.81 | 0.7907 | 0.3421 | 0.0542 |
| 0.3962 | 97.0 | 679 | 0.5285 | 0.81 | 0.3907 | 0.9154 | 0.81 | 0.7907 | 0.3363 | 0.0542 |
| 0.3962 | 98.0 | 686 | 0.5285 | 0.81 | 0.3907 | 0.9164 | 0.81 | 0.7907 | 0.3420 | 0.0542 |
| 0.3962 | 99.0 | 693 | 0.5285 | 0.81 | 0.3907 | 0.9162 | 0.81 | 0.7907 | 0.3420 | 0.0542 |
| 0.3962 | 100.0 | 700 | 0.5285 | 0.81 | 0.3907 | 0.9159 | 0.81 | 0.7907 | 0.3421 | 0.0542 |
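
The calibration-related columns above (Brier Loss, ECE) follow the standard definitions of those metrics. As a reference, the sketch below shows how they can be computed from predicted class probabilities; the bin count and binning scheme are assumptions, and this is not the evaluation code that produced the table.

```python
# Generic multi-class Brier score and expected calibration error (ECE).
# Bin count and equal-width binning are assumptions for illustration.
import numpy as np

def brier_score(probs, labels):
    # Mean squared error between the predicted distribution and the
    # one-hot encoding of the true label, averaged over samples.
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=15):
    # Weighted average gap between confidence and accuracy per confidence bin.
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return ece
```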

Framework versions