---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# 171-tiny_tobacco3482_kd_CEKD_t2.5_a0.5

This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set (final epoch; see the training results table below):
- Loss: 0.4688
- Accuracy: 0.815
- Brier Loss: 0.3067
- Nll: 1.4679
- F1 Micro: 0.815
- F1 Macro: 0.7970
- Ece: 0.2440
- Aurc: 0.0500
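A minimal inference sketch with the `transformers` library is shown below. The checkpoint path `171-tiny_tobacco3482_kd_CEKD_t2.5_a0.5` and the input file name are placeholders assumed for illustration; substitute the actual repository id or local directory where the fine-tuned weights are stored.

```python
# Minimal inference sketch; the checkpoint path and image file are assumed placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

checkpoint = "171-tiny_tobacco3482_kd_CEKD_t2.5_a0.5"  # assumed repo id or local path

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

# Load an example document image and preprocess it for the ViT backbone.
image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its class label.
predicted_class = model.config.id2label[logits.argmax(-1).item()]
print(predicted_class)
```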

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure
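The `kd_CEKD_t2.5_a0.5` suffix of the model name suggests knowledge distillation with a combined cross-entropy + KD objective at temperature 2.5 and mixing weight 0.5. The actual training code is not included in this card; the sketch below only illustrates that assumed objective, and the function name `ce_kd_loss` is hypothetical.

```python
# Hypothetical CE + KD loss, assuming temperature T=2.5 and weight alpha=0.5
# as suggested by the model name; the real training objective may differ.
import torch
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, temperature=2.5, alpha=0.5):
    # Standard cross-entropy against the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    # KL divergence between temperature-softened student and teacher distributions,
    # scaled by T^2 as in Hinton et al. (2015).
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Blend the hard-label and distillation terms.
    return alpha * ce + (1.0 - alpha) * kd
```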

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 1.6008 | 0.23 | 0.8921 | 8.0367 | 0.23 | 0.1380 | 0.3153 | 0.7486 |
| No log | 2.0 | 26 | 1.1383 | 0.445 | 0.6997 | 3.6320 | 0.445 | 0.3583 | 0.2866 | 0.3390 |
| No log | 3.0 | 39 | 0.9781 | 0.555 | 0.5896 | 2.1989 | 0.555 | 0.4763 | 0.2856 | 0.2440 |
| No log | 4.0 | 52 | 0.7953 | 0.65 | 0.4796 | 1.7904 | 0.65 | 0.5880 | 0.2308 | 0.1417 |
| No log | 5.0 | 65 | 0.7282 | 0.705 | 0.4370 | 1.4923 | 0.705 | 0.6654 | 0.2538 | 0.1123 |
| No log | 6.0 | 78 | 0.6794 | 0.73 | 0.3987 | 1.5706 | 0.7300 | 0.6928 | 0.2386 | 0.1041 |
| No log | 7.0 | 91 | 0.6813 | 0.73 | 0.4024 | 1.6519 | 0.7300 | 0.6984 | 0.2553 | 0.1027 |
| No log | 8.0 | 104 | 0.6669 | 0.72 | 0.3910 | 1.6057 | 0.72 | 0.6811 | 0.2234 | 0.0990 |
| No log | 9.0 | 117 | 0.7152 | 0.72 | 0.4167 | 1.9716 | 0.72 | 0.7201 | 0.2259 | 0.1091 |
| No log | 10.0 | 130 | 0.6722 | 0.745 | 0.3751 | 1.9561 | 0.745 | 0.7290 | 0.2362 | 0.0849 |
| No log | 11.0 | 143 | 0.6263 | 0.75 | 0.3817 | 1.8594 | 0.75 | 0.7238 | 0.2511 | 0.0980 |
| No log | 12.0 | 156 | 0.6259 | 0.725 | 0.3946 | 1.8363 | 0.7250 | 0.6835 | 0.2186 | 0.0974 |
| No log | 13.0 | 169 | 0.5756 | 0.77 | 0.3487 | 1.3847 | 0.7700 | 0.7171 | 0.2271 | 0.0723 |
| No log | 14.0 | 182 | 0.5670 | 0.76 | 0.3492 | 1.7986 | 0.76 | 0.7323 | 0.2201 | 0.0713 |
| No log | 15.0 | 195 | 0.5538 | 0.785 | 0.3532 | 1.6319 | 0.785 | 0.7608 | 0.2479 | 0.0629 |
| No log | 16.0 | 208 | 0.5634 | 0.75 | 0.3582 | 1.5131 | 0.75 | 0.7397 | 0.2333 | 0.0747 |
| No log | 17.0 | 221 | 0.5348 | 0.77 | 0.3378 | 1.5843 | 0.7700 | 0.7421 | 0.2193 | 0.0646 |
| No log | 18.0 | 234 | 0.5306 | 0.78 | 0.3310 | 1.6298 | 0.78 | 0.7618 | 0.2290 | 0.0644 |
| No log | 19.0 | 247 | 0.5185 | 0.805 | 0.3400 | 1.4945 | 0.805 | 0.7755 | 0.2627 | 0.0622 |
| No log | 20.0 | 260 | 0.5335 | 0.76 | 0.3402 | 1.5758 | 0.76 | 0.7108 | 0.2372 | 0.0699 |
| No log | 21.0 | 273 | 0.5191 | 0.76 | 0.3389 | 1.3860 | 0.76 | 0.7413 | 0.2587 | 0.0661 |
| No log | 22.0 | 286 | 0.5198 | 0.785 | 0.3423 | 1.4790 | 0.785 | 0.7607 | 0.2513 | 0.0649 |
| No log | 23.0 | 299 | 0.5155 | 0.79 | 0.3344 | 1.5003 | 0.79 | 0.7648 | 0.2393 | 0.0671 |
| No log | 24.0 | 312 | 0.5156 | 0.775 | 0.3380 | 1.5898 | 0.775 | 0.7388 | 0.2295 | 0.0667 |
| No log | 25.0 | 325 | 0.4808 | 0.815 | 0.3033 | 1.4602 | 0.815 | 0.7837 | 0.2520 | 0.0520 |
| No log | 26.0 | 338 | 0.4975 | 0.785 | 0.3325 | 1.3864 | 0.785 | 0.7563 | 0.2298 | 0.0673 |
| No log | 27.0 | 351 | 0.4988 | 0.785 | 0.3257 | 1.5206 | 0.785 | 0.7717 | 0.2156 | 0.0638 |
| No log | 28.0 | 364 | 0.4928 | 0.795 | 0.3209 | 1.3717 | 0.795 | 0.7719 | 0.2303 | 0.0612 |
| No log | 29.0 | 377 | 0.4660 | 0.81 | 0.3022 | 1.2190 | 0.81 | 0.7864 | 0.2285 | 0.0485 |
| No log | 30.0 | 390 | 0.4777 | 0.815 | 0.3123 | 1.4266 | 0.815 | 0.7926 | 0.2535 | 0.0562 |
| No log | 31.0 | 403 | 0.4695 | 0.82 | 0.3067 | 1.3425 | 0.82 | 0.8000 | 0.2338 | 0.0528 |
| No log | 32.0 | 416 | 0.4701 | 0.815 | 0.3026 | 1.3247 | 0.815 | 0.7893 | 0.2259 | 0.0522 |
| No log | 33.0 | 429 | 0.4625 | 0.82 | 0.3023 | 1.2646 | 0.82 | 0.7915 | 0.2441 | 0.0486 |
| No log | 34.0 | 442 | 0.4684 | 0.81 | 0.3080 | 1.3468 | 0.81 | 0.7846 | 0.2373 | 0.0521 |
| No log | 35.0 | 455 | 0.4629 | 0.81 | 0.3000 | 1.3441 | 0.81 | 0.7869 | 0.2375 | 0.0492 |
| No log | 36.0 | 468 | 0.4680 | 0.81 | 0.3074 | 1.2158 | 0.81 | 0.7894 | 0.2417 | 0.0508 |
| No log | 37.0 | 481 | 0.4672 | 0.81 | 0.3053 | 1.3329 | 0.81 | 0.7866 | 0.2320 | 0.0508 |
| No log | 38.0 | 494 | 0.4716 | 0.805 | 0.3091 | 1.2975 | 0.805 | 0.7863 | 0.2361 | 0.0545 |
| 0.3111 | 39.0 | 507 | 0.4703 | 0.805 | 0.3081 | 1.2855 | 0.805 | 0.7863 | 0.2473 | 0.0534 |
| 0.3111 | 40.0 | 520 | 0.4692 | 0.81 | 0.3073 | 1.2833 | 0.81 | 0.7894 | 0.2361 | 0.0525 |
| 0.3111 | 41.0 | 533 | 0.4681 | 0.81 | 0.3068 | 1.2804 | 0.81 | 0.7890 | 0.2386 | 0.0517 |
| 0.3111 | 42.0 | 546 | 0.4672 | 0.81 | 0.3058 | 1.4597 | 0.81 | 0.7898 | 0.2276 | 0.0521 |
| 0.3111 | 43.0 | 559 | 0.4691 | 0.81 | 0.3080 | 1.4136 | 0.81 | 0.7894 | 0.2280 | 0.0520 |
| 0.3111 | 44.0 | 572 | 0.4664 | 0.815 | 0.3048 | 1.4593 | 0.815 | 0.7921 | 0.2459 | 0.0509 |
| 0.3111 | 45.0 | 585 | 0.4684 | 0.81 | 0.3069 | 1.4071 | 0.81 | 0.7894 | 0.2415 | 0.0514 |
| 0.3111 | 46.0 | 598 | 0.4688 | 0.81 | 0.3066 | 1.4084 | 0.81 | 0.7890 | 0.2174 | 0.0516 |
| 0.3111 | 47.0 | 611 | 0.4683 | 0.81 | 0.3061 | 1.4052 | 0.81 | 0.7890 | 0.2406 | 0.0515 |
| 0.3111 | 48.0 | 624 | 0.4677 | 0.81 | 0.3065 | 1.4045 | 0.81 | 0.7890 | 0.2346 | 0.0508 |
| 0.3111 | 49.0 | 637 | 0.4679 | 0.81 | 0.3058 | 1.4072 | 0.81 | 0.7890 | 0.2177 | 0.0507 |
| 0.3111 | 50.0 | 650 | 0.4679 | 0.81 | 0.3061 | 1.4681 | 0.81 | 0.7890 | 0.2619 | 0.0510 |
| 0.3111 | 51.0 | 663 | 0.4688 | 0.81 | 0.3068 | 1.4662 | 0.81 | 0.7890 | 0.2325 | 0.0513 |
| 0.3111 | 52.0 | 676 | 0.4679 | 0.81 | 0.3063 | 1.4062 | 0.81 | 0.7890 | 0.2257 | 0.0508 |
| 0.3111 | 53.0 | 689 | 0.4682 | 0.81 | 0.3064 | 1.4667 | 0.81 | 0.7890 | 0.2279 | 0.0512 |
| 0.3111 | 54.0 | 702 | 0.4674 | 0.81 | 0.3058 | 1.4075 | 0.81 | 0.7890 | 0.2269 | 0.0507 |
| 0.3111 | 55.0 | 715 | 0.4689 | 0.81 | 0.3069 | 1.4674 | 0.81 | 0.7890 | 0.2428 | 0.0511 |
| 0.3111 | 56.0 | 728 | 0.4678 | 0.81 | 0.3062 | 1.4081 | 0.81 | 0.7890 | 0.2402 | 0.0507 |
| 0.3111 | 57.0 | 741 | 0.4691 | 0.81 | 0.3069 | 1.4691 | 0.81 | 0.7890 | 0.2279 | 0.0511 |
| 0.3111 | 58.0 | 754 | 0.4686 | 0.81 | 0.3067 | 1.4114 | 0.81 | 0.7890 | 0.2647 | 0.0510 |
| 0.3111 | 59.0 | 767 | 0.4688 | 0.81 | 0.3069 | 1.4130 | 0.81 | 0.7890 | 0.2416 | 0.0510 |
| 0.3111 | 60.0 | 780 | 0.4685 | 0.81 | 0.3065 | 1.4206 | 0.81 | 0.7890 | 0.2278 | 0.0509 |
| 0.3111 | 61.0 | 793 | 0.4688 | 0.81 | 0.3069 | 1.4145 | 0.81 | 0.7890 | 0.2307 | 0.0513 |
| 0.3111 | 62.0 | 806 | 0.4690 | 0.81 | 0.3070 | 1.4681 | 0.81 | 0.7890 | 0.2437 | 0.0510 |
| 0.3111 | 63.0 | 819 | 0.4688 | 0.81 | 0.3068 | 1.4680 | 0.81 | 0.7890 | 0.2465 | 0.0510 |
| 0.3111 | 64.0 | 832 | 0.4681 | 0.81 | 0.3062 | 1.4670 | 0.81 | 0.7890 | 0.2565 | 0.0507 |
| 0.3111 | 65.0 | 845 | 0.4690 | 0.81 | 0.3069 | 1.4675 | 0.81 | 0.7890 | 0.2444 | 0.0510 |
| 0.3111 | 66.0 | 858 | 0.4688 | 0.81 | 0.3069 | 1.4673 | 0.81 | 0.7890 | 0.2433 | 0.0510 |
| 0.3111 | 67.0 | 871 | 0.4686 | 0.81 | 0.3066 | 1.4676 | 0.81 | 0.7890 | 0.2560 | 0.0507 |
| 0.3111 | 68.0 | 884 | 0.4684 | 0.81 | 0.3064 | 1.4667 | 0.81 | 0.7890 | 0.2496 | 0.0506 |
| 0.3111 | 69.0 | 897 | 0.4686 | 0.81 | 0.3066 | 1.4675 | 0.81 | 0.7890 | 0.2407 | 0.0507 |
| 0.3111 | 70.0 | 910 | 0.4689 | 0.81 | 0.3068 | 1.4679 | 0.81 | 0.7890 | 0.2502 | 0.0508 |
| 0.3111 | 71.0 | 923 | 0.4690 | 0.81 | 0.3071 | 1.4687 | 0.81 | 0.7890 | 0.2445 | 0.0507 |
| 0.3111 | 72.0 | 936 | 0.4688 | 0.81 | 0.3068 | 1.4678 | 0.81 | 0.7890 | 0.2500 | 0.0506 |
| 0.3111 | 73.0 | 949 | 0.4689 | 0.81 | 0.3068 | 1.4685 | 0.81 | 0.7890 | 0.2662 | 0.0510 |
| 0.3111 | 74.0 | 962 | 0.4687 | 0.81 | 0.3067 | 1.4679 | 0.81 | 0.7890 | 0.2496 | 0.0507 |
| 0.3111 | 75.0 | 975 | 0.4688 | 0.81 | 0.3067 | 1.4683 | 0.81 | 0.7890 | 0.2468 | 0.0508 |
| 0.3111 | 76.0 | 988 | 0.4688 | 0.81 | 0.3067 | 1.4676 | 0.81 | 0.7890 | 0.2511 | 0.0508 |
| 0.1126 | 77.0 | 1001 | 0.4689 | 0.81 | 0.3068 | 1.4672 | 0.81 | 0.7890 | 0.2365 | 0.0506 |
| 0.1126 | 78.0 | 1014 | 0.4688 | 0.81 | 0.3066 | 1.4681 | 0.81 | 0.7890 | 0.2507 | 0.0507 |
| 0.1126 | 79.0 | 1027 | 0.4688 | 0.81 | 0.3068 | 1.4680 | 0.81 | 0.7890 | 0.2498 | 0.0508 |
| 0.1126 | 80.0 | 1040 | 0.4689 | 0.81 | 0.3068 | 1.4676 | 0.81 | 0.7890 | 0.2497 | 0.0507 |
| 0.1126 | 81.0 | 1053 | 0.4690 | 0.81 | 0.3068 | 1.4682 | 0.81 | 0.7890 | 0.2338 | 0.0506 |
| 0.1126 | 82.0 | 1066 | 0.4686 | 0.81 | 0.3065 | 1.4682 | 0.81 | 0.7890 | 0.2541 | 0.0505 |
| 0.1126 | 83.0 | 1079 | 0.4689 | 0.815 | 0.3067 | 1.4675 | 0.815 | 0.7970 | 0.2503 | 0.0501 |
| 0.1126 | 84.0 | 1092 | 0.4687 | 0.815 | 0.3065 | 1.4676 | 0.815 | 0.7970 | 0.2567 | 0.0501 |
| 0.1126 | 85.0 | 1105 | 0.4689 | 0.81 | 0.3067 | 1.4680 | 0.81 | 0.7890 | 0.2678 | 0.0507 |
| 0.1126 | 86.0 | 1118 | 0.4689 | 0.815 | 0.3067 | 1.4684 | 0.815 | 0.7970 | 0.2566 | 0.0502 |
| 0.1126 | 87.0 | 1131 | 0.4687 | 0.815 | 0.3066 | 1.4672 | 0.815 | 0.7970 | 0.2529 | 0.0501 |
| 0.1126 | 88.0 | 1144 | 0.4689 | 0.815 | 0.3067 | 1.4680 | 0.815 | 0.7970 | 0.2569 | 0.0502 |
| 0.1126 | 89.0 | 1157 | 0.4688 | 0.815 | 0.3067 | 1.4678 | 0.815 | 0.7970 | 0.2527 | 0.0500 |
| 0.1126 | 90.0 | 1170 | 0.4689 | 0.815 | 0.3067 | 1.4681 | 0.815 | 0.7970 | 0.2527 | 0.0501 |
| 0.1126 | 91.0 | 1183 | 0.4688 | 0.815 | 0.3067 | 1.4683 | 0.815 | 0.7970 | 0.2527 | 0.0500 |
| 0.1126 | 92.0 | 1196 | 0.4688 | 0.815 | 0.3066 | 1.4675 | 0.815 | 0.7970 | 0.2528 | 0.0500 |
| 0.1126 | 93.0 | 1209 | 0.4689 | 0.815 | 0.3068 | 1.4680 | 0.815 | 0.7970 | 0.2527 | 0.0500 |
| 0.1126 | 94.0 | 1222 | 0.4688 | 0.815 | 0.3066 | 1.4678 | 0.815 | 0.7970 | 0.2440 | 0.0499 |
| 0.1126 | 95.0 | 1235 | 0.4688 | 0.815 | 0.3066 | 1.4677 | 0.815 | 0.7970 | 0.2440 | 0.0499 |
| 0.1126 | 96.0 | 1248 | 0.4688 | 0.815 | 0.3067 | 1.4681 | 0.815 | 0.7970 | 0.2528 | 0.0500 |
| 0.1126 | 97.0 | 1261 | 0.4688 | 0.815 | 0.3066 | 1.4679 | 0.815 | 0.7970 | 0.2440 | 0.0500 |
| 0.1126 | 98.0 | 1274 | 0.4689 | 0.815 | 0.3067 | 1.4680 | 0.815 | 0.7970 | 0.2440 | 0.0500 |
| 0.1126 | 99.0 | 1287 | 0.4689 | 0.815 | 0.3067 | 1.4679 | 0.815 | 0.7970 | 0.2440 | 0.0500 |
| 0.1126 | 100.0 | 1300 | 0.4688 | 0.815 | 0.3067 | 1.4679 | 0.815 | 0.7970 | 0.2440 | 0.0500 |
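
In addition to accuracy and F1, the table reports calibration-oriented metrics (Brier Loss, Nll, Ece, Aurc). As a reference for the less common columns, here is a minimal sketch of the multi-class Brier score and expected calibration error under their standard definitions; it is illustrative only and not the evaluation code that produced the numbers above.

```python
# Illustrative Brier score and ECE computations under their standard definitions;
# not the evaluation code used for this run.
import numpy as np

def brier_score(probs, labels):
    # Mean squared error between predicted class probabilities and one-hot labels.
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    # Average |accuracy - confidence| gap over equal-width confidence bins,
    # weighted by the fraction of samples in each bin.
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
    return ece
```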

### Framework versions