

# 300-tiny_tobacco3482_kd_CEKD_t2.5_a0.5

This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset. It achieves the following results on the evaluation set (final epoch; see the table below):

- Loss: 0.4770
- Accuracy: 0.82
- Brier Loss: 0.2875
- NLL: 1.3922
- F1 Micro: 0.82
- F1 Macro: 0.8020
- ECE: 0.2219
- AURC: 0.0517
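As a usage sketch (not verified against this exact checkpoint, and with the repository id assumed from the card name), the model can be loaded like any 🤗 Transformers image-classification checkpoint:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repository id, taken from the card name; adjust to the actual repo.
repo_id = "300-tiny_tobacco3482_kd_CEKD_t2.5_a0.5"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

# Classify a single document image.
image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label.get(predicted_class, predicted_class))
```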

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter values used during training were not captured in this card; more information needed.
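The model name suggests knowledge distillation with a combined cross-entropy/KD objective ("CEKD") at temperature 2.5 and alpha 0.5. The actual training code is not recorded here; the snippet below is only a generic PyTorch sketch of such a loss, with the temperature and alpha taken from the model name as assumptions:

```python
import torch
import torch.nn.functional as F

# Assumed from the model name ("t2.5", "a0.5"); not confirmed by the card.
TEMPERATURE = 2.5
ALPHA = 0.5

def ce_kd_loss(student_logits, teacher_logits, labels,
               temperature=TEMPERATURE, alpha=ALPHA):
    """Hard-label cross-entropy plus a soft distillation term.

    loss = alpha * CE(student, labels)
         + (1 - alpha) * T^2 * KL(teacher_soft || student_soft)
    """
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * ce + (1.0 - alpha) * kd
```

How the two terms are weighted (and whether a ViT teacher of a different size was used) is a guess based on the name only.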

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 1.0 | 13 | 1.7938 | 0.235 | 0.8938 | 7.8599 | 0.235 | 0.1394 | 0.3127 | 0.7433 |
| No log | 2.0 | 26 | 1.2738 | 0.455 | 0.6913 | 3.6965 | 0.455 | 0.3679 | 0.2904 | 0.3352 |
| No log | 3.0 | 39 | 1.0682 | 0.555 | 0.5748 | 2.0566 | 0.555 | 0.4968 | 0.2475 | 0.2296 |
| No log | 4.0 | 52 | 0.8509 | 0.655 | 0.4621 | 1.7782 | 0.655 | 0.6085 | 0.2242 | 0.1405 |
| No log | 5.0 | 65 | 0.7670 | 0.71 | 0.4142 | 1.4993 | 0.7100 | 0.6560 | 0.2271 | 0.1082 |
| No log | 6.0 | 78 | 0.7285 | 0.735 | 0.3857 | 1.5730 | 0.735 | 0.6874 | 0.2098 | 0.0996 |
| No log | 7.0 | 91 | 0.7052 | 0.72 | 0.3804 | 1.4916 | 0.72 | 0.6974 | 0.2249 | 0.0959 |
| No log | 8.0 | 104 | 0.7590 | 0.71 | 0.3925 | 1.8047 | 0.7100 | 0.6641 | 0.1956 | 0.1008 |
| No log | 9.0 | 117 | 0.7657 | 0.71 | 0.4006 | 1.8296 | 0.7100 | 0.7169 | 0.2330 | 0.1025 |
| No log | 10.0 | 130 | 0.6512 | 0.755 | 0.3514 | 1.5899 | 0.755 | 0.7256 | 0.1863 | 0.0853 |
| No log | 11.0 | 143 | 0.6615 | 0.775 | 0.3638 | 1.8180 | 0.775 | 0.7564 | 0.2106 | 0.0911 |
| No log | 12.0 | 156 | 0.6195 | 0.785 | 0.3398 | 1.6998 | 0.785 | 0.7419 | 0.2337 | 0.0643 |
| No log | 13.0 | 169 | 0.6065 | 0.78 | 0.3471 | 1.5917 | 0.78 | 0.7550 | 0.2280 | 0.0793 |
| No log | 14.0 | 182 | 0.6314 | 0.75 | 0.3486 | 1.9235 | 0.75 | 0.7315 | 0.2105 | 0.0755 |
| No log | 15.0 | 195 | 0.6426 | 0.745 | 0.3686 | 1.8633 | 0.745 | 0.7100 | 0.2099 | 0.0891 |
| No log | 16.0 | 208 | 0.5849 | 0.765 | 0.3476 | 1.3466 | 0.765 | 0.7505 | 0.1978 | 0.0827 |
| No log | 17.0 | 221 | 0.5604 | 0.79 | 0.3311 | 1.3948 | 0.79 | 0.7581 | 0.2258 | 0.0705 |
| No log | 18.0 | 234 | 0.5504 | 0.78 | 0.3230 | 1.4757 | 0.78 | 0.7712 | 0.2104 | 0.0624 |
| No log | 19.0 | 247 | 0.5586 | 0.785 | 0.3247 | 1.5297 | 0.785 | 0.7642 | 0.2159 | 0.0655 |
| No log | 20.0 | 260 | 0.5879 | 0.78 | 0.3366 | 1.5348 | 0.78 | 0.7727 | 0.2162 | 0.0716 |
| No log | 21.0 | 273 | 0.5558 | 0.805 | 0.3113 | 1.5720 | 0.805 | 0.7945 | 0.2161 | 0.0652 |
| No log | 22.0 | 286 | 0.5439 | 0.795 | 0.3258 | 1.7373 | 0.795 | 0.7883 | 0.2307 | 0.0745 |
| No log | 23.0 | 299 | 0.5155 | 0.795 | 0.3094 | 1.4183 | 0.795 | 0.7725 | 0.2221 | 0.0625 |
| No log | 24.0 | 312 | 0.5039 | 0.81 | 0.2994 | 1.4458 | 0.81 | 0.7830 | 0.2114 | 0.0624 |
| No log | 25.0 | 325 | 0.5142 | 0.81 | 0.3101 | 1.2798 | 0.81 | 0.7928 | 0.2205 | 0.0624 |
| No log | 26.0 | 338 | 0.5007 | 0.8 | 0.3100 | 1.2390 | 0.8000 | 0.7730 | 0.2038 | 0.0645 |
| No log | 27.0 | 351 | 0.4779 | 0.815 | 0.2865 | 1.3312 | 0.815 | 0.7863 | 0.2061 | 0.0518 |
| No log | 28.0 | 364 | 0.4893 | 0.825 | 0.2927 | 1.3993 | 0.825 | 0.8009 | 0.2219 | 0.0555 |
| No log | 29.0 | 377 | 0.4938 | 0.82 | 0.2996 | 1.4038 | 0.82 | 0.7888 | 0.2138 | 0.0586 |
| No log | 30.0 | 390 | 0.4668 | 0.82 | 0.2795 | 1.3366 | 0.82 | 0.7944 | 0.2217 | 0.0495 |
| No log | 31.0 | 403 | 0.4662 | 0.8 | 0.2805 | 1.1721 | 0.8000 | 0.7761 | 0.2009 | 0.0494 |
| No log | 32.0 | 416 | 0.4787 | 0.82 | 0.2887 | 1.3872 | 0.82 | 0.8043 | 0.2161 | 0.0542 |
| No log | 33.0 | 429 | 0.4842 | 0.81 | 0.2909 | 1.4774 | 0.81 | 0.7854 | 0.2246 | 0.0562 |
| No log | 34.0 | 442 | 0.4899 | 0.81 | 0.2979 | 1.4419 | 0.81 | 0.7843 | 0.2155 | 0.0607 |
| No log | 35.0 | 455 | 0.4832 | 0.815 | 0.2920 | 1.3892 | 0.815 | 0.7907 | 0.2296 | 0.0552 |
| No log | 36.0 | 468 | 0.4739 | 0.815 | 0.2869 | 1.2603 | 0.815 | 0.7932 | 0.2385 | 0.0532 |
| No log | 37.0 | 481 | 0.4747 | 0.81 | 0.2877 | 1.4390 | 0.81 | 0.7848 | 0.2163 | 0.0526 |
| No log | 38.0 | 494 | 0.4710 | 0.815 | 0.2842 | 1.3024 | 0.815 | 0.7885 | 0.2153 | 0.0516 |
| 0.2992 | 39.0 | 507 | 0.4712 | 0.81 | 0.2839 | 1.3676 | 0.81 | 0.7860 | 0.2282 | 0.0518 |
| 0.2992 | 40.0 | 520 | 0.4772 | 0.815 | 0.2883 | 1.3845 | 0.815 | 0.7953 | 0.2216 | 0.0527 |
| 0.2992 | 41.0 | 533 | 0.4751 | 0.82 | 0.2877 | 1.3207 | 0.82 | 0.8018 | 0.2177 | 0.0521 |
| 0.2992 | 42.0 | 546 | 0.4724 | 0.82 | 0.2860 | 1.3075 | 0.82 | 0.8018 | 0.2183 | 0.0508 |
| 0.2992 | 43.0 | 559 | 0.4745 | 0.82 | 0.2869 | 1.3079 | 0.82 | 0.8020 | 0.2184 | 0.0522 |
| 0.2992 | 44.0 | 572 | 0.4779 | 0.815 | 0.2884 | 1.4039 | 0.815 | 0.7922 | 0.2142 | 0.0531 |
| 0.2992 | 45.0 | 585 | 0.4738 | 0.82 | 0.2859 | 1.3153 | 0.82 | 0.8018 | 0.2079 | 0.0516 |
| 0.2992 | 46.0 | 598 | 0.4755 | 0.815 | 0.2874 | 1.3273 | 0.815 | 0.7922 | 0.2279 | 0.0526 |
| 0.2992 | 47.0 | 611 | 0.4736 | 0.82 | 0.2858 | 1.3190 | 0.82 | 0.8018 | 0.2182 | 0.0515 |
| 0.2992 | 48.0 | 624 | 0.4753 | 0.82 | 0.2876 | 1.3170 | 0.82 | 0.8018 | 0.2274 | 0.0521 |
| 0.2992 | 49.0 | 637 | 0.4755 | 0.82 | 0.2866 | 1.4452 | 0.82 | 0.8018 | 0.2245 | 0.0516 |
| 0.2992 | 50.0 | 650 | 0.4754 | 0.815 | 0.2869 | 1.2915 | 0.815 | 0.7924 | 0.2336 | 0.0523 |
| 0.2992 | 51.0 | 663 | 0.4747 | 0.82 | 0.2861 | 1.3336 | 0.82 | 0.8020 | 0.2309 | 0.0517 |
| 0.2992 | 52.0 | 676 | 0.4765 | 0.815 | 0.2880 | 1.3456 | 0.815 | 0.7924 | 0.2137 | 0.0524 |
| 0.2992 | 53.0 | 689 | 0.4756 | 0.82 | 0.2866 | 1.3288 | 0.82 | 0.8020 | 0.2236 | 0.0518 |
| 0.2992 | 54.0 | 702 | 0.4757 | 0.82 | 0.2873 | 1.3860 | 0.82 | 0.8018 | 0.2085 | 0.0516 |
| 0.2992 | 55.0 | 715 | 0.4753 | 0.815 | 0.2866 | 1.3284 | 0.815 | 0.7922 | 0.2100 | 0.0515 |
| 0.2992 | 56.0 | 728 | 0.4759 | 0.82 | 0.2870 | 1.3199 | 0.82 | 0.8020 | 0.2240 | 0.0518 |
| 0.2992 | 57.0 | 741 | 0.4764 | 0.82 | 0.2874 | 1.3901 | 0.82 | 0.8020 | 0.2241 | 0.0517 |
| 0.2992 | 58.0 | 754 | 0.4754 | 0.815 | 0.2870 | 1.3246 | 0.815 | 0.7924 | 0.2260 | 0.0520 |
| 0.2992 | 59.0 | 767 | 0.4759 | 0.815 | 0.2870 | 1.3862 | 0.815 | 0.7924 | 0.2176 | 0.0520 |
| 0.2992 | 60.0 | 780 | 0.4765 | 0.815 | 0.2874 | 1.3873 | 0.815 | 0.7924 | 0.2266 | 0.0523 |
| 0.2992 | 61.0 | 793 | 0.4763 | 0.82 | 0.2873 | 1.3851 | 0.82 | 0.8020 | 0.2161 | 0.0517 |
| 0.2992 | 62.0 | 806 | 0.4768 | 0.815 | 0.2878 | 1.3903 | 0.815 | 0.7924 | 0.2128 | 0.0522 |
| 0.2992 | 63.0 | 819 | 0.4767 | 0.82 | 0.2876 | 1.3866 | 0.82 | 0.8020 | 0.2120 | 0.0521 |
| 0.2992 | 64.0 | 832 | 0.4762 | 0.82 | 0.2872 | 1.3910 | 0.82 | 0.8020 | 0.2157 | 0.0516 |
| 0.2992 | 65.0 | 845 | 0.4765 | 0.82 | 0.2874 | 1.3892 | 0.82 | 0.8020 | 0.2178 | 0.0519 |
| 0.2992 | 66.0 | 858 | 0.4767 | 0.82 | 0.2875 | 1.3462 | 0.82 | 0.8020 | 0.2180 | 0.0519 |
| 0.2992 | 67.0 | 871 | 0.4764 | 0.82 | 0.2872 | 1.3894 | 0.82 | 0.8020 | 0.2252 | 0.0518 |
| 0.2992 | 68.0 | 884 | 0.4767 | 0.82 | 0.2874 | 1.3860 | 0.82 | 0.8020 | 0.2118 | 0.0518 |
| 0.2992 | 69.0 | 897 | 0.4766 | 0.82 | 0.2874 | 1.3894 | 0.82 | 0.8020 | 0.2180 | 0.0519 |
| 0.2992 | 70.0 | 910 | 0.4765 | 0.82 | 0.2872 | 1.3882 | 0.82 | 0.8020 | 0.2280 | 0.0517 |
| 0.2992 | 71.0 | 923 | 0.4766 | 0.82 | 0.2874 | 1.3875 | 0.82 | 0.8020 | 0.2177 | 0.0519 |
| 0.2992 | 72.0 | 936 | 0.4765 | 0.82 | 0.2874 | 1.3880 | 0.82 | 0.8020 | 0.2148 | 0.0517 |
| 0.2992 | 73.0 | 949 | 0.4766 | 0.82 | 0.2873 | 1.3915 | 0.82 | 0.8020 | 0.2109 | 0.0516 |
| 0.2992 | 74.0 | 962 | 0.4765 | 0.82 | 0.2872 | 1.3900 | 0.82 | 0.8020 | 0.2110 | 0.0517 |
| 0.2992 | 75.0 | 975 | 0.4769 | 0.82 | 0.2875 | 1.3913 | 0.82 | 0.8020 | 0.2251 | 0.0520 |
| 0.2992 | 76.0 | 988 | 0.4770 | 0.82 | 0.2876 | 1.3909 | 0.82 | 0.8020 | 0.2196 | 0.0520 |
| 0.0695 | 77.0 | 1001 | 0.4768 | 0.82 | 0.2875 | 1.3890 | 0.82 | 0.8020 | 0.2212 | 0.0517 |
| 0.0695 | 78.0 | 1014 | 0.4767 | 0.82 | 0.2873 | 1.3935 | 0.82 | 0.8020 | 0.2281 | 0.0518 |
| 0.0695 | 79.0 | 1027 | 0.4767 | 0.82 | 0.2874 | 1.3897 | 0.82 | 0.8020 | 0.2282 | 0.0517 |
| 0.0695 | 80.0 | 1040 | 0.4770 | 0.82 | 0.2876 | 1.3889 | 0.82 | 0.8020 | 0.2174 | 0.0518 |
| 0.0695 | 81.0 | 1053 | 0.4770 | 0.82 | 0.2875 | 1.3935 | 0.82 | 0.8020 | 0.2221 | 0.0518 |
| 0.0695 | 82.0 | 1066 | 0.4766 | 0.82 | 0.2873 | 1.3901 | 0.82 | 0.8020 | 0.2283 | 0.0517 |
| 0.0695 | 83.0 | 1079 | 0.4768 | 0.82 | 0.2874 | 1.3902 | 0.82 | 0.8020 | 0.2283 | 0.0517 |
| 0.0695 | 84.0 | 1092 | 0.4770 | 0.82 | 0.2874 | 1.3917 | 0.82 | 0.8020 | 0.2217 | 0.0518 |
| 0.0695 | 85.0 | 1105 | 0.4769 | 0.82 | 0.2875 | 1.3913 | 0.82 | 0.8020 | 0.2283 | 0.0518 |
| 0.0695 | 86.0 | 1118 | 0.4769 | 0.82 | 0.2874 | 1.3916 | 0.82 | 0.8020 | 0.2282 | 0.0517 |
| 0.0695 | 87.0 | 1131 | 0.4769 | 0.82 | 0.2874 | 1.3912 | 0.82 | 0.8020 | 0.2218 | 0.0517 |
| 0.0695 | 88.0 | 1144 | 0.4770 | 0.82 | 0.2875 | 1.3923 | 0.82 | 0.8020 | 0.2218 | 0.0517 |
| 0.0695 | 89.0 | 1157 | 0.4768 | 0.82 | 0.2874 | 1.3905 | 0.82 | 0.8020 | 0.2283 | 0.0518 |
| 0.0695 | 90.0 | 1170 | 0.4769 | 0.82 | 0.2875 | 1.3924 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 91.0 | 1183 | 0.4769 | 0.82 | 0.2874 | 1.3923 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 92.0 | 1196 | 0.4768 | 0.82 | 0.2874 | 1.3908 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 93.0 | 1209 | 0.4770 | 0.82 | 0.2875 | 1.3909 | 0.82 | 0.8020 | 0.2219 | 0.0518 |
| 0.0695 | 94.0 | 1222 | 0.4768 | 0.82 | 0.2873 | 1.3918 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 95.0 | 1235 | 0.4769 | 0.82 | 0.2874 | 1.3914 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 96.0 | 1248 | 0.4770 | 0.82 | 0.2875 | 1.3917 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 97.0 | 1261 | 0.4769 | 0.82 | 0.2874 | 1.3918 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 98.0 | 1274 | 0.4770 | 0.82 | 0.2875 | 1.3920 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 99.0 | 1287 | 0.4770 | 0.82 | 0.2875 | 1.3922 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 100.0 | 1300 | 0.4770 | 0.82 | 0.2875 | 1.3922 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
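For reference, Brier Loss and ECE in the table are standard calibration metrics, and AURC is the area under the risk–coverage curve. The sketch below shows one common way to compute the first two from predicted class probabilities; it is illustrative only and not the exact evaluation code used for this card:

```python
import numpy as np

def brier_score(probs, labels):
    """Mean squared error between predicted probabilities and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=15):
    """Average gap between confidence and accuracy over confidence bins."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return ece
```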

### Framework versions