---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# 171-tiny_tobacco3482_kd

This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set (final epoch):
- Loss: 0.2409
- Accuracy: 0.765
- Brier Loss: 0.3934
- Nll: 1.2941
- F1 Micro: 0.765
- F1 Macro: 0.6802
- Ece: 0.2843
- Aurc: 0.0716

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll    | F1 Micro | F1 Macro | Ece    | Aurc   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 1.0   | 13   | 1.3775          | 0.235    | 0.8906     | 7.4936 | 0.235    | 0.1630   | 0.2705 | 0.7664 |
| No log        | 2.0   | 26   | 0.8552          | 0.38     | 0.8035     | 4.2966 | 0.38     | 0.2738   | 0.3086 | 0.4787 |
| No log        | 3.0   | 39   | 0.6791          | 0.51     | 0.6994     | 3.1397 | 0.51     | 0.3754   | 0.3208 | 0.3181 |
| No log        | 4.0   | 52   | 0.5747          | 0.59     | 0.6289     | 2.2673 | 0.59     | 0.4483   | 0.3491 | 0.2156 |
| No log        | 5.0   | 65   | 0.4985          | 0.64     | 0.5620     | 2.3221 | 0.64     | 0.5323   | 0.3258 | 0.1637 |
| No log        | 6.0   | 78   | 0.5387          | 0.665    | 0.5254     | 2.2659 | 0.665    | 0.5673   | 0.2991 | 0.1582 |
| No log        | 7.0   | 91   | 0.5050          | 0.66     | 0.5100     | 2.4796 | 0.66     | 0.5360   | 0.2241 | 0.1675 |
| No log        | 8.0   | 104  | 0.4247          | 0.685    | 0.4864     | 2.1512 | 0.685    | 0.6017   | 0.2848 | 0.1242 |
| No log        | 9.0   | 117  | 0.5322          | 0.65     | 0.5101     | 1.9216 | 0.65     | 0.5658   | 0.2808 | 0.1775 |
| No log        | 10.0  | 130  | 0.4128          | 0.705    | 0.4744     | 1.6466 | 0.705    | 0.6320   | 0.3126 | 0.1011 |
| No log        | 11.0  | 143  | 0.4216          | 0.72     | 0.4621     | 2.3585 | 0.72     | 0.6365   | 0.2807 | 0.1181 |
| No log        | 12.0  | 156  | 0.3925          | 0.72     | 0.4535     | 2.1047 | 0.72     | 0.6466   | 0.2976 | 0.0989 |
| No log        | 13.0  | 169  | 0.3664          | 0.71     | 0.4406     | 1.6481 | 0.7100   | 0.6388   | 0.2789 | 0.1056 |
| No log        | 14.0  | 182  | 0.4002          | 0.715    | 0.4396     | 1.8360 | 0.715    | 0.6437   | 0.2705 | 0.1096 |
| No log        | 15.0  | 195  | 0.3324          | 0.72     | 0.4156     | 1.8256 | 0.72     | 0.6309   | 0.2857 | 0.0874 |
| No log        | 16.0  | 208  | 0.3041          | 0.75     | 0.4277     | 1.5808 | 0.75     | 0.6746   | 0.3014 | 0.0843 |
| No log        | 17.0  | 221  | 0.3178          | 0.78     | 0.4139     | 1.1784 | 0.78     | 0.7004   | 0.3016 | 0.0772 |
| No log        | 18.0  | 234  | 0.2911          | 0.735    | 0.4210     | 1.4736 | 0.735    | 0.6754   | 0.2909 | 0.0844 |
| No log        | 19.0  | 247  | 0.2988          | 0.76     | 0.4107     | 1.2700 | 0.76     | 0.6933   | 0.2904 | 0.0775 |
| No log        | 20.0  | 260  | 0.2904          | 0.745    | 0.4215     | 1.4039 | 0.745    | 0.6686   | 0.2920 | 0.0852 |
| No log        | 21.0  | 273  | 0.3022          | 0.77     | 0.4196     | 1.0212 | 0.7700   | 0.7040   | 0.3041 | 0.0714 |
| No log        | 22.0  | 286  | 0.2748          | 0.73     | 0.4106     | 1.1826 | 0.7300   | 0.6715   | 0.2977 | 0.0854 |
| No log        | 23.0  | 299  | 0.2835          | 0.745    | 0.4079     | 1.2464 | 0.745    | 0.6654   | 0.3083 | 0.0797 |
| No log        | 24.0  | 312  | 0.2748          | 0.75     | 0.4089     | 1.1540 | 0.75     | 0.6797   | 0.2772 | 0.0802 |
| No log        | 25.0  | 325  | 0.2818          | 0.735    | 0.4142     | 1.3465 | 0.735    | 0.6523   | 0.2693 | 0.0916 |
| No log        | 26.0  | 338  | 0.2666          | 0.74     | 0.4076     | 1.3420 | 0.74     | 0.6560   | 0.2892 | 0.0831 |
| No log        | 27.0  | 351  | 0.2693          | 0.745    | 0.4083     | 1.4070 | 0.745    | 0.6883   | 0.3074 | 0.0858 |
| No log        | 28.0  | 364  | 0.2598          | 0.725    | 0.4007     | 1.3015 | 0.7250   | 0.6509   | 0.2874 | 0.0843 |
| No log        | 29.0  | 377  | 0.2579          | 0.745    | 0.4023     | 1.2920 | 0.745    | 0.6770   | 0.2808 | 0.0788 |
| No log        | 30.0  | 390  | 0.2606          | 0.745    | 0.4053     | 1.2203 | 0.745    | 0.6643   | 0.2838 | 0.0805 |
| No log        | 31.0  | 403  | 0.2588          | 0.735    | 0.3982     | 1.3655 | 0.735    | 0.6743   | 0.2941 | 0.0852 |
| No log        | 32.0  | 416  | 0.2524          | 0.74     | 0.3941     | 1.1515 | 0.74     | 0.6771   | 0.2735 | 0.0813 |
| No log        | 33.0  | 429  | 0.2579          | 0.765    | 0.4002     | 1.3257 | 0.765    | 0.6993   | 0.2784 | 0.0753 |
| No log        | 34.0  | 442  | 0.2448          | 0.775    | 0.3981     | 1.2289 | 0.775    | 0.7015   | 0.2923 | 0.0720 |
| No log        | 35.0  | 455  | 0.2483          | 0.75     | 0.3987     | 1.2485 | 0.75     | 0.6645   | 0.2751 | 0.0751 |
| No log        | 36.0  | 468  | 0.2417          | 0.765    | 0.3879     | 1.0562 | 0.765    | 0.6856   | 0.2827 | 0.0723 |
| No log        | 37.0  | 481  | 0.2506          | 0.755    | 0.3944     | 1.2087 | 0.755    | 0.6855   | 0.3045 | 0.0744 |
| No log        | 38.0  | 494  | 0.2427          | 0.765    | 0.3917     | 1.2356 | 0.765    | 0.6862   | 0.2822 | 0.0703 |
| 0.2351        | 39.0  | 507  | 0.2449          | 0.745    | 0.3958     | 1.2868 | 0.745    | 0.6750   | 0.2697 | 0.0762 |
| 0.2351        | 40.0  | 520  | 0.2413          | 0.755    | 0.3917     | 1.3279 | 0.755    | 0.6831   | 0.2720 | 0.0724 |
| 0.2351        | 41.0  | 533  | 0.2428          | 0.75     | 0.3924     | 1.2369 | 0.75     | 0.6781   | 0.2700 | 0.0766 |
| 0.2351        | 42.0  | 546  | 0.2412          | 0.76     | 0.3919     | 1.2235 | 0.76     | 0.6913   | 0.2945 | 0.0732 |
| 0.2351        | 43.0  | 559  | 0.2428          | 0.74     | 0.3968     | 1.3021 | 0.74     | 0.6648   | 0.2743 | 0.0773 |
| 0.2351        | 44.0  | 572  | 0.2400          | 0.75     | 0.3936     | 1.2410 | 0.75     | 0.6643   | 0.2789 | 0.0723 |
| 0.2351        | 45.0  | 585  | 0.2424          | 0.77     | 0.3949     | 1.2480 | 0.7700   | 0.7041   | 0.2813 | 0.0722 |
| 0.2351        | 46.0  | 598  | 0.2398          | 0.77     | 0.3931     | 1.2463 | 0.7700   | 0.7005   | 0.3050 | 0.0722 |
| 0.2351        | 47.0  | 611  | 0.2397          | 0.77     | 0.3919     | 1.2957 | 0.7700   | 0.6874   | 0.2961 | 0.0703 |
| 0.2351        | 48.0  | 624  | 0.2401          | 0.77     | 0.3926     | 1.2360 | 0.7700   | 0.7045   | 0.2945 | 0.0720 |
| 0.2351        | 49.0  | 637  | 0.2401          | 0.77     | 0.3927     | 1.2905 | 0.7700   | 0.6876   | 0.2825 | 0.0706 |
| 0.2351        | 50.0  | 650  | 0.2413          | 0.765    | 0.3936     | 1.2892 | 0.765    | 0.6978   | 0.3016 | 0.0743 |
| 0.2351        | 51.0  | 663  | 0.2410          | 0.77     | 0.3943     | 1.2913 | 0.7700   | 0.7005   | 0.2849 | 0.0728 |
| 0.2351        | 52.0  | 676  | 0.2391          | 0.765    | 0.3926     | 1.2846 | 0.765    | 0.6805   | 0.2777 | 0.0710 |
| 0.2351        | 53.0  | 689  | 0.2400          | 0.77     | 0.3929     | 1.2927 | 0.7700   | 0.6876   | 0.2698 | 0.0711 |
| 0.2351        | 54.0  | 702  | 0.2401          | 0.77     | 0.3929     | 1.2917 | 0.7700   | 0.6876   | 0.2775 | 0.0711 |
| 0.2351        | 55.0  | 715  | 0.2405          | 0.775    | 0.3934     | 1.2912 | 0.775    | 0.7074   | 0.2858 | 0.0709 |
| 0.2351        | 56.0  | 728  | 0.2403          | 0.775    | 0.3927     | 1.2912 | 0.775    | 0.7077   | 0.3059 | 0.0710 |
| 0.2351        | 57.0  | 741  | 0.2403          | 0.77     | 0.3932     | 1.2886 | 0.7700   | 0.6876   | 0.2914 | 0.0709 |
| 0.2351        | 58.0  | 754  | 0.2403          | 0.765    | 0.3932     | 1.2893 | 0.765    | 0.6805   | 0.2735 | 0.0717 |
| 0.2351        | 59.0  | 767  | 0.2405          | 0.77     | 0.3932     | 1.2923 | 0.7700   | 0.6876   | 0.2915 | 0.0714 |
| 0.2351        | 60.0  | 780  | 0.2403          | 0.77     | 0.3931     | 1.2893 | 0.7700   | 0.6874   | 0.2830 | 0.0708 |
| 0.2351        | 61.0  | 793  | 0.2405          | 0.77     | 0.3931     | 1.2910 | 0.7700   | 0.6876   | 0.2912 | 0.0711 |
| 0.2351        | 62.0  | 806  | 0.2405          | 0.77     | 0.3935     | 1.2905 | 0.7700   | 0.6874   | 0.2831 | 0.0711 |
| 0.2351        | 63.0  | 819  | 0.2405          | 0.765    | 0.3933     | 1.2918 | 0.765    | 0.6805   | 0.2776 | 0.0716 |
| 0.2351        | 64.0  | 832  | 0.2405          | 0.77     | 0.3932     | 1.2937 | 0.7700   | 0.6876   | 0.2747 | 0.0712 |
| 0.2351        | 65.0  | 845  | 0.2405          | 0.765    | 0.3934     | 1.2927 | 0.765    | 0.6802   | 0.2781 | 0.0717 |
| 0.2351        | 66.0  | 858  | 0.2406          | 0.77     | 0.3930     | 1.2926 | 0.7700   | 0.6876   | 0.2780 | 0.0708 |
| 0.2351        | 67.0  | 871  | 0.2407          | 0.77     | 0.3934     | 1.2933 | 0.7700   | 0.6876   | 0.2784 | 0.0712 |
| 0.2351        | 68.0  | 884  | 0.2407          | 0.77     | 0.3934     | 1.2942 | 0.7700   | 0.6876   | 0.2853 | 0.0714 |
| 0.2351        | 69.0  | 897  | 0.2406          | 0.77     | 0.3932     | 1.2920 | 0.7700   | 0.6876   | 0.2780 | 0.0709 |
| 0.2351        | 70.0  | 910  | 0.2403          | 0.77     | 0.3931     | 1.2937 | 0.7700   | 0.6874   | 0.2748 | 0.0709 |
| 0.2351        | 71.0  | 923  | 0.2407          | 0.77     | 0.3934     | 1.2929 | 0.7700   | 0.6874   | 0.2855 | 0.0710 |
| 0.2351        | 72.0  | 936  | 0.2405          | 0.77     | 0.3930     | 1.2944 | 0.7700   | 0.6874   | 0.2779 | 0.0708 |
| 0.2351        | 73.0  | 949  | 0.2407          | 0.765    | 0.3931     | 1.2919 | 0.765    | 0.6802   | 0.2799 | 0.0715 |
| 0.2351        | 74.0  | 962  | 0.2408          | 0.765    | 0.3933     | 1.2937 | 0.765    | 0.6802   | 0.2647 | 0.0716 |
| 0.2351        | 75.0  | 975  | 0.2407          | 0.765    | 0.3932     | 1.2935 | 0.765    | 0.6802   | 0.2728 | 0.0716 |
| 0.2351        | 76.0  | 988  | 0.2407          | 0.77     | 0.3933     | 1.2933 | 0.7700   | 0.6874   | 0.2773 | 0.0709 |
| 0.0004        | 77.0  | 1001 | 0.2407          | 0.77     | 0.3932     | 1.2941 | 0.7700   | 0.6874   | 0.2892 | 0.0709 |
| 0.0004        | 78.0  | 1014 | 0.2408          | 0.77     | 0.3933     | 1.2936 | 0.7700   | 0.6874   | 0.2820 | 0.0709 |
| 0.0004        | 79.0  | 1027 | 0.2410          | 0.77     | 0.3934     | 1.2931 | 0.7700   | 0.6874   | 0.2892 | 0.0710 |
| 0.0004        | 80.0  | 1040 | 0.2409          | 0.77     | 0.3933     | 1.2929 | 0.7700   | 0.6874   | 0.2855 | 0.0711 |
| 0.0004        | 81.0  | 1053 | 0.2409          | 0.77     | 0.3933     | 1.2937 | 0.7700   | 0.6874   | 0.2820 | 0.0709 |
| 0.0004        | 82.0  | 1066 | 0.2409          | 0.77     | 0.3934     | 1.2947 | 0.7700   | 0.6874   | 0.2819 | 0.0709 |
| 0.0004        | 83.0  | 1079 | 0.2408          | 0.77     | 0.3934     | 1.2933 | 0.7700   | 0.6874   | 0.2893 | 0.0709 |
| 0.0004        | 84.0  | 1092 | 0.2409          | 0.765    | 0.3934     | 1.2934 | 0.765    | 0.6802   | 0.2843 | 0.0716 |
| 0.0004        | 85.0  | 1105 | 0.2408          | 0.77     | 0.3933     | 1.2933 | 0.7700   | 0.6874   | 0.2893 | 0.0710 |
| 0.0004        | 86.0  | 1118 | 0.2409          | 0.77     | 0.3933     | 1.2940 | 0.7700   | 0.6874   | 0.2819 | 0.0710 |
| 0.0004        | 87.0  | 1131 | 0.2409          | 0.77     | 0.3934     | 1.2944 | 0.7700   | 0.6874   | 0.2820 | 0.0709 |
| 0.0004        | 88.0  | 1144 | 0.2409          | 0.77     | 0.3934     | 1.2936 | 0.7700   | 0.6874   | 0.2893 | 0.0709 |
| 0.0004        | 89.0  | 1157 | 0.2409          | 0.765    | 0.3933     | 1.2936 | 0.765    | 0.6802   | 0.2844 | 0.0717 |
| 0.0004        | 90.0  | 1170 | 0.2409          | 0.77     | 0.3934     | 1.2939 | 0.7700   | 0.6874   | 0.2893 | 0.0709 |
| 0.0004        | 91.0  | 1183 | 0.2409          | 0.765    | 0.3934     | 1.2943 | 0.765    | 0.6802   | 0.2843 | 0.0716 |
| 0.0004        | 92.0  | 1196 | 0.2409          | 0.77     | 0.3934     | 1.2942 | 0.7700   | 0.6874   | 0.2893 | 0.0709 |
| 0.0004        | 93.0  | 1209 | 0.2410          | 0.765    | 0.3934     | 1.2939 | 0.765    | 0.6802   | 0.2843 | 0.0716 |
| 0.0004        | 94.0  | 1222 | 0.2409          | 0.765    | 0.3934     | 1.2937 | 0.765    | 0.6802   | 0.2843 | 0.0716 |
| 0.0004        | 95.0  | 1235 | 0.2409          | 0.765    | 0.3934     | 1.2938 | 0.765    | 0.6802   | 0.2844 | 0.0716 |
| 0.0004        | 96.0  | 1248 | 0.2409          | 0.765    | 0.3934     | 1.2939 | 0.765    | 0.6802   | 0.2770 | 0.0716 |
| 0.0004        | 97.0  | 1261 | 0.2409          | 0.765    | 0.3934     | 1.2941 | 0.765    | 0.6802   | 0.2843 | 0.0715 |
| 0.0004        | 98.0  | 1274 | 0.2409          | 0.765    | 0.3934     | 1.2940 | 0.765    | 0.6802   | 0.2844 | 0.0715 |
| 0.0004        | 99.0  | 1287 | 0.2409          | 0.765    | 0.3934     | 1.2941 | 0.765    | 0.6802   | 0.2843 | 0.0716 |
| 0.0004        | 100.0 | 1300 | 0.2409          | 0.765    | 0.3934     | 1.2941 | 0.765    | 0.6802   | 0.2843 | 0.0716 |
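Besides accuracy and F1, the table reports calibration metrics: Brier Loss and Ece. As a rough illustration only (these are the standard textbook definitions, not necessarily the exact implementation the training script used), the multiclass Brier score and expected calibration error (ECE) can be computed like this:

```python
# Hedged sketch of two calibration metrics from the table above.
# `probs` is a list of per-sample probability vectors, `labels` the true class ids;
# the toy data below is illustrative, not from this model's evaluation.

def brier_score(probs, labels):
    """Mean squared error between predicted probabilities and one-hot labels."""
    n_classes = len(probs[0])
    total = 0.0
    for p, y in zip(probs, labels):
        onehot = [1.0 if k == y else 0.0 for k in range(n_classes)]
        total += sum((pk - ok) ** 2 for pk, ok in zip(p, onehot))
    return total / len(probs)

def expected_calibration_error(probs, labels, n_bins=10):
    """Bin samples by top-class confidence; average |accuracy - confidence| per bin,
    weighted by bin size."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        conf = max(p)
        pred = p.index(conf)
        idx = min(int(conf * n_bins), n_bins - 1)  # clamp conf == 1.0 into last bin
        bins[idx].append((conf, 1.0 if pred == y else 0.0))
    ece = 0.0
    for b in bins:
        if b:
            avg_conf = sum(c for c, _ in b) / len(b)
            acc = sum(a for _, a in b) / len(b)
            ece += (len(b) / len(probs)) * abs(acc - avg_conf)
    return ece

probs = [[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]]
labels = [0, 1, 1]
print(brier_score(probs, labels))
print(expected_calibration_error(probs, labels))
```

Lower is better for both: the Brier score penalizes over-confident wrong predictions quadratically, while ECE directly measures the gap between stated confidence and observed accuracy.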

### Framework versions