# 225-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset (as suggested by the model name; the dataset field was not set during training). It achieves the following results on the evaluation set:
- Loss: 0.4875
- Accuracy: 0.805
- Brier Loss: 0.3011
- NLL: 1.4097
- F1 Micro: 0.805
- F1 Macro: 0.7862
- ECE: 0.2181
- AURC: 0.0530
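The calibration metrics above (Brier Loss, ECE) are functions of the predicted class probabilities. As an illustration only (the card does not show how they were computed), a minimal NumPy sketch with toy inputs:

```python
import numpy as np

def brier_loss(probs, labels):
    """Mean squared error between predicted probabilities and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def ece(probs, labels, n_bins=10):
    """Expected Calibration Error: per-bin |accuracy - confidence| gap,
    weighted by the fraction of samples in each confidence bin."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            total += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return total

probs = np.array([[0.9, 0.1], [0.6, 0.4], [0.3, 0.7]])
labels = np.array([0, 1, 1])
print(round(brier_loss(probs, labels), 4))
print(round(ece(probs, labels), 4))
```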
## Model description
More information needed
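Judging from the model name (`kd_CEKD_t2.5_a0.5`), this student model was likely trained with a combined cross-entropy and knowledge-distillation loss at temperature 2.5 and mixing weight α = 0.5. The teacher model and exact formulation are not stated in this card; the following NumPy sketch shows one common form of such a loss under those assumptions:

```python
import numpy as np

def softmax(z, t=1.0):
    """Temperature-scaled softmax, numerically stabilized."""
    z = z / t
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cekd_loss(student_logits, teacher_logits, labels, temperature=2.5, alpha=0.5):
    """alpha * CE(student, labels) + (1 - alpha) * T^2 * KL(teacher_T || student_T)."""
    p_student = softmax(student_logits)
    ce = -np.mean(np.log(p_student[np.arange(len(labels)), labels]))
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = np.mean(np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=-1))
    return alpha * ce + (1 - alpha) * temperature ** 2 * kl
```

When the teacher and student logits agree, the KL term vanishes and the loss reduces to α times the plain cross-entropy.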
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
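The linear schedule with 10% warmup implied by the hyperparameters above corresponds to ramping the learning rate from 0 to 1e-4 over the first 130 of 1300 steps (13 steps per epoch × 100 epochs, per the results table), then decaying linearly back to 0. A minimal sketch of that schedule:

```python
def linear_warmup_lr(step, base_lr=1e-4, total_steps=1300, warmup_ratio=0.1):
    """Linear warmup for the first warmup_ratio fraction of steps,
    then linear decay to zero, mirroring the 'linear' scheduler type."""
    warmup_steps = int(total_steps * warmup_ratio)  # 130 steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

print(linear_warmup_lr(0))     # start of warmup
print(linear_warmup_lr(130))   # peak learning rate
print(linear_warmup_lr(1300))  # end of training
```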
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 13 | 1.7287 | 0.235 | 0.8932 | 7.9209 | 0.235 | 0.1394 | 0.3142 | 0.7431 |
No log | 2.0 | 26 | 1.2278 | 0.455 | 0.6937 | 3.7028 | 0.455 | 0.3630 | 0.2894 | 0.3370 |
No log | 3.0 | 39 | 1.0392 | 0.545 | 0.5804 | 2.0622 | 0.545 | 0.4755 | 0.2549 | 0.2370 |
No log | 4.0 | 52 | 0.8315 | 0.635 | 0.4677 | 1.8368 | 0.635 | 0.5858 | 0.2429 | 0.1436 |
No log | 5.0 | 65 | 0.7488 | 0.72 | 0.4187 | 1.4849 | 0.72 | 0.6856 | 0.2558 | 0.1081 |
No log | 6.0 | 78 | 0.7076 | 0.735 | 0.3886 | 1.4403 | 0.735 | 0.6840 | 0.2107 | 0.1002 |
No log | 7.0 | 91 | 0.6946 | 0.725 | 0.3876 | 1.5056 | 0.7250 | 0.7017 | 0.2229 | 0.0982 |
No log | 8.0 | 104 | 0.7344 | 0.72 | 0.3928 | 1.8498 | 0.72 | 0.6796 | 0.1817 | 0.1016 |
No log | 9.0 | 117 | 0.7380 | 0.735 | 0.4000 | 1.8338 | 0.735 | 0.7369 | 0.2382 | 0.1032 |
No log | 10.0 | 130 | 0.6790 | 0.76 | 0.3677 | 1.8100 | 0.76 | 0.7343 | 0.2346 | 0.1003 |
No log | 11.0 | 143 | 0.7008 | 0.735 | 0.3997 | 1.8469 | 0.735 | 0.7211 | 0.2510 | 0.1226 |
No log | 12.0 | 156 | 0.6378 | 0.74 | 0.3762 | 1.7911 | 0.74 | 0.7187 | 0.2214 | 0.0869 |
No log | 13.0 | 169 | 0.6066 | 0.745 | 0.3544 | 1.6249 | 0.745 | 0.7012 | 0.2009 | 0.0823 |
No log | 14.0 | 182 | 0.5888 | 0.77 | 0.3365 | 1.7603 | 0.7700 | 0.7568 | 0.2047 | 0.0613 |
No log | 15.0 | 195 | 0.5817 | 0.765 | 0.3430 | 1.6581 | 0.765 | 0.7411 | 0.2345 | 0.0661 |
No log | 16.0 | 208 | 0.5510 | 0.795 | 0.3265 | 1.3740 | 0.795 | 0.7815 | 0.2347 | 0.0590 |
No log | 17.0 | 221 | 0.5449 | 0.77 | 0.3362 | 1.3348 | 0.7700 | 0.7541 | 0.1971 | 0.0709 |
No log | 18.0 | 234 | 0.5686 | 0.775 | 0.3363 | 1.8806 | 0.775 | 0.7604 | 0.2006 | 0.0654 |
No log | 19.0 | 247 | 0.5499 | 0.82 | 0.3237 | 1.3751 | 0.82 | 0.7981 | 0.2471 | 0.0586 |
No log | 20.0 | 260 | 0.5321 | 0.77 | 0.3213 | 1.5905 | 0.7700 | 0.7474 | 0.2332 | 0.0643 |
No log | 21.0 | 273 | 0.5349 | 0.805 | 0.3050 | 2.0032 | 0.805 | 0.7788 | 0.2238 | 0.0533 |
No log | 22.0 | 286 | 0.5318 | 0.8 | 0.3105 | 1.5868 | 0.8000 | 0.7620 | 0.2377 | 0.0538 |
No log | 23.0 | 299 | 0.5021 | 0.83 | 0.2982 | 1.4067 | 0.83 | 0.8190 | 0.2631 | 0.0463 |
No log | 24.0 | 312 | 0.5008 | 0.805 | 0.3023 | 1.4409 | 0.805 | 0.7863 | 0.2248 | 0.0501 |
No log | 25.0 | 325 | 0.5069 | 0.805 | 0.3036 | 1.4965 | 0.805 | 0.7770 | 0.2218 | 0.0519 |
No log | 26.0 | 338 | 0.4967 | 0.8 | 0.3002 | 1.6267 | 0.8000 | 0.7788 | 0.2188 | 0.0598 |
No log | 27.0 | 351 | 0.4892 | 0.81 | 0.3006 | 1.6391 | 0.81 | 0.7886 | 0.2170 | 0.0513 |
No log | 28.0 | 364 | 0.5099 | 0.82 | 0.3129 | 1.5802 | 0.82 | 0.8004 | 0.2285 | 0.0589 |
No log | 29.0 | 377 | 0.5009 | 0.8 | 0.3054 | 1.5187 | 0.8000 | 0.7747 | 0.2260 | 0.0570 |
No log | 30.0 | 390 | 0.4869 | 0.805 | 0.2989 | 1.4292 | 0.805 | 0.7823 | 0.2380 | 0.0511 |
No log | 31.0 | 403 | 0.4876 | 0.82 | 0.2970 | 1.4254 | 0.82 | 0.7984 | 0.2293 | 0.0524 |
No log | 32.0 | 416 | 0.4916 | 0.81 | 0.3024 | 1.5657 | 0.81 | 0.7872 | 0.2239 | 0.0557 |
No log | 33.0 | 429 | 0.4834 | 0.805 | 0.2969 | 1.5227 | 0.805 | 0.7939 | 0.2108 | 0.0537 |
No log | 34.0 | 442 | 0.4910 | 0.8 | 0.3074 | 1.4463 | 0.8000 | 0.7745 | 0.2236 | 0.0580 |
No log | 35.0 | 455 | 0.4854 | 0.805 | 0.2990 | 1.4106 | 0.805 | 0.7875 | 0.2280 | 0.0547 |
No log | 36.0 | 468 | 0.4861 | 0.815 | 0.2985 | 1.4682 | 0.815 | 0.7921 | 0.2310 | 0.0527 |
No log | 37.0 | 481 | 0.4880 | 0.8 | 0.3032 | 1.4765 | 0.8000 | 0.7743 | 0.2174 | 0.0565 |
No log | 38.0 | 494 | 0.4871 | 0.805 | 0.2993 | 1.4592 | 0.805 | 0.7854 | 0.2072 | 0.0551 |
0.3005 | 39.0 | 507 | 0.4908 | 0.805 | 0.3037 | 1.4704 | 0.805 | 0.7854 | 0.2269 | 0.0575 |
0.3005 | 40.0 | 520 | 0.4893 | 0.805 | 0.3018 | 1.3980 | 0.805 | 0.7862 | 0.2105 | 0.0555 |
0.3005 | 41.0 | 533 | 0.4866 | 0.8 | 0.3016 | 1.4087 | 0.8000 | 0.7766 | 0.2219 | 0.0547 |
0.3005 | 42.0 | 546 | 0.4851 | 0.805 | 0.2997 | 1.3968 | 0.805 | 0.7862 | 0.2110 | 0.0536 |
0.3005 | 43.0 | 559 | 0.4859 | 0.805 | 0.3011 | 1.4078 | 0.805 | 0.7875 | 0.2126 | 0.0545 |
0.3005 | 44.0 | 572 | 0.4869 | 0.805 | 0.3011 | 1.4629 | 0.805 | 0.7862 | 0.2122 | 0.0546 |
0.3005 | 45.0 | 585 | 0.4868 | 0.805 | 0.3010 | 1.4646 | 0.805 | 0.7854 | 0.2151 | 0.0549 |
0.3005 | 46.0 | 598 | 0.4870 | 0.805 | 0.3012 | 1.4644 | 0.805 | 0.7854 | 0.2110 | 0.0544 |
0.3005 | 47.0 | 611 | 0.4858 | 0.805 | 0.2999 | 1.4066 | 0.805 | 0.7875 | 0.2180 | 0.0534 |
0.3005 | 48.0 | 624 | 0.4866 | 0.805 | 0.3014 | 1.4032 | 0.805 | 0.7875 | 0.2265 | 0.0538 |
0.3005 | 49.0 | 637 | 0.4854 | 0.805 | 0.2996 | 1.4117 | 0.805 | 0.7862 | 0.2156 | 0.0534 |
0.3005 | 50.0 | 650 | 0.4860 | 0.805 | 0.3003 | 1.4683 | 0.805 | 0.7854 | 0.2100 | 0.0533 |
0.3005 | 51.0 | 663 | 0.4860 | 0.805 | 0.3002 | 1.4041 | 0.805 | 0.7854 | 0.2352 | 0.0534 |
0.3005 | 52.0 | 676 | 0.4872 | 0.805 | 0.3015 | 1.4067 | 0.805 | 0.7875 | 0.2033 | 0.0540 |
0.3005 | 53.0 | 689 | 0.4866 | 0.805 | 0.3005 | 1.4105 | 0.805 | 0.7875 | 0.2310 | 0.0538 |
0.3005 | 54.0 | 702 | 0.4861 | 0.805 | 0.3006 | 1.4036 | 0.805 | 0.7875 | 0.2340 | 0.0533 |
0.3005 | 55.0 | 715 | 0.4864 | 0.805 | 0.3005 | 1.4063 | 0.805 | 0.7875 | 0.2199 | 0.0537 |
0.3005 | 56.0 | 728 | 0.4871 | 0.805 | 0.3009 | 1.4091 | 0.805 | 0.7862 | 0.2282 | 0.0537 |
0.3005 | 57.0 | 741 | 0.4869 | 0.805 | 0.3007 | 1.4079 | 0.805 | 0.7862 | 0.2214 | 0.0531 |
0.3005 | 58.0 | 754 | 0.4864 | 0.805 | 0.3005 | 1.4086 | 0.805 | 0.7862 | 0.2206 | 0.0532 |
0.3005 | 59.0 | 767 | 0.4868 | 0.805 | 0.3007 | 1.4133 | 0.805 | 0.7862 | 0.2372 | 0.0531 |
0.3005 | 60.0 | 780 | 0.4871 | 0.805 | 0.3009 | 1.4079 | 0.805 | 0.7875 | 0.2172 | 0.0534 |
0.3005 | 61.0 | 793 | 0.4875 | 0.805 | 0.3014 | 1.4106 | 0.805 | 0.7862 | 0.2295 | 0.0536 |
0.3005 | 62.0 | 806 | 0.4875 | 0.805 | 0.3013 | 1.4136 | 0.805 | 0.7875 | 0.2219 | 0.0535 |
0.3005 | 63.0 | 819 | 0.4874 | 0.805 | 0.3013 | 1.4085 | 0.805 | 0.7862 | 0.2189 | 0.0534 |
0.3005 | 64.0 | 832 | 0.4867 | 0.805 | 0.3007 | 1.4075 | 0.805 | 0.7862 | 0.2325 | 0.0530 |
0.3005 | 65.0 | 845 | 0.4876 | 0.805 | 0.3013 | 1.4122 | 0.805 | 0.7862 | 0.2379 | 0.0537 |
0.3005 | 66.0 | 858 | 0.4878 | 0.805 | 0.3015 | 1.4090 | 0.805 | 0.7862 | 0.2220 | 0.0536 |
0.3005 | 67.0 | 871 | 0.4869 | 0.805 | 0.3007 | 1.4101 | 0.805 | 0.7862 | 0.2253 | 0.0529 |
0.3005 | 68.0 | 884 | 0.4871 | 0.805 | 0.3009 | 1.4096 | 0.805 | 0.7862 | 0.2340 | 0.0530 |
0.3005 | 69.0 | 897 | 0.4873 | 0.805 | 0.3010 | 1.4120 | 0.805 | 0.7862 | 0.2138 | 0.0534 |
0.3005 | 70.0 | 910 | 0.4874 | 0.805 | 0.3011 | 1.4121 | 0.805 | 0.7862 | 0.2292 | 0.0533 |
0.3005 | 71.0 | 923 | 0.4874 | 0.805 | 0.3012 | 1.4095 | 0.805 | 0.7862 | 0.2276 | 0.0532 |
0.3005 | 72.0 | 936 | 0.4870 | 0.805 | 0.3009 | 1.4083 | 0.805 | 0.7862 | 0.2262 | 0.0532 |
0.3005 | 73.0 | 949 | 0.4877 | 0.805 | 0.3013 | 1.4115 | 0.805 | 0.7862 | 0.2273 | 0.0533 |
0.3005 | 74.0 | 962 | 0.4872 | 0.805 | 0.3010 | 1.4109 | 0.805 | 0.7862 | 0.2275 | 0.0533 |
0.3005 | 75.0 | 975 | 0.4874 | 0.805 | 0.3010 | 1.4100 | 0.805 | 0.7862 | 0.2186 | 0.0533 |
0.3005 | 76.0 | 988 | 0.4874 | 0.805 | 0.3011 | 1.4095 | 0.805 | 0.7862 | 0.2174 | 0.0532 |
0.0815 | 77.0 | 1001 | 0.4876 | 0.805 | 0.3012 | 1.4096 | 0.805 | 0.7862 | 0.2185 | 0.0533 |
0.0815 | 78.0 | 1014 | 0.4875 | 0.805 | 0.3011 | 1.4114 | 0.805 | 0.7862 | 0.2189 | 0.0532 |
0.0815 | 79.0 | 1027 | 0.4874 | 0.805 | 0.3011 | 1.4092 | 0.805 | 0.7862 | 0.2347 | 0.0533 |
0.0815 | 80.0 | 1040 | 0.4877 | 0.805 | 0.3012 | 1.4110 | 0.805 | 0.7862 | 0.2272 | 0.0532 |
0.0815 | 81.0 | 1053 | 0.4876 | 0.805 | 0.3012 | 1.4092 | 0.805 | 0.7862 | 0.2259 | 0.0532 |
0.0815 | 82.0 | 1066 | 0.4873 | 0.805 | 0.3009 | 1.4103 | 0.805 | 0.7862 | 0.2171 | 0.0531 |
0.0815 | 83.0 | 1079 | 0.4875 | 0.805 | 0.3011 | 1.4091 | 0.805 | 0.7862 | 0.2260 | 0.0532 |
0.0815 | 84.0 | 1092 | 0.4875 | 0.805 | 0.3010 | 1.4108 | 0.805 | 0.7862 | 0.2346 | 0.0532 |
0.0815 | 85.0 | 1105 | 0.4876 | 0.805 | 0.3012 | 1.4098 | 0.805 | 0.7862 | 0.2276 | 0.0531 |
0.0815 | 86.0 | 1118 | 0.4876 | 0.805 | 0.3011 | 1.4127 | 0.805 | 0.7862 | 0.2272 | 0.0532 |
0.0815 | 87.0 | 1131 | 0.4875 | 0.805 | 0.3011 | 1.4093 | 0.805 | 0.7862 | 0.2275 | 0.0531 |
0.0815 | 88.0 | 1144 | 0.4876 | 0.805 | 0.3011 | 1.4092 | 0.805 | 0.7862 | 0.2184 | 0.0531 |
0.0815 | 89.0 | 1157 | 0.4874 | 0.805 | 0.3011 | 1.4086 | 0.805 | 0.7862 | 0.2271 | 0.0531 |
0.0815 | 90.0 | 1170 | 0.4875 | 0.805 | 0.3011 | 1.4098 | 0.805 | 0.7862 | 0.2272 | 0.0531 |
0.0815 | 91.0 | 1183 | 0.4875 | 0.805 | 0.3011 | 1.4104 | 0.805 | 0.7862 | 0.2275 | 0.0531 |
0.0815 | 92.0 | 1196 | 0.4874 | 0.805 | 0.3010 | 1.4092 | 0.805 | 0.7862 | 0.2183 | 0.0531 |
0.0815 | 93.0 | 1209 | 0.4876 | 0.805 | 0.3011 | 1.4097 | 0.805 | 0.7862 | 0.2276 | 0.0531 |
0.0815 | 94.0 | 1222 | 0.4874 | 0.805 | 0.3009 | 1.4095 | 0.805 | 0.7862 | 0.2180 | 0.0529 |
0.0815 | 95.0 | 1235 | 0.4874 | 0.805 | 0.3011 | 1.4092 | 0.805 | 0.7862 | 0.2182 | 0.0531 |
0.0815 | 96.0 | 1248 | 0.4875 | 0.805 | 0.3011 | 1.4100 | 0.805 | 0.7862 | 0.2183 | 0.0531 |
0.0815 | 97.0 | 1261 | 0.4875 | 0.805 | 0.3011 | 1.4097 | 0.805 | 0.7862 | 0.2181 | 0.0530 |
0.0815 | 98.0 | 1274 | 0.4876 | 0.805 | 0.3011 | 1.4098 | 0.805 | 0.7862 | 0.2181 | 0.0530 |
0.0815 | 99.0 | 1287 | 0.4875 | 0.805 | 0.3011 | 1.4097 | 0.805 | 0.7862 | 0.2273 | 0.0531 |
0.0815 | 100.0 | 1300 | 0.4875 | 0.805 | 0.3011 | 1.4097 | 0.805 | 0.7862 | 0.2181 | 0.0530 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2