# vit-tiny_tobacco3482
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset. It achieves the following results on the evaluation set (see the metric sketch after this list for how the calibration metrics are defined):
- Loss: 0.2012
- Accuracy: 0.825
- Brier Loss: 0.3279
- NLL: 1.1568
- F1 Micro: 0.825
- F1 Macro: 0.7904
- ECE: 0.2679
- AURC: 0.0635
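
Of these, the Brier loss, NLL, ECE, and AURC are calibration-oriented metrics that are less common than accuracy. For orientation, here is a minimal NumPy sketch of the standard definitions of the first three; it is an illustration only, not the evaluation code that produced the numbers above, and normalization conventions (e.g. for the multiclass Brier score) vary between implementations.

```python
import numpy as np

def brier_loss(probs, labels):
    """Multiclass Brier score: mean squared error between predicted
    probabilities and one-hot labels (summed over classes)."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def nll(probs, labels, eps=1e-12):
    """Negative log-likelihood of the true class."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def ece(probs, labels, n_bins=15):
    """Expected calibration error: |accuracy - confidence|,
    averaged over equal-width confidence bins weighted by bin size."""
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    err = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            err += mask.mean() * abs((pred[mask] == labels[mask]).mean() - conf[mask].mean())
    return err
```

AURC (area under the risk-coverage curve) additionally requires sorting predictions by confidence and is omitted here for brevity.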
## Model description
More information needed
## Intended uses & limitations
More information needed
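
No usage guidance is provided. As a generic starting point, the checkpoint should load through the standard `transformers` image-classification classes; the repository id below is a placeholder, not a confirmed Hub location:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "your-username/vit-tiny_tobacco3482"  # placeholder: substitute the actual repo id

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

# A Tobacco3482-style document scan; ViT expects 3-channel input.
image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label.get(pred, pred))
```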
## Training and evaluation data
More information needed
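
The card does not document the data, but Tobacco3482 is a standard document-image benchmark of 3,482 scanned documents in 10 classes (e.g. Email, Letter, Memo, Resume). A sketch of loading a local copy with the generic `imagefolder` builder follows; the directory layout and the 80/20 split are assumptions, not necessarily the split used for this card:

```python
from datasets import load_dataset

# Assumption: a local copy of Tobacco3482 arranged one sub-folder per class,
# e.g. tobacco3482/Email/*.jpg, tobacco3482/Letter/*.jpg, ...
# The "imagefolder" builder infers labels from the directory names.
dataset = load_dataset("imagefolder", data_dir="tobacco3482")
splits = dataset["train"].train_test_split(test_size=0.2, seed=42)
```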
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
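
These settings map onto the standard `transformers` `Trainer` API roughly as follows. This is a reconstruction from the list above, assuming the default Trainer loop; the `output_dir` and `evaluation_strategy` values are assumptions, and data loading and metric callbacks are omitted:

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameter list above (assumed, not the original script).
training_args = TrainingArguments(
    output_dir="vit-tiny_tobacco3482",  # assumption: any output path works
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumption: the per-epoch table below suggests epoch-level eval
)
# The Trainer's default optimizer (AdamW with betas=(0.9, 0.999), eps=1e-8)
# matches the Adam settings listed above.
```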
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 7 | 1.7837 | 0.1 | 1.0203 | 8.4259 | 0.1000 | 0.0850 | 0.3286 | 0.9000 |
No log | 2.0 | 14 | 1.0637 | 0.25 | 0.8576 | 5.3606 | 0.25 | 0.1988 | 0.2806 | 0.7500 |
No log | 3.0 | 21 | 0.7848 | 0.41 | 0.7314 | 3.9906 | 0.41 | 0.3109 | 0.2967 | 0.3924 |
No log | 4.0 | 28 | 0.5442 | 0.565 | 0.5856 | 2.7941 | 0.565 | 0.4762 | 0.2765 | 0.2125 |
No log | 5.0 | 35 | 0.4420 | 0.68 | 0.4947 | 1.7142 | 0.68 | 0.6058 | 0.2692 | 0.1565 |
No log | 6.0 | 42 | 0.3671 | 0.7 | 0.4278 | 1.6759 | 0.7 | 0.6223 | 0.2685 | 0.1186 |
No log | 7.0 | 49 | 0.3163 | 0.765 | 0.3959 | 1.6600 | 0.765 | 0.6862 | 0.2639 | 0.1117 |
No log | 8.0 | 56 | 0.3540 | 0.77 | 0.3688 | 1.4505 | 0.7700 | 0.7148 | 0.2634 | 0.0998 |
No log | 9.0 | 63 | 0.3875 | 0.725 | 0.3934 | 1.4205 | 0.7250 | 0.6918 | 0.2622 | 0.1161 |
No log | 10.0 | 70 | 0.3546 | 0.775 | 0.3523 | 1.3197 | 0.775 | 0.7619 | 0.2558 | 0.0835 |
No log | 11.0 | 77 | 0.3046 | 0.79 | 0.3515 | 1.2730 | 0.79 | 0.7594 | 0.2526 | 0.0978 |
No log | 12.0 | 84 | 0.3059 | 0.8 | 0.3313 | 1.2700 | 0.8000 | 0.7796 | 0.2520 | 0.0783 |
No log | 13.0 | 91 | 0.3075 | 0.775 | 0.3397 | 1.2980 | 0.775 | 0.7501 | 0.2632 | 0.0702 |
No log | 14.0 | 98 | 0.2917 | 0.79 | 0.3618 | 1.3528 | 0.79 | 0.7715 | 0.2744 | 0.0843 |
No log | 15.0 | 105 | 0.2393 | 0.825 | 0.3268 | 1.1212 | 0.825 | 0.8022 | 0.2498 | 0.0656 |
No log | 16.0 | 112 | 0.2862 | 0.8 | 0.3444 | 1.2675 | 0.8000 | 0.7622 | 0.2611 | 0.0689 |
No log | 17.0 | 119 | 0.2539 | 0.79 | 0.3390 | 1.2317 | 0.79 | 0.7632 | 0.2492 | 0.0739 |
No log | 18.0 | 126 | 0.2338 | 0.82 | 0.3359 | 1.2455 | 0.82 | 0.7838 | 0.2865 | 0.0914 |
No log | 19.0 | 133 | 0.2357 | 0.825 | 0.3197 | 1.0057 | 0.825 | 0.7935 | 0.2656 | 0.0727 |
No log | 20.0 | 140 | 0.2711 | 0.81 | 0.3525 | 1.1392 | 0.81 | 0.7852 | 0.2961 | 0.0670 |
No log | 21.0 | 147 | 0.2341 | 0.795 | 0.3534 | 1.3905 | 0.795 | 0.7645 | 0.2634 | 0.0999 |
No log | 22.0 | 154 | 0.2635 | 0.795 | 0.3382 | 1.2001 | 0.795 | 0.7860 | 0.2625 | 0.0635 |
No log | 23.0 | 161 | 0.2176 | 0.82 | 0.3271 | 0.9072 | 0.82 | 0.7972 | 0.2703 | 0.0680 |
No log | 24.0 | 168 | 0.2512 | 0.835 | 0.3329 | 1.2192 | 0.835 | 0.8160 | 0.2980 | 0.0626 |
No log | 25.0 | 175 | 0.2169 | 0.805 | 0.3414 | 1.1117 | 0.805 | 0.7912 | 0.2798 | 0.0776 |
No log | 26.0 | 182 | 0.2227 | 0.84 | 0.3264 | 1.0267 | 0.8400 | 0.8267 | 0.2985 | 0.0669 |
No log | 27.0 | 189 | 0.2302 | 0.79 | 0.3342 | 1.1603 | 0.79 | 0.7708 | 0.2589 | 0.0680 |
No log | 28.0 | 196 | 0.2215 | 0.805 | 0.3324 | 1.1168 | 0.805 | 0.7786 | 0.2826 | 0.0655 |
No log | 29.0 | 203 | 0.2022 | 0.82 | 0.3217 | 0.9587 | 0.82 | 0.7874 | 0.2865 | 0.0646 |
No log | 30.0 | 210 | 0.2142 | 0.805 | 0.3287 | 1.1199 | 0.805 | 0.7855 | 0.2526 | 0.0599 |
No log | 31.0 | 217 | 0.2035 | 0.795 | 0.3272 | 1.0385 | 0.795 | 0.7717 | 0.2599 | 0.0777 |
No log | 32.0 | 224 | 0.2079 | 0.835 | 0.3246 | 0.9399 | 0.835 | 0.8045 | 0.2974 | 0.0586 |
No log | 33.0 | 231 | 0.2071 | 0.81 | 0.3173 | 1.2784 | 0.81 | 0.7848 | 0.2520 | 0.0652 |
No log | 34.0 | 238 | 0.2070 | 0.815 | 0.3217 | 1.1020 | 0.815 | 0.7855 | 0.2633 | 0.0634 |
No log | 35.0 | 245 | 0.2128 | 0.82 | 0.3235 | 1.2763 | 0.82 | 0.7800 | 0.2771 | 0.0593 |
No log | 36.0 | 252 | 0.2093 | 0.825 | 0.3221 | 1.1203 | 0.825 | 0.8030 | 0.2666 | 0.0580 |
No log | 37.0 | 259 | 0.1995 | 0.815 | 0.3240 | 1.0387 | 0.815 | 0.7831 | 0.2712 | 0.0659 |
No log | 38.0 | 266 | 0.1977 | 0.82 | 0.3207 | 1.0955 | 0.82 | 0.7846 | 0.2589 | 0.0629 |
No log | 39.0 | 273 | 0.2062 | 0.82 | 0.3235 | 1.0691 | 0.82 | 0.7911 | 0.2666 | 0.0616 |
No log | 40.0 | 280 | 0.1993 | 0.825 | 0.3266 | 1.0812 | 0.825 | 0.7973 | 0.2755 | 0.0671 |
No log | 41.0 | 287 | 0.1976 | 0.82 | 0.3288 | 1.1043 | 0.82 | 0.7948 | 0.2646 | 0.0688 |
No log | 42.0 | 294 | 0.2040 | 0.825 | 0.3308 | 1.2371 | 0.825 | 0.7964 | 0.2782 | 0.0629 |
No log | 43.0 | 301 | 0.2000 | 0.835 | 0.3224 | 1.0857 | 0.835 | 0.8041 | 0.2882 | 0.0584 |
No log | 44.0 | 308 | 0.1987 | 0.83 | 0.3222 | 1.0746 | 0.83 | 0.7959 | 0.2837 | 0.0631 |
No log | 45.0 | 315 | 0.2026 | 0.82 | 0.3248 | 1.1471 | 0.82 | 0.7887 | 0.2843 | 0.0633 |
No log | 46.0 | 322 | 0.2014 | 0.825 | 0.3258 | 1.1310 | 0.825 | 0.7916 | 0.2915 | 0.0627 |
No log | 47.0 | 329 | 0.1988 | 0.83 | 0.3237 | 1.0291 | 0.83 | 0.7959 | 0.2811 | 0.0633 |
No log | 48.0 | 336 | 0.1989 | 0.82 | 0.3273 | 1.1741 | 0.82 | 0.7871 | 0.2699 | 0.0640 |
No log | 49.0 | 343 | 0.1995 | 0.82 | 0.3251 | 1.1518 | 0.82 | 0.7869 | 0.2742 | 0.0631 |
No log | 50.0 | 350 | 0.1996 | 0.825 | 0.3241 | 1.0873 | 0.825 | 0.7900 | 0.2659 | 0.0616 |
No log | 51.0 | 357 | 0.1995 | 0.83 | 0.3248 | 1.1532 | 0.83 | 0.7933 | 0.2651 | 0.0618 |
No log | 52.0 | 364 | 0.1983 | 0.825 | 0.3251 | 1.2117 | 0.825 | 0.7904 | 0.2967 | 0.0634 |
No log | 53.0 | 371 | 0.1984 | 0.825 | 0.3254 | 1.1566 | 0.825 | 0.7904 | 0.2695 | 0.0635 |
No log | 54.0 | 378 | 0.1996 | 0.825 | 0.3249 | 1.1259 | 0.825 | 0.7904 | 0.2841 | 0.0625 |
No log | 55.0 | 385 | 0.1996 | 0.825 | 0.3252 | 1.1424 | 0.825 | 0.7904 | 0.2790 | 0.0616 |
No log | 56.0 | 392 | 0.2004 | 0.825 | 0.3243 | 1.1391 | 0.825 | 0.7904 | 0.2857 | 0.0623 |
No log | 57.0 | 399 | 0.2004 | 0.825 | 0.3259 | 1.2109 | 0.825 | 0.7904 | 0.2788 | 0.0617 |
No log | 58.0 | 406 | 0.2007 | 0.825 | 0.3262 | 1.1473 | 0.825 | 0.7900 | 0.2842 | 0.0630 |
No log | 59.0 | 413 | 0.2000 | 0.82 | 0.3264 | 1.2066 | 0.82 | 0.7871 | 0.2698 | 0.0638 |
No log | 60.0 | 420 | 0.1994 | 0.82 | 0.3263 | 1.1542 | 0.82 | 0.7871 | 0.2822 | 0.0640 |
No log | 61.0 | 427 | 0.1994 | 0.825 | 0.3261 | 1.1506 | 0.825 | 0.7904 | 0.2683 | 0.0635 |
No log | 62.0 | 434 | 0.2005 | 0.82 | 0.3276 | 1.1754 | 0.82 | 0.7871 | 0.2767 | 0.0639 |
No log | 63.0 | 441 | 0.2006 | 0.82 | 0.3269 | 1.2127 | 0.82 | 0.7871 | 0.2811 | 0.0635 |
No log | 64.0 | 448 | 0.2003 | 0.825 | 0.3265 | 1.1547 | 0.825 | 0.7904 | 0.2814 | 0.0630 |
No log | 65.0 | 455 | 0.2005 | 0.825 | 0.3268 | 1.1078 | 0.825 | 0.7904 | 0.3069 | 0.0629 |
No log | 66.0 | 462 | 0.2006 | 0.825 | 0.3268 | 1.0998 | 0.825 | 0.7904 | 0.3012 | 0.0627 |
No log | 67.0 | 469 | 0.2009 | 0.825 | 0.3265 | 1.1526 | 0.825 | 0.7904 | 0.2946 | 0.0623 |
No log | 68.0 | 476 | 0.2006 | 0.825 | 0.3269 | 1.1500 | 0.825 | 0.7904 | 0.2983 | 0.0631 |
No log | 69.0 | 483 | 0.2007 | 0.825 | 0.3272 | 1.1005 | 0.825 | 0.7904 | 0.2756 | 0.0635 |
No log | 70.0 | 490 | 0.2003 | 0.825 | 0.3271 | 1.0947 | 0.825 | 0.7904 | 0.2697 | 0.0635 |
No log | 71.0 | 497 | 0.2006 | 0.825 | 0.3272 | 1.1587 | 0.825 | 0.7904 | 0.2770 | 0.0636 |
0.1375 | 72.0 | 504 | 0.2008 | 0.825 | 0.3272 | 1.1549 | 0.825 | 0.7904 | 0.2749 | 0.0632 |
0.1375 | 73.0 | 511 | 0.2009 | 0.825 | 0.3272 | 1.1528 | 0.825 | 0.7904 | 0.2748 | 0.0624 |
0.1375 | 74.0 | 518 | 0.2013 | 0.825 | 0.3275 | 1.1528 | 0.825 | 0.7904 | 0.2757 | 0.0626 |
0.1375 | 75.0 | 525 | 0.2009 | 0.825 | 0.3277 | 1.1529 | 0.825 | 0.7904 | 0.2763 | 0.0632 |
0.1375 | 76.0 | 532 | 0.2009 | 0.825 | 0.3271 | 1.1526 | 0.825 | 0.7904 | 0.2754 | 0.0633 |
0.1375 | 77.0 | 539 | 0.2006 | 0.825 | 0.3274 | 1.1559 | 0.825 | 0.7904 | 0.2699 | 0.0636 |
0.1375 | 78.0 | 546 | 0.2005 | 0.825 | 0.3272 | 1.1499 | 0.825 | 0.7904 | 0.2755 | 0.0634 |
0.1375 | 79.0 | 553 | 0.2009 | 0.825 | 0.3277 | 1.1539 | 0.825 | 0.7904 | 0.2833 | 0.0634 |
0.1375 | 80.0 | 560 | 0.2009 | 0.825 | 0.3275 | 1.1551 | 0.825 | 0.7904 | 0.2751 | 0.0632 |
0.1375 | 81.0 | 567 | 0.2013 | 0.825 | 0.3276 | 1.1563 | 0.825 | 0.7904 | 0.2809 | 0.0634 |
0.1375 | 82.0 | 574 | 0.2010 | 0.825 | 0.3277 | 1.1545 | 0.825 | 0.7904 | 0.2752 | 0.0633 |
0.1375 | 83.0 | 581 | 0.2009 | 0.825 | 0.3275 | 1.1565 | 0.825 | 0.7904 | 0.2753 | 0.0634 |
0.1375 | 84.0 | 588 | 0.2009 | 0.825 | 0.3277 | 1.1564 | 0.825 | 0.7904 | 0.2817 | 0.0636 |
0.1375 | 85.0 | 595 | 0.2010 | 0.825 | 0.3277 | 1.1560 | 0.825 | 0.7904 | 0.2686 | 0.0633 |
0.1375 | 86.0 | 602 | 0.2010 | 0.825 | 0.3278 | 1.1560 | 0.825 | 0.7904 | 0.2755 | 0.0633 |
0.1375 | 87.0 | 609 | 0.2010 | 0.825 | 0.3277 | 1.1544 | 0.825 | 0.7904 | 0.2661 | 0.0634 |
0.1375 | 88.0 | 616 | 0.2012 | 0.825 | 0.3278 | 1.1571 | 0.825 | 0.7904 | 0.2612 | 0.0633 |
0.1375 | 89.0 | 623 | 0.2010 | 0.825 | 0.3278 | 1.1558 | 0.825 | 0.7904 | 0.2747 | 0.0635 |
0.1375 | 90.0 | 630 | 0.2010 | 0.825 | 0.3278 | 1.1564 | 0.825 | 0.7904 | 0.2687 | 0.0634 |
0.1375 | 91.0 | 637 | 0.2011 | 0.825 | 0.3278 | 1.1551 | 0.825 | 0.7904 | 0.2678 | 0.0633 |
0.1375 | 92.0 | 644 | 0.2011 | 0.825 | 0.3278 | 1.1559 | 0.825 | 0.7904 | 0.2759 | 0.0635 |
0.1375 | 93.0 | 651 | 0.2011 | 0.825 | 0.3279 | 1.1565 | 0.825 | 0.7904 | 0.2750 | 0.0633 |
0.1375 | 94.0 | 658 | 0.2011 | 0.825 | 0.3278 | 1.1567 | 0.825 | 0.7904 | 0.2760 | 0.0635 |
0.1375 | 95.0 | 665 | 0.2011 | 0.825 | 0.3279 | 1.1572 | 0.825 | 0.7904 | 0.2679 | 0.0633 |
0.1375 | 96.0 | 672 | 0.2012 | 0.825 | 0.3279 | 1.1574 | 0.825 | 0.7904 | 0.2679 | 0.0634 |
0.1375 | 97.0 | 679 | 0.2012 | 0.825 | 0.3279 | 1.1563 | 0.825 | 0.7904 | 0.2678 | 0.0634 |
0.1375 | 98.0 | 686 | 0.2012 | 0.825 | 0.3279 | 1.1567 | 0.825 | 0.7904 | 0.2678 | 0.0634 |
0.1375 | 99.0 | 693 | 0.2012 | 0.825 | 0.3279 | 1.1569 | 0.825 | 0.7904 | 0.2679 | 0.0635 |
0.1375 | 100.0 | 700 | 0.2012 | 0.825 | 0.3279 | 1.1568 | 0.825 | 0.7904 | 0.2679 | 0.0635 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2