# 18-tiny_tobacco3482_kd_NKD_t1.0_g1.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset (as indicated by the model name). It achieves the following results on the evaluation set:
- Loss: 4.0957
- Accuracy: 0.805
- Brier Loss: 0.2927
- NLL: 1.1753
- F1 Micro: 0.805
- F1 Macro: 0.7833
- ECE: 0.1572
- AURC: 0.0655
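The card does not include the evaluation code, so as a rough, illustrative sketch, the calibration-oriented metrics above (Brier loss, NLL, and ECE) can be computed from predicted class-probability vectors and integer labels like this (the function names and the 15-bin ECE are assumptions, not the card's actual implementation):

```python
import math

def brier_loss(probs, labels):
    """Multiclass Brier score: mean squared distance between the
    probability vector and the one-hot label, summed over classes."""
    n, k = len(probs), len(probs[0])
    total = 0.0
    for p, y in zip(probs, labels):
        total += sum((p[c] - (1.0 if c == y else 0.0)) ** 2 for c in range(k))
    return total / n

def nll(probs, labels, eps=1e-12):
    """Negative log-likelihood of the true class."""
    return -sum(math.log(max(p[y], eps)) for p, y in zip(probs, labels)) / len(probs)

def ece(probs, labels, n_bins=15):
    """Expected Calibration Error: bin samples by top-class confidence,
    then average |accuracy - confidence| weighted by bin size."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        conf = max(p)
        pred = p.index(conf)
        idx = min(int(conf * n_bins), n_bins - 1)
        bins[idx].append((conf, 1.0 if pred == y else 0.0))
    total, err = len(probs), 0.0
    for b in bins:
        if b:
            avg_conf = sum(c for c, _ in b) / len(b)
            avg_acc = sum(a for _, a in b) / len(b)
            err += abs(avg_acc - avg_conf) * len(b) / total
    return err

# Toy check with two 3-class predictions, both correct
probs = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]]
labels = [0, 1]
print(round(brier_loss(probs, labels), 4))  # 0.1
print(round(nll(probs, labels), 4))         # 0.2899
print(round(ece(probs, labels), 4))         # 0.25
```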
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
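As a minimal sketch of the schedule implied by these hyperparameters (linear decay with a 10% warmup ratio, mirroring the behavior of `transformers`' `get_linear_schedule_with_warmup`; this is an illustration, not the Trainer's actual code):

```python
def linear_warmup_lr(step, total_steps, base_lr=1e-4, warmup_ratio=0.1):
    """Linear warmup from 0 to base_lr over the first warmup_ratio of
    training, then linear decay back to 0 at total_steps."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

total = 700  # 100 epochs x 7 steps/epoch, as in the results table below
print(linear_warmup_lr(0, total))    # 0.0    (start of warmup)
print(linear_warmup_lr(70, total))   # 0.0001 (peak lr at end of warmup)
print(linear_warmup_lr(700, total))  # 0.0    (end of training)
```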
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 7 | 4.7898 | 0.1 | 1.0292 | 9.4902 | 0.1000 | 0.0772 | 0.3220 | 0.9001 |
No log | 2.0 | 14 | 3.9970 | 0.1 | 0.9420 | 10.0981 | 0.1000 | 0.1071 | 0.2441 | 0.8581 |
No log | 3.0 | 21 | 3.6641 | 0.075 | 0.8956 | 9.5324 | 0.075 | 0.0777 | 0.1896 | 0.9137 |
No log | 4.0 | 28 | 3.6014 | 0.18 | 0.8691 | 9.6679 | 0.18 | 0.0781 | 0.2345 | 0.5824 |
No log | 5.0 | 35 | 3.5833 | 0.23 | 0.8347 | 9.6569 | 0.23 | 0.1572 | 0.2618 | 0.5205 |
No log | 6.0 | 42 | 3.5576 | 0.44 | 0.7860 | 5.9410 | 0.44 | 0.2946 | 0.3475 | 0.3232 |
No log | 7.0 | 49 | 3.5400 | 0.575 | 0.7404 | 4.2387 | 0.575 | 0.4638 | 0.4007 | 0.2294 |
No log | 8.0 | 56 | 3.5319 | 0.545 | 0.7181 | 4.5958 | 0.545 | 0.4482 | 0.3502 | 0.2374 |
No log | 9.0 | 63 | 3.5405 | 0.52 | 0.7002 | 3.9862 | 0.52 | 0.4101 | 0.3148 | 0.2506 |
No log | 10.0 | 70 | 3.5341 | 0.61 | 0.6897 | 3.2707 | 0.61 | 0.5118 | 0.3775 | 0.2235 |
No log | 11.0 | 77 | 3.5259 | 0.66 | 0.6771 | 2.6882 | 0.66 | 0.5201 | 0.4365 | 0.1420 |
No log | 12.0 | 84 | 3.5215 | 0.66 | 0.6463 | 2.4544 | 0.66 | 0.5387 | 0.3750 | 0.1664 |
No log | 13.0 | 91 | 3.5363 | 0.58 | 0.6232 | 2.3149 | 0.58 | 0.5090 | 0.3285 | 0.1858 |
No log | 14.0 | 98 | 3.5161 | 0.675 | 0.6008 | 2.6144 | 0.675 | 0.5411 | 0.3690 | 0.1237 |
No log | 15.0 | 105 | 3.5073 | 0.67 | 0.5845 | 2.1229 | 0.67 | 0.5577 | 0.3405 | 0.1350 |
No log | 16.0 | 112 | 3.5272 | 0.67 | 0.5338 | 2.4215 | 0.67 | 0.5603 | 0.3154 | 0.1325 |
No log | 17.0 | 119 | 3.5332 | 0.695 | 0.5367 | 2.1675 | 0.695 | 0.6056 | 0.3140 | 0.1071 |
No log | 18.0 | 126 | 3.5659 | 0.655 | 0.4841 | 1.9565 | 0.655 | 0.5559 | 0.2600 | 0.1365 |
No log | 19.0 | 133 | 3.5438 | 0.69 | 0.4817 | 1.8201 | 0.69 | 0.5735 | 0.2574 | 0.1202 |
No log | 20.0 | 140 | 3.5019 | 0.74 | 0.4725 | 1.6346 | 0.74 | 0.6486 | 0.2939 | 0.0931 |
No log | 21.0 | 147 | 3.5236 | 0.755 | 0.4407 | 1.3134 | 0.755 | 0.6811 | 0.2762 | 0.0820 |
No log | 22.0 | 154 | 3.5303 | 0.755 | 0.4143 | 1.2834 | 0.755 | 0.6843 | 0.2434 | 0.0806 |
No log | 23.0 | 161 | 3.5541 | 0.77 | 0.4034 | 1.4417 | 0.7700 | 0.6891 | 0.2382 | 0.0842 |
No log | 24.0 | 168 | 3.5675 | 0.765 | 0.3853 | 1.6692 | 0.765 | 0.7072 | 0.2309 | 0.0807 |
No log | 25.0 | 175 | 3.5411 | 0.745 | 0.3914 | 1.2777 | 0.745 | 0.6720 | 0.2271 | 0.0784 |
No log | 26.0 | 182 | 3.5877 | 0.75 | 0.3710 | 1.4838 | 0.75 | 0.6717 | 0.2082 | 0.0789 |
No log | 27.0 | 189 | 3.6026 | 0.77 | 0.3483 | 1.4211 | 0.7700 | 0.7018 | 0.2089 | 0.0694 |
No log | 28.0 | 196 | 3.6374 | 0.78 | 0.3365 | 1.3205 | 0.78 | 0.7181 | 0.1953 | 0.0694 |
No log | 29.0 | 203 | 3.7319 | 0.775 | 0.3538 | 1.2749 | 0.775 | 0.7012 | 0.2149 | 0.0814 |
No log | 30.0 | 210 | 3.6359 | 0.805 | 0.3291 | 1.3272 | 0.805 | 0.7761 | 0.1991 | 0.0637 |
No log | 31.0 | 217 | 3.7160 | 0.785 | 0.3337 | 1.2632 | 0.785 | 0.7445 | 0.1727 | 0.0757 |
No log | 32.0 | 224 | 3.6810 | 0.8 | 0.3234 | 1.3720 | 0.8000 | 0.7636 | 0.1999 | 0.0649 |
No log | 33.0 | 231 | 3.7139 | 0.82 | 0.3221 | 1.2150 | 0.82 | 0.7919 | 0.2051 | 0.0677 |
No log | 34.0 | 238 | 3.7286 | 0.795 | 0.3130 | 1.0622 | 0.795 | 0.7575 | 0.1919 | 0.0639 |
No log | 35.0 | 245 | 3.7807 | 0.795 | 0.3154 | 1.0146 | 0.795 | 0.7672 | 0.1565 | 0.0714 |
No log | 36.0 | 252 | 3.6802 | 0.815 | 0.3131 | 1.0083 | 0.815 | 0.7933 | 0.2051 | 0.0626 |
No log | 37.0 | 259 | 3.7369 | 0.81 | 0.3168 | 1.0017 | 0.81 | 0.7862 | 0.1792 | 0.0690 |
No log | 38.0 | 266 | 3.7638 | 0.82 | 0.2971 | 1.3357 | 0.82 | 0.7977 | 0.1913 | 0.0628 |
No log | 39.0 | 273 | 3.7415 | 0.825 | 0.2954 | 1.0423 | 0.825 | 0.8072 | 0.1893 | 0.0599 |
No log | 40.0 | 280 | 3.8005 | 0.785 | 0.3140 | 1.0817 | 0.785 | 0.7453 | 0.1694 | 0.0684 |
No log | 41.0 | 287 | 3.7901 | 0.82 | 0.3127 | 1.0853 | 0.82 | 0.7993 | 0.1789 | 0.0673 |
No log | 42.0 | 294 | 3.7811 | 0.825 | 0.3019 | 1.2712 | 0.825 | 0.8020 | 0.1644 | 0.0644 |
No log | 43.0 | 301 | 3.7689 | 0.81 | 0.3110 | 0.8553 | 0.81 | 0.7932 | 0.1785 | 0.0645 |
No log | 44.0 | 308 | 3.7796 | 0.82 | 0.2919 | 1.2589 | 0.82 | 0.7972 | 0.1875 | 0.0643 |
No log | 45.0 | 315 | 3.8005 | 0.805 | 0.3036 | 1.1993 | 0.805 | 0.7789 | 0.1840 | 0.0660 |
No log | 46.0 | 322 | 3.7811 | 0.82 | 0.2909 | 1.0962 | 0.82 | 0.8004 | 0.1735 | 0.0618 |
No log | 47.0 | 329 | 3.8145 | 0.8 | 0.3040 | 1.1968 | 0.8000 | 0.7759 | 0.1795 | 0.0671 |
No log | 48.0 | 336 | 3.7969 | 0.835 | 0.2816 | 1.1019 | 0.835 | 0.8118 | 0.1624 | 0.0603 |
No log | 49.0 | 343 | 3.8020 | 0.815 | 0.2855 | 1.0383 | 0.815 | 0.7978 | 0.1556 | 0.0639 |
No log | 50.0 | 350 | 3.8049 | 0.815 | 0.2884 | 1.1121 | 0.815 | 0.7935 | 0.1608 | 0.0616 |
No log | 51.0 | 357 | 3.8048 | 0.81 | 0.2873 | 1.1173 | 0.81 | 0.7898 | 0.1574 | 0.0632 |
No log | 52.0 | 364 | 3.8581 | 0.8 | 0.2923 | 1.1257 | 0.8000 | 0.7767 | 0.1436 | 0.0664 |
No log | 53.0 | 371 | 3.8565 | 0.79 | 0.2984 | 1.0513 | 0.79 | 0.7670 | 0.1622 | 0.0668 |
No log | 54.0 | 378 | 3.8787 | 0.805 | 0.2901 | 1.0619 | 0.805 | 0.7874 | 0.1335 | 0.0655 |
No log | 55.0 | 385 | 3.8777 | 0.805 | 0.2940 | 1.0378 | 0.805 | 0.7883 | 0.1450 | 0.0647 |
No log | 56.0 | 392 | 3.8743 | 0.805 | 0.2906 | 1.1702 | 0.805 | 0.7849 | 0.1610 | 0.0634 |
No log | 57.0 | 399 | 3.9082 | 0.795 | 0.2959 | 1.0951 | 0.795 | 0.7711 | 0.1761 | 0.0662 |
No log | 58.0 | 406 | 3.8894 | 0.8 | 0.2898 | 1.0979 | 0.8000 | 0.7816 | 0.1774 | 0.0638 |
No log | 59.0 | 413 | 3.9005 | 0.825 | 0.2914 | 1.2358 | 0.825 | 0.8088 | 0.1687 | 0.0637 |
No log | 60.0 | 420 | 3.9115 | 0.815 | 0.2863 | 1.0318 | 0.815 | 0.7928 | 0.1672 | 0.0640 |
No log | 61.0 | 427 | 3.9172 | 0.805 | 0.2956 | 1.1397 | 0.805 | 0.7884 | 0.1646 | 0.0667 |
No log | 62.0 | 434 | 3.8993 | 0.82 | 0.2862 | 1.2349 | 0.82 | 0.8001 | 0.1544 | 0.0645 |
No log | 63.0 | 441 | 3.9334 | 0.825 | 0.2896 | 1.1718 | 0.825 | 0.8061 | 0.1662 | 0.0646 |
No log | 64.0 | 448 | 3.9179 | 0.815 | 0.2861 | 1.1727 | 0.815 | 0.7966 | 0.1592 | 0.0650 |
No log | 65.0 | 455 | 3.9489 | 0.8 | 0.2981 | 1.1681 | 0.8000 | 0.7805 | 0.1522 | 0.0674 |
No log | 66.0 | 462 | 3.9372 | 0.81 | 0.2855 | 1.1041 | 0.81 | 0.7870 | 0.1709 | 0.0647 |
No log | 67.0 | 469 | 3.9651 | 0.8 | 0.2935 | 1.1723 | 0.8000 | 0.7816 | 0.1492 | 0.0667 |
No log | 68.0 | 476 | 3.9600 | 0.815 | 0.2903 | 1.1687 | 0.815 | 0.7950 | 0.1466 | 0.0650 |
No log | 69.0 | 483 | 3.9695 | 0.82 | 0.2908 | 1.1251 | 0.82 | 0.8026 | 0.1532 | 0.0654 |
No log | 70.0 | 490 | 3.9817 | 0.805 | 0.2915 | 1.1879 | 0.805 | 0.7861 | 0.1537 | 0.0657 |
No log | 71.0 | 497 | 3.9838 | 0.81 | 0.2899 | 1.1688 | 0.81 | 0.7892 | 0.1538 | 0.0648 |
3.4085 | 72.0 | 504 | 3.9960 | 0.805 | 0.2910 | 1.1702 | 0.805 | 0.7904 | 0.1568 | 0.0657 |
3.4085 | 73.0 | 511 | 4.0046 | 0.8 | 0.2931 | 1.1743 | 0.8000 | 0.7800 | 0.1529 | 0.0658 |
3.4085 | 74.0 | 518 | 4.0115 | 0.815 | 0.2917 | 1.1718 | 0.815 | 0.7968 | 0.1589 | 0.0647 |
3.4085 | 75.0 | 525 | 4.0205 | 0.805 | 0.2920 | 1.1719 | 0.805 | 0.7833 | 0.1575 | 0.0654 |
3.4085 | 76.0 | 532 | 4.0272 | 0.805 | 0.2919 | 1.1725 | 0.805 | 0.7833 | 0.1547 | 0.0659 |
3.4085 | 77.0 | 539 | 4.0323 | 0.81 | 0.2923 | 1.1720 | 0.81 | 0.7892 | 0.1547 | 0.0653 |
3.4085 | 78.0 | 546 | 4.0364 | 0.81 | 0.2907 | 1.1715 | 0.81 | 0.7892 | 0.1607 | 0.0650 |
3.4085 | 79.0 | 553 | 4.0405 | 0.81 | 0.2910 | 1.1716 | 0.81 | 0.7892 | 0.1451 | 0.0650 |
3.4085 | 80.0 | 560 | 4.0476 | 0.81 | 0.2917 | 1.1743 | 0.81 | 0.7892 | 0.1453 | 0.0650 |
3.4085 | 81.0 | 567 | 4.0529 | 0.805 | 0.2921 | 1.1736 | 0.805 | 0.7833 | 0.1573 | 0.0654 |
3.4085 | 82.0 | 574 | 4.0570 | 0.805 | 0.2919 | 1.1741 | 0.805 | 0.7861 | 0.1717 | 0.0655 |
3.4085 | 83.0 | 581 | 4.0601 | 0.81 | 0.2918 | 1.1727 | 0.81 | 0.7892 | 0.1508 | 0.0650 |
3.4085 | 84.0 | 588 | 4.0643 | 0.81 | 0.2919 | 1.1743 | 0.81 | 0.7892 | 0.1507 | 0.0652 |
3.4085 | 85.0 | 595 | 4.0678 | 0.81 | 0.2922 | 1.1744 | 0.81 | 0.7892 | 0.1552 | 0.0651 |
3.4085 | 86.0 | 602 | 4.0743 | 0.81 | 0.2925 | 1.1746 | 0.81 | 0.7892 | 0.1526 | 0.0651 |
3.4085 | 87.0 | 609 | 4.0758 | 0.805 | 0.2924 | 1.1753 | 0.805 | 0.7833 | 0.1718 | 0.0653 |
3.4085 | 88.0 | 616 | 4.0796 | 0.805 | 0.2924 | 1.1758 | 0.805 | 0.7833 | 0.1567 | 0.0654 |
3.4085 | 89.0 | 623 | 4.0803 | 0.81 | 0.2920 | 1.1742 | 0.81 | 0.7892 | 0.1587 | 0.0650 |
3.4085 | 90.0 | 630 | 4.0842 | 0.81 | 0.2925 | 1.1744 | 0.81 | 0.7892 | 0.1529 | 0.0651 |
3.4085 | 91.0 | 637 | 4.0864 | 0.805 | 0.2926 | 1.1752 | 0.805 | 0.7833 | 0.1568 | 0.0654 |
3.4085 | 92.0 | 644 | 4.0880 | 0.81 | 0.2925 | 1.1757 | 0.81 | 0.7892 | 0.1526 | 0.0651 |
3.4085 | 93.0 | 651 | 4.0903 | 0.805 | 0.2927 | 1.1752 | 0.805 | 0.7833 | 0.1567 | 0.0654 |
3.4085 | 94.0 | 658 | 4.0918 | 0.805 | 0.2927 | 1.1750 | 0.805 | 0.7833 | 0.1572 | 0.0655 |
3.4085 | 95.0 | 665 | 4.0927 | 0.805 | 0.2926 | 1.1750 | 0.805 | 0.7833 | 0.1570 | 0.0655 |
3.4085 | 96.0 | 672 | 4.0937 | 0.805 | 0.2927 | 1.1751 | 0.805 | 0.7833 | 0.1572 | 0.0655 |
3.4085 | 97.0 | 679 | 4.0946 | 0.805 | 0.2926 | 1.1750 | 0.805 | 0.7833 | 0.1573 | 0.0655 |
3.4085 | 98.0 | 686 | 4.0950 | 0.805 | 0.2926 | 1.1752 | 0.805 | 0.7833 | 0.1572 | 0.0655 |
3.4085 | 99.0 | 693 | 4.0955 | 0.805 | 0.2927 | 1.1753 | 0.805 | 0.7833 | 0.1572 | 0.0655 |
3.4085 | 100.0 | 700 | 4.0957 | 0.805 | 0.2927 | 1.1753 | 0.805 | 0.7833 | 0.1572 | 0.0655 |
### Framework versions

- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2