# 60-tiny_tobacco3482_hint_
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset (the model name suggests Tobacco3482). It achieves the following results on the evaluation set (a sketch of how the calibration metrics are typically computed follows the list):
- Loss: 63.5396
- Accuracy: 0.84
- Brier Loss: 0.3043
- NLL: 1.1495
- F1 Micro: 0.8400
- F1 Macro: 0.8244
- ECE: 0.1568
- AURC: 0.0457
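The calibration metrics above (Brier loss, NLL, ECE) are less standard than accuracy and F1, so here is a minimal sketch of how they are typically computed from softmax probabilities. The function names, array shapes, and the 15-bin ECE setting are assumptions for illustration, not the exact implementation used to produce the numbers above.

```python
import numpy as np

# probs: (num_samples, num_classes) softmax outputs; labels: (num_samples,) integer class ids.

def brier_loss(probs, labels):
    """Mean squared error between predicted probabilities and one-hot targets."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def nll(probs, labels, eps=1e-12):
    """Average negative log-likelihood of the true class."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def ece(probs, labels, n_bins=15):  # 15 bins is an assumption, not stated in the card
    """Expected Calibration Error: confidence-accuracy gap, weighted by bin size."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    correct = (predictions == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            total += in_bin.mean() * abs(correct[in_bin].mean() - confidences[in_bin].mean())
    return total
```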
## Model description
More information needed
## Intended uses & limitations
More information needed
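Pending a fuller description, the sketch below shows one plausible way to run inference with this checkpoint as a standard image classifier. The repository path, the input image, and the label lookup are placeholders, not confirmed details of this model.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder path: substitute the actual hub repository id or local checkpoint directory.
checkpoint = "60-tiny_tobacco3482_hint_"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

image = Image.open("document_page.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label.get(predicted_id, predicted_id))
```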
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
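As a rough reproduction aid, the hyperparameters above map onto a `TrainingArguments` configuration along these lines. The `output_dir` and the evaluation/save strategies are assumptions (the per-epoch validation rows below suggest epoch-level evaluation); the Adam betas and epsilon listed above are the library defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="60-tiny_tobacco3482_hint_",  # placeholder output directory
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch validation results below
    save_strategy="epoch",        # assumption
)
```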
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 25 | 66.6267 | 0.26 | 0.8706 | 4.9000 | 0.26 | 0.1920 | 0.2904 | 0.7812 |
No log | 2.0 | 50 | 65.8408 | 0.54 | 0.5951 | 2.8543 | 0.54 | 0.4535 | 0.2542 | 0.2567 |
No log | 3.0 | 75 | 65.3708 | 0.675 | 0.4400 | 1.6094 | 0.675 | 0.6134 | 0.2395 | 0.1333 |
No log | 4.0 | 100 | 65.0889 | 0.76 | 0.3809 | 1.5505 | 0.76 | 0.7422 | 0.2333 | 0.1125 |
No log | 5.0 | 125 | 64.7800 | 0.8 | 0.3080 | 1.7523 | 0.8000 | 0.7663 | 0.1734 | 0.0708 |
No log | 6.0 | 150 | 64.6296 | 0.78 | 0.3286 | 1.7771 | 0.78 | 0.7427 | 0.1752 | 0.0642 |
No log | 7.0 | 175 | 64.3879 | 0.765 | 0.3584 | 1.7400 | 0.765 | 0.6986 | 0.1799 | 0.0937 |
No log | 8.0 | 200 | 64.4361 | 0.72 | 0.4640 | 1.4368 | 0.72 | 0.7385 | 0.2350 | 0.1314 |
No log | 9.0 | 225 | 64.2223 | 0.76 | 0.3846 | 1.6420 | 0.76 | 0.7417 | 0.2006 | 0.0915 |
No log | 10.0 | 250 | 64.2618 | 0.725 | 0.4268 | 1.6667 | 0.7250 | 0.7132 | 0.2131 | 0.1110 |
No log | 11.0 | 275 | 64.2839 | 0.7 | 0.4830 | 1.7975 | 0.7 | 0.6829 | 0.2213 | 0.1207 |
No log | 12.0 | 300 | 64.0218 | 0.785 | 0.3523 | 1.7098 | 0.785 | 0.7363 | 0.1742 | 0.0702 |
No log | 13.0 | 325 | 63.8071 | 0.78 | 0.3218 | 1.4587 | 0.78 | 0.7574 | 0.1640 | 0.0674 |
No log | 14.0 | 350 | 64.7387 | 0.645 | 0.5871 | 2.0188 | 0.645 | 0.6360 | 0.2996 | 0.1765 |
No log | 15.0 | 375 | 64.2173 | 0.765 | 0.3832 | 1.8093 | 0.765 | 0.6909 | 0.1978 | 0.0892 |
No log | 16.0 | 400 | 64.2233 | 0.765 | 0.3897 | 1.4456 | 0.765 | 0.7432 | 0.1983 | 0.0805 |
No log | 17.0 | 425 | 63.7977 | 0.825 | 0.2971 | 1.4248 | 0.825 | 0.8057 | 0.1583 | 0.0546 |
No log | 18.0 | 450 | 63.5818 | 0.82 | 0.2983 | 1.3079 | 0.82 | 0.7936 | 0.1532 | 0.0474 |
No log | 19.0 | 475 | 64.1935 | 0.78 | 0.3764 | 1.6662 | 0.78 | 0.7618 | 0.1911 | 0.0669 |
63.0313 | 20.0 | 500 | 63.6054 | 0.825 | 0.2871 | 1.4054 | 0.825 | 0.8118 | 0.1605 | 0.0520 |
63.0313 | 21.0 | 525 | 63.6316 | 0.79 | 0.3258 | 1.3131 | 0.79 | 0.7714 | 0.1632 | 0.0485 |
63.0313 | 22.0 | 550 | 63.6978 | 0.84 | 0.2935 | 1.2425 | 0.8400 | 0.8236 | 0.1508 | 0.0586 |
63.0313 | 23.0 | 575 | 63.8266 | 0.825 | 0.3117 | 1.5766 | 0.825 | 0.8019 | 0.1550 | 0.0554 |
63.0313 | 24.0 | 600 | 63.6750 | 0.825 | 0.3130 | 1.1848 | 0.825 | 0.8158 | 0.1553 | 0.0462 |
63.0313 | 25.0 | 625 | 63.8469 | 0.82 | 0.3259 | 1.3997 | 0.82 | 0.8007 | 0.1603 | 0.0564 |
63.0313 | 26.0 | 650 | 63.7656 | 0.815 | 0.3285 | 1.2752 | 0.815 | 0.7969 | 0.1656 | 0.0535 |
63.0313 | 27.0 | 675 | 63.8074 | 0.805 | 0.3455 | 1.1282 | 0.805 | 0.7870 | 0.1732 | 0.0542 |
63.0313 | 28.0 | 700 | 63.8411 | 0.81 | 0.3437 | 1.1501 | 0.81 | 0.7917 | 0.1759 | 0.0529 |
63.0313 | 29.0 | 725 | 63.8158 | 0.81 | 0.3345 | 1.1519 | 0.81 | 0.7901 | 0.1706 | 0.0544 |
63.0313 | 30.0 | 750 | 63.7917 | 0.815 | 0.3383 | 1.2013 | 0.815 | 0.8006 | 0.1706 | 0.0557 |
63.0313 | 31.0 | 775 | 63.7855 | 0.815 | 0.3396 | 1.2088 | 0.815 | 0.7974 | 0.1687 | 0.0551 |
63.0313 | 32.0 | 800 | 63.8003 | 0.825 | 0.3297 | 1.2233 | 0.825 | 0.8091 | 0.1694 | 0.0547 |
63.0313 | 33.0 | 825 | 63.8029 | 0.815 | 0.3405 | 1.2628 | 0.815 | 0.8007 | 0.1729 | 0.0547 |
63.0313 | 34.0 | 850 | 63.7752 | 0.81 | 0.3352 | 1.2587 | 0.81 | 0.7979 | 0.1727 | 0.0574 |
63.0313 | 35.0 | 875 | 63.7800 | 0.815 | 0.3346 | 1.1948 | 0.815 | 0.7977 | 0.1679 | 0.0560 |
63.0313 | 36.0 | 900 | 63.7885 | 0.825 | 0.3313 | 1.2728 | 0.825 | 0.8173 | 0.1591 | 0.0569 |
63.0313 | 37.0 | 925 | 63.7730 | 0.815 | 0.3354 | 1.2726 | 0.815 | 0.8027 | 0.1689 | 0.0555 |
63.0313 | 38.0 | 950 | 63.8327 | 0.815 | 0.3405 | 1.4350 | 0.815 | 0.8043 | 0.1675 | 0.0632 |
63.0313 | 39.0 | 975 | 63.7324 | 0.785 | 0.3686 | 1.6439 | 0.785 | 0.7745 | 0.1808 | 0.0666 |
61.6786 | 40.0 | 1000 | 63.8625 | 0.765 | 0.3946 | 1.6127 | 0.765 | 0.7727 | 0.1961 | 0.0723 |
61.6786 | 41.0 | 1025 | 64.1254 | 0.765 | 0.3904 | 1.5456 | 0.765 | 0.7570 | 0.2020 | 0.0850 |
61.6786 | 42.0 | 1050 | 63.6201 | 0.78 | 0.3728 | 1.4198 | 0.78 | 0.7447 | 0.1869 | 0.0647 |
61.6786 | 43.0 | 1075 | 63.6033 | 0.835 | 0.2968 | 1.5430 | 0.835 | 0.8059 | 0.1574 | 0.0479 |
61.6786 | 44.0 | 1100 | 63.6777 | 0.795 | 0.3606 | 1.3542 | 0.795 | 0.7638 | 0.1806 | 0.0529 |
61.6786 | 45.0 | 1125 | 63.5747 | 0.83 | 0.2996 | 1.5403 | 0.83 | 0.8079 | 0.1450 | 0.0504 |
61.6786 | 46.0 | 1150 | 63.6022 | 0.805 | 0.3389 | 1.3842 | 0.805 | 0.7791 | 0.1794 | 0.0466 |
61.6786 | 47.0 | 1175 | 63.6342 | 0.81 | 0.3346 | 1.2861 | 0.81 | 0.7811 | 0.1678 | 0.0476 |
61.6786 | 48.0 | 1200 | 63.6065 | 0.81 | 0.3298 | 1.2911 | 0.81 | 0.7807 | 0.1654 | 0.0465 |
61.6786 | 49.0 | 1225 | 63.5937 | 0.815 | 0.3260 | 1.3576 | 0.815 | 0.7844 | 0.1613 | 0.0467 |
61.6786 | 50.0 | 1250 | 63.6029 | 0.815 | 0.3241 | 1.2826 | 0.815 | 0.7844 | 0.1662 | 0.0467 |
61.6786 | 51.0 | 1275 | 63.5947 | 0.81 | 0.3232 | 1.4156 | 0.81 | 0.7789 | 0.1631 | 0.0471 |
61.6786 | 52.0 | 1300 | 63.6501 | 0.81 | 0.3268 | 1.4148 | 0.81 | 0.7785 | 0.1703 | 0.0468 |
61.6786 | 53.0 | 1325 | 63.6207 | 0.81 | 0.3207 | 1.2785 | 0.81 | 0.7785 | 0.1698 | 0.0479 |
61.6786 | 54.0 | 1350 | 63.6021 | 0.815 | 0.3233 | 1.3519 | 0.815 | 0.7818 | 0.1629 | 0.0456 |
61.6786 | 55.0 | 1375 | 63.6128 | 0.815 | 0.3207 | 1.2837 | 0.815 | 0.7818 | 0.1641 | 0.0474 |
61.6786 | 56.0 | 1400 | 63.5974 | 0.81 | 0.3194 | 1.3542 | 0.81 | 0.7789 | 0.1679 | 0.0474 |
61.6786 | 57.0 | 1425 | 63.6173 | 0.81 | 0.3260 | 1.2907 | 0.81 | 0.7761 | 0.1653 | 0.0486 |
61.6786 | 58.0 | 1450 | 63.6057 | 0.81 | 0.3163 | 1.2981 | 0.81 | 0.7789 | 0.1651 | 0.0471 |
61.6786 | 59.0 | 1475 | 63.6052 | 0.81 | 0.3197 | 1.3444 | 0.81 | 0.7789 | 0.1680 | 0.0467 |
61.52 | 60.0 | 1500 | 63.5865 | 0.82 | 0.3143 | 1.2748 | 0.82 | 0.7920 | 0.1617 | 0.0465 |
61.52 | 61.0 | 1525 | 63.5754 | 0.82 | 0.3126 | 1.2677 | 0.82 | 0.7920 | 0.1595 | 0.0468 |
61.52 | 62.0 | 1550 | 63.5876 | 0.815 | 0.3120 | 1.2691 | 0.815 | 0.7879 | 0.1567 | 0.0478 |
61.52 | 63.0 | 1575 | 63.6040 | 0.82 | 0.3110 | 1.2632 | 0.82 | 0.7920 | 0.1526 | 0.0472 |
61.52 | 64.0 | 1600 | 63.5956 | 0.82 | 0.3111 | 1.1976 | 0.82 | 0.7963 | 0.1592 | 0.0468 |
61.52 | 65.0 | 1625 | 63.5792 | 0.815 | 0.3095 | 1.1928 | 0.815 | 0.7879 | 0.1571 | 0.0469 |
61.52 | 66.0 | 1650 | 63.5704 | 0.82 | 0.3086 | 1.2509 | 0.82 | 0.7936 | 0.1543 | 0.0467 |
61.52 | 67.0 | 1675 | 63.5918 | 0.82 | 0.3118 | 1.2536 | 0.82 | 0.7936 | 0.1619 | 0.0471 |
61.52 | 68.0 | 1700 | 63.5741 | 0.82 | 0.3072 | 1.2491 | 0.82 | 0.7963 | 0.1562 | 0.0465 |
61.52 | 69.0 | 1725 | 63.5581 | 0.825 | 0.3085 | 1.2490 | 0.825 | 0.8021 | 0.1566 | 0.0460 |
61.52 | 70.0 | 1750 | 63.5796 | 0.82 | 0.3087 | 1.2456 | 0.82 | 0.7963 | 0.1556 | 0.0471 |
61.52 | 71.0 | 1775 | 63.5776 | 0.825 | 0.3073 | 1.2530 | 0.825 | 0.8021 | 0.1571 | 0.0474 |
61.52 | 72.0 | 1800 | 63.5524 | 0.825 | 0.3064 | 1.2402 | 0.825 | 0.8021 | 0.1555 | 0.0465 |
61.52 | 73.0 | 1825 | 63.5638 | 0.825 | 0.3075 | 1.2465 | 0.825 | 0.8021 | 0.1607 | 0.0466 |
61.52 | 74.0 | 1850 | 63.5654 | 0.82 | 0.3058 | 1.2425 | 0.82 | 0.7963 | 0.1552 | 0.0468 |
61.52 | 75.0 | 1875 | 63.5654 | 0.825 | 0.3041 | 1.2439 | 0.825 | 0.8021 | 0.1563 | 0.0466 |
61.52 | 76.0 | 1900 | 63.5499 | 0.83 | 0.3018 | 1.2432 | 0.83 | 0.8082 | 0.1541 | 0.0463 |
61.52 | 77.0 | 1925 | 63.5563 | 0.825 | 0.3059 | 1.2385 | 0.825 | 0.8021 | 0.1570 | 0.0466 |
61.52 | 78.0 | 1950 | 63.5524 | 0.825 | 0.3045 | 1.2364 | 0.825 | 0.8021 | 0.1524 | 0.0464 |
61.52 | 79.0 | 1975 | 63.5507 | 0.825 | 0.3064 | 1.2344 | 0.825 | 0.8021 | 0.1523 | 0.0463 |
61.4257 | 80.0 | 2000 | 63.5531 | 0.825 | 0.3062 | 1.2266 | 0.825 | 0.8035 | 0.1625 | 0.0463 |
61.4257 | 81.0 | 2025 | 63.5486 | 0.825 | 0.3029 | 1.1850 | 0.825 | 0.8024 | 0.1506 | 0.0463 |
61.4257 | 82.0 | 2050 | 63.5479 | 0.82 | 0.3081 | 1.2269 | 0.82 | 0.7963 | 0.1588 | 0.0458 |
61.4257 | 83.0 | 2075 | 63.5444 | 0.835 | 0.3029 | 1.1721 | 0.835 | 0.8139 | 0.1475 | 0.0461 |
61.4257 | 84.0 | 2100 | 63.5435 | 0.835 | 0.3047 | 1.2306 | 0.835 | 0.8171 | 0.1529 | 0.0464 |
61.4257 | 85.0 | 2125 | 63.5393 | 0.83 | 0.3058 | 1.2255 | 0.83 | 0.8081 | 0.1462 | 0.0464 |
61.4257 | 86.0 | 2150 | 63.5437 | 0.835 | 0.3048 | 1.2254 | 0.835 | 0.8171 | 0.1481 | 0.0464 |
61.4257 | 87.0 | 2175 | 63.5463 | 0.83 | 0.3039 | 1.1549 | 0.83 | 0.8115 | 0.1562 | 0.0463 |
61.4257 | 88.0 | 2200 | 63.5408 | 0.835 | 0.3055 | 1.2211 | 0.835 | 0.8187 | 0.1485 | 0.0462 |
61.4257 | 89.0 | 2225 | 63.5477 | 0.825 | 0.3054 | 1.1541 | 0.825 | 0.8024 | 0.1521 | 0.0463 |
61.4257 | 90.0 | 2250 | 63.5383 | 0.83 | 0.3051 | 1.1577 | 0.83 | 0.8095 | 0.1532 | 0.0463 |
61.4257 | 91.0 | 2275 | 63.5466 | 0.84 | 0.3057 | 1.1583 | 0.8400 | 0.8244 | 0.1516 | 0.0458 |
61.4257 | 92.0 | 2300 | 63.5447 | 0.835 | 0.3049 | 1.1518 | 0.835 | 0.8188 | 0.1615 | 0.0462 |
61.4257 | 93.0 | 2325 | 63.5327 | 0.84 | 0.3044 | 1.1540 | 0.8400 | 0.8244 | 0.1508 | 0.0459 |
61.4257 | 94.0 | 2350 | 63.5392 | 0.84 | 0.3046 | 1.1506 | 0.8400 | 0.8244 | 0.1569 | 0.0459 |
61.4257 | 95.0 | 2375 | 63.5305 | 0.835 | 0.3050 | 1.1520 | 0.835 | 0.8188 | 0.1571 | 0.0457 |
61.4257 | 96.0 | 2400 | 63.5413 | 0.835 | 0.3042 | 1.1494 | 0.835 | 0.8188 | 0.1571 | 0.0461 |
61.4257 | 97.0 | 2425 | 63.5387 | 0.835 | 0.3047 | 1.1489 | 0.835 | 0.8188 | 0.1652 | 0.0461 |
61.4257 | 98.0 | 2450 | 63.5383 | 0.84 | 0.3046 | 1.1503 | 0.8400 | 0.8244 | 0.1568 | 0.0458 |
61.4257 | 99.0 | 2475 | 63.5374 | 0.835 | 0.3045 | 1.1489 | 0.835 | 0.8188 | 0.1570 | 0.0456 |
61.3919 | 100.0 | 2500 | 63.5396 | 0.84 | 0.3043 | 1.1495 | 0.8400 | 0.8244 | 0.1568 | 0.0457 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
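A quick way to check that a local environment matches these versions; the expected strings are copied from the list above (the PyTorch entry, 1.13.1.post200, looks like a conda build, so only the 1.13.1 prefix is compared).

```python
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.26.1",
    "torch": "1.13.1",      # card reports 1.13.1.post200 (build-specific suffix)
    "datasets": "2.9.0",
    "tokenizers": "0.13.2",
}

for name, module in [("transformers", transformers), ("torch", torch),
                     ("datasets", datasets), ("tokenizers", tokenizers)]:
    status = "OK" if module.__version__.startswith(expected[name]) else "MISMATCH"
    print(f"{name}: found {module.__version__}, expected {expected[name]} -> {status}")
```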