# vit-small_tobacco3482_hint_
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the Tobacco3482 dataset. It achieves the following results on the evaluation set:
- Loss: 56.9670
- Accuracy: 0.835
- Brier Loss: 0.2969
- NLL: 1.1900
- F1 Micro: 0.835
- F1 Macro: 0.8377
- ECE: 0.1545
- AURC: 0.0499
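For readers unfamiliar with the calibration metrics above, the following is a minimal pure-Python sketch of how multiclass Brier score and Expected Calibration Error (ECE) are commonly defined. These are standard textbook definitions, assumed here for illustration; the exact binning and conventions used by this card's evaluation code are not documented.

```python
def brier_loss(probs, labels):
    """Multiclass Brier score: mean squared distance between the
    predicted probability vector and the one-hot true label."""
    total = 0.0
    for p, y in zip(probs, labels):
        total += sum((pj - (1.0 if j == y else 0.0)) ** 2
                     for j, pj in enumerate(p))
    return total / len(labels)

def ece(probs, labels, n_bins=10):
    """Expected Calibration Error with equal-width confidence bins:
    weighted average of |accuracy - mean confidence| over the bins."""
    conf = [max(p) for p in probs]
    pred = [p.index(max(p)) for p in probs]
    n = len(labels)
    err = 0.0
    for b in range(n_bins):
        lo, hi = b / n_bins, (b + 1) / n_bins
        idx = [i for i in range(n) if lo < conf[i] <= hi]
        if idx:
            acc = sum(pred[i] == labels[i] for i in idx) / len(idx)
            avg_conf = sum(conf[i] for i in idx) / len(idx)
            err += len(idx) / n * abs(acc - avg_conf)
    return err
```

A perfectly calibrated, perfectly accurate model scores 0 on both metrics; the 0.2969 Brier loss and 0.1545 ECE above indicate the model is moderately overconfident.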
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
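The linear schedule with 10% warmup can be sketched as follows. This is a simplified approximation of the behavior of `get_linear_schedule_with_warmup` in Transformers, not the Trainer's exact implementation; the 2,500 total steps correspond to the 100 epochs × 25 steps per epoch visible in the results table.

```python
def linear_warmup_lr(step, total_steps=2500, base_lr=1e-4, warmup_ratio=0.1):
    """LR ramps linearly from 0 to base_lr over the warmup steps,
    then decays linearly back to 0 by the final step."""
    warmup_steps = int(total_steps * warmup_ratio)  # 250 steps here
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - warmup_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, remaining))
```

So the learning rate peaks at 1e-4 around step 250 (epoch 10) and reaches 0 at step 2,500, which is consistent with the loss plateauing in the later epochs of the table.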
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:---:|:--------:|:--------:|:---:|:----:|
No log | 1.0 | 25 | 59.9015 | 0.37 | 0.7765 | 4.3056 | 0.37 | 0.2185 | 0.2975 | 0.4588 |
No log | 2.0 | 50 | 58.9173 | 0.66 | 0.4866 | 2.0758 | 0.66 | 0.5717 | 0.2732 | 0.1578 |
No log | 3.0 | 75 | 58.3604 | 0.745 | 0.3466 | 1.5077 | 0.745 | 0.7135 | 0.1846 | 0.0854 |
No log | 4.0 | 100 | 58.0585 | 0.75 | 0.3628 | 1.5044 | 0.75 | 0.7674 | 0.2058 | 0.1052 |
No log | 5.0 | 125 | 57.8363 | 0.76 | 0.3782 | 1.7066 | 0.76 | 0.7657 | 0.2174 | 0.1039 |
No log | 6.0 | 150 | 57.4894 | 0.75 | 0.3593 | 1.5137 | 0.75 | 0.7377 | 0.1724 | 0.0800 |
No log | 7.0 | 175 | 57.5188 | 0.76 | 0.3631 | 1.9770 | 0.76 | 0.7514 | 0.1968 | 0.0874 |
No log | 8.0 | 200 | 57.4349 | 0.74 | 0.3947 | 1.8766 | 0.74 | 0.7412 | 0.1753 | 0.0777 |
No log | 9.0 | 225 | 57.1764 | 0.765 | 0.3481 | 1.1532 | 0.765 | 0.7411 | 0.1956 | 0.0829 |
No log | 10.0 | 250 | 57.6192 | 0.765 | 0.3943 | 1.8998 | 0.765 | 0.7755 | 0.1850 | 0.0981 |
No log | 11.0 | 275 | 57.2121 | 0.77 | 0.3531 | 1.2685 | 0.7700 | 0.7643 | 0.1739 | 0.0689 |
No log | 12.0 | 300 | 57.2250 | 0.795 | 0.3279 | 1.7553 | 0.795 | 0.7816 | 0.1596 | 0.0660 |
No log | 13.0 | 325 | 57.4911 | 0.785 | 0.3678 | 2.0499 | 0.785 | 0.7857 | 0.1788 | 0.0945 |
No log | 14.0 | 350 | 57.1481 | 0.77 | 0.3542 | 1.4834 | 0.7700 | 0.7649 | 0.1892 | 0.0636 |
No log | 15.0 | 375 | 57.1701 | 0.825 | 0.3041 | 1.6075 | 0.825 | 0.8223 | 0.1609 | 0.0621 |
No log | 16.0 | 400 | 57.4059 | 0.805 | 0.3343 | 1.7348 | 0.805 | 0.8080 | 0.1654 | 0.0822 |
No log | 17.0 | 425 | 57.9813 | 0.72 | 0.4616 | 2.6345 | 0.72 | 0.7252 | 0.2263 | 0.1101 |
No log | 18.0 | 450 | 57.2677 | 0.825 | 0.2953 | 1.5836 | 0.825 | 0.8171 | 0.1590 | 0.0572 |
No log | 19.0 | 475 | 57.6052 | 0.765 | 0.4023 | 1.7463 | 0.765 | 0.7333 | 0.2052 | 0.0822 |
57.2084 | 20.0 | 500 | 57.4249 | 0.79 | 0.3653 | 1.5564 | 0.79 | 0.7941 | 0.1818 | 0.0845 |
57.2084 | 21.0 | 525 | 57.2631 | 0.845 | 0.2704 | 1.7326 | 0.845 | 0.8312 | 0.1358 | 0.0628 |
57.2084 | 22.0 | 550 | 57.1520 | 0.845 | 0.2723 | 1.2743 | 0.845 | 0.8386 | 0.1402 | 0.0551 |
57.2084 | 23.0 | 575 | 57.2977 | 0.82 | 0.3137 | 1.3068 | 0.82 | 0.8029 | 0.1578 | 0.0621 |
57.2084 | 24.0 | 600 | 57.2030 | 0.81 | 0.3107 | 1.5814 | 0.81 | 0.7870 | 0.1594 | 0.0688 |
57.2084 | 25.0 | 625 | 57.1500 | 0.82 | 0.3027 | 1.4128 | 0.82 | 0.8229 | 0.1584 | 0.0436 |
57.2084 | 26.0 | 650 | 57.1619 | 0.855 | 0.2735 | 1.5164 | 0.855 | 0.8558 | 0.1404 | 0.0530 |
57.2084 | 27.0 | 675 | 57.1504 | 0.845 | 0.2832 | 1.5742 | 0.845 | 0.8507 | 0.1500 | 0.0516 |
57.2084 | 28.0 | 700 | 57.1829 | 0.835 | 0.2932 | 1.4010 | 0.835 | 0.8410 | 0.1489 | 0.0496 |
57.2084 | 29.0 | 725 | 57.1899 | 0.83 | 0.2953 | 1.4038 | 0.83 | 0.8338 | 0.1497 | 0.0511 |
57.2084 | 30.0 | 750 | 57.1644 | 0.835 | 0.2948 | 1.3923 | 0.835 | 0.8374 | 0.1509 | 0.0507 |
57.2084 | 31.0 | 775 | 57.1720 | 0.83 | 0.2958 | 1.4622 | 0.83 | 0.8296 | 0.1502 | 0.0509 |
57.2084 | 32.0 | 800 | 57.1365 | 0.835 | 0.3024 | 1.2976 | 0.835 | 0.8374 | 0.1575 | 0.0509 |
57.2084 | 33.0 | 825 | 57.1499 | 0.835 | 0.2995 | 1.3654 | 0.835 | 0.8308 | 0.1574 | 0.0523 |
57.2084 | 34.0 | 850 | 57.1064 | 0.83 | 0.3022 | 1.3606 | 0.83 | 0.8251 | 0.1578 | 0.0526 |
57.2084 | 35.0 | 875 | 57.0901 | 0.835 | 0.3003 | 1.2803 | 0.835 | 0.8336 | 0.1554 | 0.0516 |
57.2084 | 36.0 | 900 | 57.0922 | 0.835 | 0.3047 | 1.2749 | 0.835 | 0.8336 | 0.1571 | 0.0517 |
57.2084 | 37.0 | 925 | 57.0673 | 0.83 | 0.3034 | 1.2533 | 0.83 | 0.8344 | 0.1559 | 0.0509 |
57.2084 | 38.0 | 950 | 57.0810 | 0.83 | 0.3024 | 1.2718 | 0.83 | 0.8344 | 0.1620 | 0.0526 |
57.2084 | 39.0 | 975 | 57.1040 | 0.835 | 0.3041 | 1.2522 | 0.835 | 0.8392 | 0.1571 | 0.0506 |
56.1387 | 40.0 | 1000 | 57.0542 | 0.835 | 0.3024 | 1.3210 | 0.835 | 0.8392 | 0.1525 | 0.0501 |
56.1387 | 41.0 | 1025 | 57.0554 | 0.83 | 0.3037 | 1.3231 | 0.83 | 0.8344 | 0.1534 | 0.0508 |
56.1387 | 42.0 | 1050 | 57.0724 | 0.83 | 0.2989 | 1.2517 | 0.83 | 0.8344 | 0.1485 | 0.0495 |
56.1387 | 43.0 | 1075 | 57.0429 | 0.835 | 0.3010 | 1.3082 | 0.835 | 0.8401 | 0.1557 | 0.0506 |
56.1387 | 44.0 | 1100 | 57.0208 | 0.835 | 0.3001 | 1.2428 | 0.835 | 0.8392 | 0.1583 | 0.0496 |
56.1387 | 45.0 | 1125 | 57.0700 | 0.835 | 0.2996 | 1.3149 | 0.835 | 0.8454 | 0.1601 | 0.0509 |
56.1387 | 46.0 | 1150 | 57.0054 | 0.835 | 0.2950 | 1.3019 | 0.835 | 0.8407 | 0.1476 | 0.0492 |
56.1387 | 47.0 | 1175 | 57.0516 | 0.825 | 0.3000 | 1.2344 | 0.825 | 0.8317 | 0.1485 | 0.0511 |
56.1387 | 48.0 | 1200 | 57.0373 | 0.835 | 0.3008 | 1.3016 | 0.835 | 0.8434 | 0.1611 | 0.0498 |
56.1387 | 49.0 | 1225 | 57.0154 | 0.83 | 0.2982 | 1.2376 | 0.83 | 0.8329 | 0.1515 | 0.0501 |
56.1387 | 50.0 | 1250 | 57.0000 | 0.835 | 0.2982 | 1.2196 | 0.835 | 0.8434 | 0.1535 | 0.0493 |
56.1387 | 51.0 | 1275 | 57.0054 | 0.825 | 0.2987 | 1.2217 | 0.825 | 0.8352 | 0.1517 | 0.0505 |
56.1387 | 52.0 | 1300 | 57.0347 | 0.835 | 0.2996 | 1.2239 | 0.835 | 0.8407 | 0.1643 | 0.0486 |
56.1387 | 53.0 | 1325 | 57.0183 | 0.835 | 0.2989 | 1.2208 | 0.835 | 0.8411 | 0.1604 | 0.0495 |
56.1387 | 54.0 | 1350 | 57.0094 | 0.845 | 0.2925 | 1.1545 | 0.845 | 0.8494 | 0.1515 | 0.0486 |
56.1387 | 55.0 | 1375 | 57.0027 | 0.83 | 0.2974 | 1.2161 | 0.83 | 0.8380 | 0.1538 | 0.0491 |
56.1387 | 56.0 | 1400 | 57.0060 | 0.835 | 0.2975 | 1.2215 | 0.835 | 0.8407 | 0.1546 | 0.0505 |
56.1387 | 57.0 | 1425 | 56.9898 | 0.835 | 0.2959 | 1.1432 | 0.835 | 0.8411 | 0.1483 | 0.0501 |
56.1387 | 58.0 | 1450 | 56.9907 | 0.835 | 0.2963 | 1.1437 | 0.835 | 0.8406 | 0.1527 | 0.0485 |
56.1387 | 59.0 | 1475 | 56.9578 | 0.84 | 0.2935 | 1.1583 | 0.8400 | 0.8439 | 0.1513 | 0.0488 |
55.9877 | 60.0 | 1500 | 57.0032 | 0.84 | 0.2957 | 1.2160 | 0.8400 | 0.8439 | 0.1460 | 0.0502 |
55.9877 | 61.0 | 1525 | 56.9880 | 0.835 | 0.2990 | 1.2836 | 0.835 | 0.8406 | 0.1475 | 0.0489 |
55.9877 | 62.0 | 1550 | 56.9920 | 0.83 | 0.2973 | 1.2071 | 0.83 | 0.8349 | 0.1519 | 0.0494 |
55.9877 | 63.0 | 1575 | 56.9681 | 0.835 | 0.2978 | 1.2076 | 0.835 | 0.8406 | 0.1465 | 0.0483 |
55.9877 | 64.0 | 1600 | 56.9772 | 0.835 | 0.3003 | 1.1997 | 0.835 | 0.8406 | 0.1567 | 0.0489 |
55.9877 | 65.0 | 1625 | 56.9705 | 0.835 | 0.2973 | 1.2038 | 0.835 | 0.8406 | 0.1520 | 0.0495 |
55.9877 | 66.0 | 1650 | 56.9682 | 0.835 | 0.2977 | 1.2005 | 0.835 | 0.8406 | 0.1576 | 0.0488 |
55.9877 | 67.0 | 1675 | 56.9775 | 0.835 | 0.2981 | 1.2093 | 0.835 | 0.8406 | 0.1497 | 0.0501 |
55.9877 | 68.0 | 1700 | 56.9762 | 0.835 | 0.2989 | 1.2061 | 0.835 | 0.8406 | 0.1626 | 0.0491 |
55.9877 | 69.0 | 1725 | 56.9807 | 0.84 | 0.2978 | 1.2023 | 0.8400 | 0.8434 | 0.1503 | 0.0481 |
55.9877 | 70.0 | 1750 | 56.9705 | 0.835 | 0.2988 | 1.1987 | 0.835 | 0.8406 | 0.1564 | 0.0487 |
55.9877 | 71.0 | 1775 | 56.9752 | 0.83 | 0.2987 | 1.2027 | 0.83 | 0.8349 | 0.1593 | 0.0497 |
55.9877 | 72.0 | 1800 | 56.9957 | 0.83 | 0.2996 | 1.2060 | 0.83 | 0.8349 | 0.1607 | 0.0496 |
55.9877 | 73.0 | 1825 | 56.9697 | 0.84 | 0.2966 | 1.1977 | 0.8400 | 0.8434 | 0.1510 | 0.0487 |
55.9877 | 74.0 | 1850 | 56.9644 | 0.83 | 0.2997 | 1.2055 | 0.83 | 0.8349 | 0.1528 | 0.0506 |
55.9877 | 75.0 | 1875 | 56.9677 | 0.84 | 0.2968 | 1.1969 | 0.8400 | 0.8434 | 0.1536 | 0.0495 |
55.9877 | 76.0 | 1900 | 56.9609 | 0.84 | 0.2958 | 1.1921 | 0.8400 | 0.8434 | 0.1531 | 0.0495 |
55.9877 | 77.0 | 1925 | 56.9663 | 0.835 | 0.2965 | 1.1950 | 0.835 | 0.8406 | 0.1576 | 0.0494 |
55.9877 | 78.0 | 1950 | 56.9796 | 0.83 | 0.2968 | 1.2049 | 0.83 | 0.8349 | 0.1525 | 0.0496 |
55.9877 | 79.0 | 1975 | 56.9648 | 0.835 | 0.2966 | 1.1944 | 0.835 | 0.8406 | 0.1545 | 0.0494 |
55.9237 | 80.0 | 2000 | 56.9596 | 0.845 | 0.2944 | 1.1912 | 0.845 | 0.8480 | 0.1543 | 0.0492 |
55.9237 | 81.0 | 2025 | 56.9596 | 0.84 | 0.2951 | 1.1878 | 0.8400 | 0.8434 | 0.1546 | 0.0492 |
55.9237 | 82.0 | 2050 | 56.9737 | 0.84 | 0.2958 | 1.1954 | 0.8400 | 0.8434 | 0.1521 | 0.0498 |
55.9237 | 83.0 | 2075 | 56.9725 | 0.835 | 0.2974 | 1.1963 | 0.835 | 0.8377 | 0.1512 | 0.0500 |
55.9237 | 84.0 | 2100 | 56.9743 | 0.835 | 0.2978 | 1.1928 | 0.835 | 0.8406 | 0.1554 | 0.0500 |
55.9237 | 85.0 | 2125 | 56.9788 | 0.835 | 0.2971 | 1.1952 | 0.835 | 0.8377 | 0.1493 | 0.0500 |
55.9237 | 86.0 | 2150 | 56.9705 | 0.84 | 0.2968 | 1.1933 | 0.8400 | 0.8434 | 0.1541 | 0.0499 |
55.9237 | 87.0 | 2175 | 56.9684 | 0.835 | 0.2966 | 1.1926 | 0.835 | 0.8377 | 0.1517 | 0.0497 |
55.9237 | 88.0 | 2200 | 56.9725 | 0.835 | 0.2979 | 1.1934 | 0.835 | 0.8377 | 0.1548 | 0.0497 |
55.9237 | 89.0 | 2225 | 56.9704 | 0.84 | 0.2959 | 1.1934 | 0.8400 | 0.8434 | 0.1527 | 0.0495 |
55.9237 | 90.0 | 2250 | 56.9681 | 0.84 | 0.2950 | 1.1907 | 0.8400 | 0.8434 | 0.1503 | 0.0498 |
55.9237 | 91.0 | 2275 | 56.9763 | 0.835 | 0.2979 | 1.1934 | 0.835 | 0.8377 | 0.1516 | 0.0501 |
55.9237 | 92.0 | 2300 | 56.9649 | 0.835 | 0.2959 | 1.1889 | 0.835 | 0.8377 | 0.1501 | 0.0495 |
55.9237 | 93.0 | 2325 | 56.9687 | 0.835 | 0.2959 | 1.1871 | 0.835 | 0.8377 | 0.1519 | 0.0501 |
55.9237 | 94.0 | 2350 | 56.9663 | 0.835 | 0.2963 | 1.1901 | 0.835 | 0.8377 | 0.1533 | 0.0496 |
55.9237 | 95.0 | 2375 | 56.9674 | 0.84 | 0.2955 | 1.1895 | 0.8400 | 0.8434 | 0.1534 | 0.0498 |
55.9237 | 96.0 | 2400 | 56.9661 | 0.835 | 0.2966 | 1.1907 | 0.835 | 0.8377 | 0.1520 | 0.0496 |
55.9237 | 97.0 | 2425 | 56.9623 | 0.84 | 0.2958 | 1.1871 | 0.8400 | 0.8434 | 0.1532 | 0.0499 |
55.9237 | 98.0 | 2450 | 56.9694 | 0.835 | 0.2969 | 1.1897 | 0.835 | 0.8377 | 0.1543 | 0.0499 |
55.9237 | 99.0 | 2475 | 56.9698 | 0.835 | 0.2967 | 1.1906 | 0.835 | 0.8377 | 0.1543 | 0.0499 |
55.8955 | 100.0 | 2500 | 56.9670 | 0.835 | 0.2969 | 1.1900 | 0.835 | 0.8377 | 0.1545 | 0.0499 |
### Framework versions
- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2