# 81-tiny_tobacco3482_hint_
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset (the model name suggests Tobacco3482). It achieves the following results on the evaluation set:
- Loss: 63.0326
- Accuracy: 0.85
- Brier Loss: 0.2647
- NLL: 1.1178
- F1 Micro: 0.85
- F1 Macro: 0.8409
- ECE: 0.1296
- AURC: 0.0380
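Brier loss and ECE above are calibration metrics rather than accuracy metrics. A minimal sketch of how they are commonly computed, in pure Python on toy inputs (the `probs`/`labels` values are illustrative, not outputs of this model; the binning scheme is one standard choice, not necessarily the one used to produce the numbers above):

```python
# Toy illustration of two calibration metrics from this card:
# - Brier score: mean squared error between predicted probability
#   vectors and one-hot labels.
# - ECE: expected calibration error, i.e. the coverage-weighted gap
#   between accuracy and mean confidence within confidence bins.

def brier_score(probs, labels, num_classes):
    """Mean squared error between probability vectors and one-hot targets."""
    total = 0.0
    for p, y in zip(probs, labels):
        total += sum((p[c] - (1.0 if c == y else 0.0)) ** 2
                     for c in range(num_classes))
    return total / len(probs)

def ece(probs, labels, num_bins=10):
    """Expected calibration error over equal-width confidence bins."""
    bins = [[] for _ in range(num_bins)]
    for p, y in zip(probs, labels):
        conf = max(p)                       # model's top confidence
        pred = p.index(conf)                # predicted class
        idx = min(int(conf * num_bins), num_bins - 1)
        bins[idx].append((1.0 if pred == y else 0.0, conf))
    err, n = 0.0, len(probs)
    for b in bins:
        if b:
            acc = sum(c for c, _ in b) / len(b)
            avg_conf = sum(cf for _, cf in b) / len(b)
            err += (len(b) / n) * abs(acc - avg_conf)
    return err

# Toy data: three 2-class predictions, the second one wrong.
probs = [[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]]
labels = [0, 1, 1]
```

A lower Brier score and lower ECE both indicate probabilities that better match observed frequencies; note in the table below how both drop sharply over the first ~30 epochs.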
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
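The warmup ratio and linear scheduler interact with the total step count: per the results table below, training ran 25 optimizer steps per epoch for 100 epochs, i.e. 2500 steps, so a warmup ratio of 0.1 implies roughly 250 warmup steps. A small sketch of the resulting schedule, assuming the usual "linear warmup then linear decay to zero" semantics (as in Transformers' `get_linear_schedule_with_warmup`):

```python
# Linear warmup + linear decay schedule implied by the hyperparameters above.
# Step counts are taken from the results table: 25 steps/epoch * 100 epochs.

BASE_LR = 1e-4          # learning_rate
TOTAL_STEPS = 2500      # 25 steps/epoch * 100 epochs
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)  # warmup_ratio 0.1 -> 250 steps

def lr_at(step):
    """LR rises linearly to BASE_LR over warmup, then decays linearly to 0."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / WARMUP_STEPS
    return BASE_LR * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))
```

So the learning rate peaks at 1e-4 around step 250 (epoch 10) and reaches zero at step 2500.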
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:---:|:--------:|:--------:|:---:|:----:|
No log | 1.0 | 25 | 66.3150 | 0.26 | 0.8706 | 4.9002 | 0.26 | 0.1920 | 0.2904 | 0.7808 |
No log | 2.0 | 50 | 65.5312 | 0.54 | 0.5948 | 2.8544 | 0.54 | 0.4535 | 0.2938 | 0.2564 |
No log | 3.0 | 75 | 65.0613 | 0.675 | 0.4403 | 1.6101 | 0.675 | 0.6134 | 0.2276 | 0.1328 |
No log | 4.0 | 100 | 64.7775 | 0.765 | 0.3800 | 1.5512 | 0.765 | 0.7486 | 0.2394 | 0.1111 |
No log | 5.0 | 125 | 64.4695 | 0.8 | 0.3065 | 1.7477 | 0.8000 | 0.7658 | 0.1814 | 0.0713 |
No log | 6.0 | 150 | 64.3179 | 0.78 | 0.3286 | 1.7786 | 0.78 | 0.7409 | 0.1555 | 0.0636 |
No log | 7.0 | 175 | 64.0707 | 0.77 | 0.3516 | 1.6165 | 0.7700 | 0.7179 | 0.1677 | 0.0927 |
No log | 8.0 | 200 | 63.8878 | 0.775 | 0.3677 | 1.5868 | 0.775 | 0.7675 | 0.1922 | 0.0867 |
No log | 9.0 | 225 | 64.2141 | 0.715 | 0.4341 | 2.1772 | 0.715 | 0.7074 | 0.2273 | 0.1130 |
No log | 10.0 | 250 | 63.8037 | 0.775 | 0.3649 | 1.4471 | 0.775 | 0.7221 | 0.1744 | 0.0766 |
No log | 11.0 | 275 | 64.2412 | 0.655 | 0.5396 | 2.0448 | 0.655 | 0.6532 | 0.2692 | 0.1670 |
No log | 12.0 | 300 | 63.8514 | 0.73 | 0.4372 | 1.9358 | 0.7300 | 0.6997 | 0.2194 | 0.1008 |
No log | 13.0 | 325 | 63.4352 | 0.815 | 0.2888 | 1.6462 | 0.815 | 0.7858 | 0.1490 | 0.0469 |
No log | 14.0 | 350 | 63.5101 | 0.795 | 0.3262 | 1.3794 | 0.795 | 0.7816 | 0.1599 | 0.0648 |
No log | 15.0 | 375 | 63.6365 | 0.785 | 0.3704 | 1.4766 | 0.785 | 0.7627 | 0.1691 | 0.0781 |
No log | 16.0 | 400 | 63.7245 | 0.73 | 0.4145 | 1.7447 | 0.7300 | 0.7261 | 0.2087 | 0.0917 |
No log | 17.0 | 425 | 63.4312 | 0.795 | 0.3148 | 1.4363 | 0.795 | 0.7868 | 0.1665 | 0.0636 |
No log | 18.0 | 450 | 63.7070 | 0.835 | 0.2915 | 1.4455 | 0.835 | 0.8078 | 0.1542 | 0.0570 |
No log | 19.0 | 475 | 63.3600 | 0.81 | 0.2997 | 1.3326 | 0.81 | 0.7878 | 0.1602 | 0.0534 |
62.5692 | 20.0 | 500 | 63.4339 | 0.81 | 0.3158 | 1.2051 | 0.81 | 0.7809 | 0.1541 | 0.0506 |
62.5692 | 21.0 | 525 | 63.3477 | 0.805 | 0.3088 | 1.2575 | 0.805 | 0.7943 | 0.1570 | 0.0489 |
62.5692 | 22.0 | 550 | 63.3362 | 0.83 | 0.2911 | 1.3409 | 0.83 | 0.8104 | 0.1440 | 0.0514 |
62.5692 | 23.0 | 575 | 63.3897 | 0.805 | 0.3004 | 1.2505 | 0.805 | 0.7833 | 0.1595 | 0.0453 |
62.5692 | 24.0 | 600 | 63.3475 | 0.8 | 0.3185 | 1.1190 | 0.8000 | 0.7750 | 0.1626 | 0.0486 |
62.5692 | 25.0 | 625 | 63.4552 | 0.805 | 0.3470 | 1.2483 | 0.805 | 0.7904 | 0.1818 | 0.0652 |
62.5692 | 26.0 | 650 | 63.4364 | 0.79 | 0.3453 | 1.1298 | 0.79 | 0.7798 | 0.1827 | 0.0651 |
62.5692 | 27.0 | 675 | 63.3001 | 0.83 | 0.2899 | 1.4329 | 0.83 | 0.8141 | 0.1370 | 0.0466 |
62.5692 | 28.0 | 700 | 63.1848 | 0.85 | 0.2514 | 1.2175 | 0.85 | 0.8354 | 0.1220 | 0.0396 |
62.5692 | 29.0 | 725 | 63.2303 | 0.835 | 0.2744 | 1.4886 | 0.835 | 0.8143 | 0.1389 | 0.0532 |
62.5692 | 30.0 | 750 | 63.2275 | 0.84 | 0.2774 | 1.0426 | 0.8400 | 0.8405 | 0.1325 | 0.0632 |
62.5692 | 31.0 | 775 | 63.1341 | 0.835 | 0.2597 | 1.1066 | 0.835 | 0.8083 | 0.1170 | 0.0510 |
62.5692 | 32.0 | 800 | 63.2045 | 0.81 | 0.2970 | 1.0432 | 0.81 | 0.8028 | 0.1561 | 0.0552 |
62.5692 | 33.0 | 825 | 63.1898 | 0.82 | 0.2952 | 1.0189 | 0.82 | 0.8060 | 0.1440 | 0.0521 |
62.5692 | 34.0 | 850 | 63.1330 | 0.835 | 0.2760 | 1.0095 | 0.835 | 0.8265 | 0.1356 | 0.0542 |
62.5692 | 35.0 | 875 | 63.1572 | 0.84 | 0.2834 | 1.0174 | 0.8400 | 0.8234 | 0.1337 | 0.0508 |
62.5692 | 36.0 | 900 | 63.1922 | 0.835 | 0.2894 | 1.0102 | 0.835 | 0.8233 | 0.1469 | 0.0500 |
62.5692 | 37.0 | 925 | 63.1305 | 0.83 | 0.2818 | 1.0146 | 0.83 | 0.8172 | 0.1387 | 0.0510 |
62.5692 | 38.0 | 950 | 63.1902 | 0.815 | 0.2865 | 1.0101 | 0.815 | 0.8016 | 0.1500 | 0.0516 |
62.5692 | 39.0 | 975 | 63.1835 | 0.825 | 0.2851 | 1.0177 | 0.825 | 0.8162 | 0.1436 | 0.0496 |
61.2333 | 40.0 | 1000 | 63.1741 | 0.84 | 0.2783 | 1.0160 | 0.8400 | 0.8266 | 0.1275 | 0.0510 |
61.2333 | 41.0 | 1025 | 63.1755 | 0.835 | 0.2756 | 1.0117 | 0.835 | 0.8192 | 0.1447 | 0.0483 |
61.2333 | 42.0 | 1050 | 63.1281 | 0.83 | 0.2820 | 1.0142 | 0.83 | 0.8169 | 0.1415 | 0.0466 |
61.2333 | 43.0 | 1075 | 63.1697 | 0.85 | 0.2675 | 0.9929 | 0.85 | 0.8358 | 0.1423 | 0.0484 |
61.2333 | 44.0 | 1100 | 63.1141 | 0.835 | 0.2767 | 1.0005 | 0.835 | 0.8237 | 0.1293 | 0.0481 |
61.2333 | 45.0 | 1125 | 63.1441 | 0.85 | 0.2638 | 1.0023 | 0.85 | 0.8383 | 0.1335 | 0.0471 |
61.2333 | 46.0 | 1150 | 63.1221 | 0.84 | 0.2745 | 0.9981 | 0.8400 | 0.8308 | 0.1271 | 0.0451 |
61.2333 | 47.0 | 1175 | 63.1140 | 0.845 | 0.2654 | 0.9891 | 0.845 | 0.8317 | 0.1351 | 0.0458 |
61.2333 | 48.0 | 1200 | 63.1056 | 0.845 | 0.2654 | 1.0016 | 0.845 | 0.8351 | 0.1364 | 0.0458 |
61.2333 | 49.0 | 1225 | 63.0906 | 0.83 | 0.2713 | 1.0042 | 0.83 | 0.8221 | 0.1455 | 0.0449 |
61.2333 | 50.0 | 1250 | 63.0942 | 0.835 | 0.2633 | 1.0003 | 0.835 | 0.8314 | 0.1397 | 0.0452 |
61.2333 | 51.0 | 1275 | 63.0929 | 0.84 | 0.2641 | 0.9957 | 0.8400 | 0.8359 | 0.1340 | 0.0440 |
61.2333 | 52.0 | 1300 | 63.0913 | 0.83 | 0.2646 | 1.0040 | 0.83 | 0.8242 | 0.1422 | 0.0440 |
61.2333 | 53.0 | 1325 | 63.1152 | 0.83 | 0.2754 | 0.9985 | 0.83 | 0.8250 | 0.1416 | 0.0447 |
61.2333 | 54.0 | 1350 | 63.0923 | 0.835 | 0.2649 | 0.9997 | 0.835 | 0.8278 | 0.1356 | 0.0426 |
61.2333 | 55.0 | 1375 | 63.0720 | 0.83 | 0.2686 | 0.9988 | 0.83 | 0.8243 | 0.1396 | 0.0431 |
61.2333 | 56.0 | 1400 | 63.0627 | 0.83 | 0.2636 | 1.0713 | 0.83 | 0.8243 | 0.1369 | 0.0427 |
61.2333 | 57.0 | 1425 | 63.0742 | 0.835 | 0.2692 | 1.0572 | 0.835 | 0.8305 | 0.1391 | 0.0425 |
61.2333 | 58.0 | 1450 | 63.0910 | 0.84 | 0.2639 | 1.0727 | 0.8400 | 0.8334 | 0.1320 | 0.0432 |
61.2333 | 59.0 | 1475 | 63.1015 | 0.84 | 0.2648 | 1.1382 | 0.8400 | 0.8354 | 0.1331 | 0.0423 |
61.0482 | 60.0 | 1500 | 63.0557 | 0.835 | 0.2655 | 1.0688 | 0.835 | 0.8293 | 0.1333 | 0.0420 |
61.0482 | 61.0 | 1525 | 63.0590 | 0.835 | 0.2655 | 1.1378 | 0.835 | 0.8315 | 0.1425 | 0.0416 |
61.0482 | 62.0 | 1550 | 63.0732 | 0.845 | 0.2661 | 1.0565 | 0.845 | 0.8381 | 0.1404 | 0.0413 |
61.0482 | 63.0 | 1575 | 63.0972 | 0.855 | 0.2659 | 1.1274 | 0.855 | 0.8501 | 0.1424 | 0.0416 |
61.0482 | 64.0 | 1600 | 63.0528 | 0.84 | 0.2694 | 1.1315 | 0.8400 | 0.8330 | 0.1355 | 0.0418 |
61.0482 | 65.0 | 1625 | 63.0625 | 0.835 | 0.2683 | 1.1336 | 0.835 | 0.8281 | 0.1373 | 0.0411 |
61.0482 | 66.0 | 1650 | 63.0512 | 0.835 | 0.2747 | 1.1250 | 0.835 | 0.8242 | 0.1436 | 0.0410 |
61.0482 | 67.0 | 1675 | 63.0634 | 0.85 | 0.2671 | 1.1270 | 0.85 | 0.8397 | 0.1376 | 0.0419 |
61.0482 | 68.0 | 1700 | 63.0609 | 0.835 | 0.2717 | 1.1311 | 0.835 | 0.8295 | 0.1365 | 0.0411 |
61.0482 | 69.0 | 1725 | 63.0513 | 0.835 | 0.2707 | 1.1261 | 0.835 | 0.8223 | 0.1461 | 0.0412 |
61.0482 | 70.0 | 1750 | 63.0510 | 0.845 | 0.2712 | 1.1219 | 0.845 | 0.8396 | 0.1369 | 0.0411 |
61.0482 | 71.0 | 1775 | 63.0530 | 0.845 | 0.2688 | 1.1244 | 0.845 | 0.8364 | 0.1403 | 0.0412 |
61.0482 | 72.0 | 1800 | 63.0456 | 0.84 | 0.2665 | 1.1204 | 0.8400 | 0.8293 | 0.1341 | 0.0400 |
61.0482 | 73.0 | 1825 | 63.0459 | 0.845 | 0.2680 | 1.1244 | 0.845 | 0.8360 | 0.1430 | 0.0398 |
61.0482 | 74.0 | 1850 | 63.0773 | 0.85 | 0.2684 | 1.1291 | 0.85 | 0.8440 | 0.1386 | 0.0410 |
61.0482 | 75.0 | 1875 | 63.0497 | 0.85 | 0.2664 | 1.1285 | 0.85 | 0.8388 | 0.1293 | 0.0392 |
61.0482 | 76.0 | 1900 | 63.0483 | 0.845 | 0.2695 | 1.1256 | 0.845 | 0.8352 | 0.1440 | 0.0409 |
61.0482 | 77.0 | 1925 | 63.0352 | 0.845 | 0.2680 | 1.1229 | 0.845 | 0.8352 | 0.1420 | 0.0398 |
61.0482 | 78.0 | 1950 | 63.0291 | 0.845 | 0.2701 | 1.1225 | 0.845 | 0.8352 | 0.1342 | 0.0394 |
61.0482 | 79.0 | 1975 | 63.0508 | 0.85 | 0.2695 | 1.1224 | 0.85 | 0.8388 | 0.1418 | 0.0399 |
60.9704 | 80.0 | 2000 | 63.0510 | 0.85 | 0.2708 | 1.1169 | 0.85 | 0.8388 | 0.1376 | 0.0394 |
60.9704 | 81.0 | 2025 | 63.0460 | 0.85 | 0.2648 | 1.1205 | 0.85 | 0.8440 | 0.1357 | 0.0397 |
60.9704 | 82.0 | 2050 | 63.0505 | 0.845 | 0.2697 | 1.1148 | 0.845 | 0.8352 | 0.1464 | 0.0392 |
60.9704 | 83.0 | 2075 | 63.0425 | 0.845 | 0.2651 | 1.1229 | 0.845 | 0.8352 | 0.1396 | 0.0389 |
60.9704 | 84.0 | 2100 | 63.0398 | 0.845 | 0.2664 | 1.1197 | 0.845 | 0.8337 | 0.1330 | 0.0388 |
60.9704 | 85.0 | 2125 | 63.0355 | 0.845 | 0.2667 | 1.1192 | 0.845 | 0.8360 | 0.1307 | 0.0387 |
60.9704 | 86.0 | 2150 | 63.0386 | 0.85 | 0.2649 | 1.1223 | 0.85 | 0.8409 | 0.1279 | 0.0379 |
60.9704 | 87.0 | 2175 | 63.0405 | 0.85 | 0.2642 | 1.1218 | 0.85 | 0.8409 | 0.1437 | 0.0378 |
60.9704 | 88.0 | 2200 | 63.0363 | 0.85 | 0.2667 | 1.1165 | 0.85 | 0.8388 | 0.1320 | 0.0390 |
60.9704 | 89.0 | 2225 | 63.0456 | 0.845 | 0.2644 | 1.1180 | 0.845 | 0.8352 | 0.1354 | 0.0381 |
60.9704 | 90.0 | 2250 | 63.0343 | 0.845 | 0.2656 | 1.1159 | 0.845 | 0.8337 | 0.1390 | 0.0385 |
60.9704 | 91.0 | 2275 | 63.0391 | 0.85 | 0.2654 | 1.1194 | 0.85 | 0.8409 | 0.1389 | 0.0380 |
60.9704 | 92.0 | 2300 | 63.0354 | 0.85 | 0.2665 | 1.1203 | 0.85 | 0.8409 | 0.1419 | 0.0377 |
60.9704 | 93.0 | 2325 | 63.0272 | 0.845 | 0.2650 | 1.1166 | 0.845 | 0.8381 | 0.1370 | 0.0381 |
60.9704 | 94.0 | 2350 | 63.0313 | 0.85 | 0.2647 | 1.1181 | 0.85 | 0.8409 | 0.1322 | 0.0380 |
60.9704 | 95.0 | 2375 | 63.0220 | 0.845 | 0.2658 | 1.1197 | 0.845 | 0.8357 | 0.1311 | 0.0378 |
60.9704 | 96.0 | 2400 | 63.0345 | 0.85 | 0.2639 | 1.1179 | 0.85 | 0.8409 | 0.1284 | 0.0381 |
60.9704 | 97.0 | 2425 | 63.0330 | 0.845 | 0.2651 | 1.1163 | 0.845 | 0.8352 | 0.1391 | 0.0382 |
60.9704 | 98.0 | 2450 | 63.0302 | 0.85 | 0.2646 | 1.1182 | 0.85 | 0.8409 | 0.1311 | 0.0380 |
60.9704 | 99.0 | 2475 | 63.0287 | 0.85 | 0.2646 | 1.1175 | 0.85 | 0.8409 | 0.1250 | 0.0380 |
60.9392 | 100.0 | 2500 | 63.0326 | 0.85 | 0.2647 | 1.1178 | 0.85 | 0.8409 | 0.1296 | 0.0380 |
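The AURC column is the area under the risk–coverage curve: predictions are sorted by confidence, and the error rate ("risk") is averaged as coverage grows from the most confident prediction to all of them; lower is better. A hypothetical pure-Python sketch of one common formulation (toy inputs, not this model's outputs):

```python
# Area under the risk-coverage curve (AURC), as reported in the table above.
# Sort predictions by confidence (highest first), then average the running
# error rate as coverage increases one sample at a time.

def aurc(confidences, correct):
    """Mean selective risk over all coverage levels; lower is better."""
    order = sorted(range(len(confidences)), key=lambda i: -confidences[i])
    errors, risks = 0, []
    for k, i in enumerate(order, start=1):
        errors += 0 if correct[i] else 1
        risks.append(errors / k)    # risk at coverage k / n
    return sum(risks) / len(risks)

# Toy data: four predictions, the third-most-confident one is wrong.
conf = [0.95, 0.80, 0.60, 0.55]
ok = [True, True, False, True]
```

A well-calibrated model concentrates its errors among its least confident predictions, which keeps early (high-confidence) risks near zero and so keeps AURC low.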
### Framework versions
- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2