---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# 225-tiny_tobacco3482_kd_NKD_t1.0_g1.5

This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset (the Trainer recorded it as `None`; the model name suggests Tobacco3482). It achieves the following results on the evaluation set (values from the final epoch in the training results table below):

- Loss: 4.3111
- Accuracy: 0.82
- Brier Loss: 0.2977
- Nll: 1.6959
- F1 Micro: 0.82
- F1 Macro: 0.8150
- Ece: 0.1454
- Aurc: 0.0488

## Model description

More information needed

## Intended uses & limitations

More information needed
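
Pending more details, a minimal inference sketch with 🤗 Transformers might look like the following. The repository id is a placeholder (the hosting namespace is not stated in this card), and `document.png` stands for any input image:

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id: replace "<namespace>" with the account that hosts this checkpoint.
repo_id = "<namespace>/225-tiny_tobacco3482_kd_NKD_t1.0_g1.5"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("document.png").convert("RGB")  # e.g. a scanned document page
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```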

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll    | F1 Micro | F1 Macro | Ece    | Aurc   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 1.0   | 13   | 5.2368          | 0.225    | 0.8876     | 8.2751 | 0.225    | 0.1306   | 0.3140 | 0.7919 |
| No log        | 2.0   | 26   | 4.6617          | 0.385    | 0.7736     | 4.0165 | 0.3850   | 0.3071   | 0.3295 | 0.4033 |
| No log        | 3.0   | 39   | 4.4343          | 0.525    | 0.6609     | 3.4855 | 0.525    | 0.4017   | 0.3068 | 0.2761 |
| No log        | 4.0   | 52   | 4.2677          | 0.59     | 0.5775     | 2.7458 | 0.59     | 0.4879   | 0.3037 | 0.1850 |
| No log        | 5.0   | 65   | 4.1495          | 0.67     | 0.5044     | 2.2848 | 0.67     | 0.6081   | 0.3100 | 0.1336 |
| No log        | 6.0   | 78   | 4.1699          | 0.71     | 0.4412     | 2.9360 | 0.7100   | 0.6211   | 0.2407 | 0.1076 |
| No log        | 7.0   | 91   | 4.0527          | 0.725    | 0.4198     | 2.1169 | 0.7250   | 0.6606   | 0.2359 | 0.0993 |
| No log        | 8.0   | 104  | 4.0491          | 0.715    | 0.4001     | 2.1794 | 0.715    | 0.6343   | 0.1955 | 0.1013 |
| No log        | 9.0   | 117  | 4.2070          | 0.715    | 0.4096     | 2.1137 | 0.715    | 0.6363   | 0.1968 | 0.1104 |
| No log        | 10.0  | 130  | 4.2307          | 0.715    | 0.4030     | 2.4228 | 0.715    | 0.6467   | 0.1977 | 0.1054 |
| No log        | 11.0  | 143  | 4.0841          | 0.73     | 0.3673     | 2.2764 | 0.7300   | 0.6697   | 0.1840 | 0.0781 |
| No log        | 12.0  | 156  | 3.9980          | 0.74     | 0.3569     | 1.7264 | 0.74     | 0.6752   | 0.1822 | 0.0779 |
| No log        | 13.0  | 169  | 4.0921          | 0.735    | 0.3704     | 1.8601 | 0.735    | 0.6818   | 0.1835 | 0.0888 |
| No log        | 14.0  | 182  | 3.9026          | 0.755    | 0.3362     | 1.6596 | 0.755    | 0.7128   | 0.1684 | 0.0757 |
| No log        | 15.0  | 195  | 4.0542          | 0.765    | 0.3472     | 2.0096 | 0.765    | 0.7051   | 0.1789 | 0.0783 |
| No log        | 16.0  | 208  | 4.0180          | 0.75     | 0.3634     | 1.6543 | 0.75     | 0.7364   | 0.1958 | 0.0890 |
| No log        | 17.0  | 221  | 3.9665          | 0.8      | 0.3330     | 1.4940 | 0.8000   | 0.7935   | 0.1919 | 0.0793 |
| No log        | 18.0  | 234  | 3.9523          | 0.785    | 0.3225     | 1.6353 | 0.785    | 0.7825   | 0.1598 | 0.0719 |
| No log        | 19.0  | 247  | 3.9298          | 0.79     | 0.3262     | 1.8606 | 0.79     | 0.7757   | 0.1785 | 0.0749 |
| No log        | 20.0  | 260  | 3.9484          | 0.8      | 0.3106     | 1.6615 | 0.8000   | 0.8034   | 0.1692 | 0.0763 |
| No log        | 21.0  | 273  | 3.9056          | 0.785    | 0.2930     | 1.6180 | 0.785    | 0.7499   | 0.1542 | 0.0609 |
| No log        | 22.0  | 286  | 3.8094          | 0.82     | 0.2765     | 1.3116 | 0.82     | 0.8028   | 0.1784 | 0.0532 |
| No log        | 23.0  | 299  | 3.8352          | 0.81     | 0.2939     | 1.5765 | 0.81     | 0.7971   | 0.1592 | 0.0559 |
| No log        | 24.0  | 312  | 3.9996          | 0.79     | 0.3192     | 1.6863 | 0.79     | 0.7914   | 0.1678 | 0.0742 |
| No log        | 25.0  | 325  | 3.8680          | 0.805    | 0.2932     | 1.4217 | 0.805    | 0.8052   | 0.1505 | 0.0578 |
| No log        | 26.0  | 338  | 3.8913          | 0.8      | 0.3025     | 1.6254 | 0.8000   | 0.7971   | 0.1370 | 0.0607 |
| No log        | 27.0  | 351  | 3.8603          | 0.815    | 0.2893     | 1.6578 | 0.815    | 0.8094   | 0.1659 | 0.0570 |
| No log        | 28.0  | 364  | 3.9414          | 0.795    | 0.2990     | 1.9161 | 0.795    | 0.7900   | 0.1504 | 0.0593 |
| No log        | 29.0  | 377  | 3.8802          | 0.815    | 0.2836     | 1.7091 | 0.815    | 0.7943   | 0.1395 | 0.0565 |
| No log        | 30.0  | 390  | 3.9025          | 0.8      | 0.2957     | 1.7376 | 0.8000   | 0.7894   | 0.1373 | 0.0594 |
| No log        | 31.0  | 403  | 3.8744          | 0.835    | 0.2785     | 1.5096 | 0.835    | 0.8185   | 0.1405 | 0.0550 |
| No log        | 32.0  | 416  | 3.8670          | 0.8      | 0.2813     | 1.5817 | 0.8000   | 0.7825   | 0.1279 | 0.0500 |
| No log        | 33.0  | 429  | 3.9197          | 0.8      | 0.2852     | 1.5082 | 0.8000   | 0.7802   | 0.1488 | 0.0540 |
| No log        | 34.0  | 442  | 3.9589          | 0.795    | 0.3005     | 1.9897 | 0.795    | 0.7872   | 0.1487 | 0.0563 |
| No log        | 35.0  | 455  | 3.9669          | 0.82     | 0.2863     | 1.7012 | 0.82     | 0.8161   | 0.1483 | 0.0551 |
| No log        | 36.0  | 468  | 3.8924          | 0.81     | 0.2803     | 1.5552 | 0.81     | 0.7961   | 0.1322 | 0.0484 |
| No log        | 37.0  | 481  | 3.9455          | 0.81     | 0.2838     | 1.6590 | 0.81     | 0.7989   | 0.1423 | 0.0531 |
| No log        | 38.0  | 494  | 3.8957          | 0.82     | 0.2726     | 1.5431 | 0.82     | 0.8072   | 0.1409 | 0.0482 |
| 3.5636        | 39.0  | 507  | 3.9710          | 0.81     | 0.2979     | 1.7156 | 0.81     | 0.7989   | 0.1399 | 0.0524 |
| 3.5636        | 40.0  | 520  | 3.8789          | 0.83     | 0.2606     | 1.5452 | 0.83     | 0.8227   | 0.1323 | 0.0478 |
| 3.5636        | 41.0  | 533  | 3.9488          | 0.81     | 0.2839     | 1.6447 | 0.81     | 0.8016   | 0.1326 | 0.0509 |
| 3.5636        | 42.0  | 546  | 3.9774          | 0.815    | 0.2937     | 1.6907 | 0.815    | 0.8111   | 0.1291 | 0.0488 |
| 3.5636        | 43.0  | 559  | 3.9991          | 0.805    | 0.2877     | 1.7106 | 0.805    | 0.7979   | 0.1504 | 0.0518 |
| 3.5636        | 44.0  | 572  | 3.9634          | 0.815    | 0.2798     | 1.5063 | 0.815    | 0.8048   | 0.1272 | 0.0493 |
| 3.5636        | 45.0  | 585  | 4.0229          | 0.82     | 0.2904     | 1.6439 | 0.82     | 0.8156   | 0.1392 | 0.0511 |
| 3.5636        | 46.0  | 598  | 4.0206          | 0.82     | 0.2836     | 1.5407 | 0.82     | 0.8150   | 0.1233 | 0.0497 |
| 3.5636        | 47.0  | 611  | 4.0351          | 0.81     | 0.2835     | 1.7627 | 0.81     | 0.8003   | 0.1338 | 0.0486 |
| 3.5636        | 48.0  | 624  | 4.0646          | 0.82     | 0.2889     | 1.7694 | 0.82     | 0.8150   | 0.1341 | 0.0499 |
| 3.5636        | 49.0  | 637  | 4.0496          | 0.815    | 0.2828     | 1.7548 | 0.815    | 0.8071   | 0.1391 | 0.0477 |
| 3.5636        | 50.0  | 650  | 4.0914          | 0.815    | 0.2917     | 1.6381 | 0.815    | 0.8053   | 0.1310 | 0.0502 |
| 3.5636        | 51.0  | 663  | 4.0748          | 0.82     | 0.2866     | 1.5646 | 0.82     | 0.8148   | 0.1325 | 0.0483 |
| 3.5636        | 52.0  | 676  | 4.0921          | 0.82     | 0.2871     | 1.5732 | 0.82     | 0.8148   | 0.1381 | 0.0487 |
| 3.5636        | 53.0  | 689  | 4.1093          | 0.82     | 0.2886     | 1.6448 | 0.82     | 0.8147   | 0.1506 | 0.0481 |
| 3.5636        | 54.0  | 702  | 4.1200          | 0.82     | 0.2910     | 1.6446 | 0.82     | 0.8150   | 0.1335 | 0.0493 |
| 3.5636        | 55.0  | 715  | 4.1250          | 0.815    | 0.2901     | 1.5641 | 0.815    | 0.8098   | 0.1386 | 0.0491 |
| 3.5636        | 56.0  | 728  | 4.1340          | 0.82     | 0.2893     | 1.6575 | 0.82     | 0.8148   | 0.1298 | 0.0489 |
| 3.5636        | 57.0  | 741  | 4.1575          | 0.82     | 0.2935     | 1.6360 | 0.82     | 0.8150   | 0.1402 | 0.0499 |
| 3.5636        | 58.0  | 754  | 4.1495          | 0.82     | 0.2895     | 1.6349 | 0.82     | 0.8148   | 0.1398 | 0.0486 |
| 3.5636        | 59.0  | 767  | 4.1582          | 0.82     | 0.2909     | 1.6327 | 0.82     | 0.8150   | 0.1341 | 0.0487 |
| 3.5636        | 60.0  | 780  | 4.1720          | 0.82     | 0.2923     | 1.5746 | 0.82     | 0.8150   | 0.1386 | 0.0493 |
| 3.5636        | 61.0  | 793  | 4.1848          | 0.825    | 0.2940     | 1.6424 | 0.825    | 0.8181   | 0.1380 | 0.0494 |
| 3.5636        | 62.0  | 806  | 4.1880          | 0.82     | 0.2939     | 1.6323 | 0.82     | 0.8148   | 0.1389 | 0.0488 |
| 3.5636        | 63.0  | 819  | 4.1825          | 0.82     | 0.2916     | 1.6920 | 0.82     | 0.8150   | 0.1421 | 0.0483 |
| 3.5636        | 64.0  | 832  | 4.2037          | 0.82     | 0.2946     | 1.6365 | 0.82     | 0.8148   | 0.1393 | 0.0493 |
| 3.5636        | 65.0  | 845  | 4.2096          | 0.82     | 0.2948     | 1.5852 | 0.82     | 0.8150   | 0.1462 | 0.0493 |
| 3.5636        | 66.0  | 858  | 4.2191          | 0.82     | 0.2962     | 1.6349 | 0.82     | 0.8150   | 0.1491 | 0.0495 |
| 3.5636        | 67.0  | 871  | 4.2189          | 0.82     | 0.2948     | 1.6389 | 0.82     | 0.8150   | 0.1313 | 0.0489 |
| 3.5636        | 68.0  | 884  | 4.2243          | 0.82     | 0.2947     | 1.6322 | 0.82     | 0.8150   | 0.1398 | 0.0491 |
| 3.5636        | 69.0  | 897  | 4.2334          | 0.82     | 0.2957     | 1.6398 | 0.82     | 0.8150   | 0.1355 | 0.0491 |
| 3.5636        | 70.0  | 910  | 4.2312          | 0.82     | 0.2943     | 1.6395 | 0.82     | 0.8148   | 0.1419 | 0.0484 |
| 3.5636        | 71.0  | 923  | 4.2376          | 0.82     | 0.2956     | 1.6389 | 0.82     | 0.8150   | 0.1372 | 0.0490 |
| 3.5636        | 72.0  | 936  | 4.2420          | 0.82     | 0.2951     | 1.6368 | 0.82     | 0.8150   | 0.1427 | 0.0489 |
| 3.5636        | 73.0  | 949  | 4.2464          | 0.82     | 0.2946     | 1.6375 | 0.82     | 0.8150   | 0.1449 | 0.0488 |
| 3.5636        | 74.0  | 962  | 4.2540          | 0.82     | 0.2956     | 1.6364 | 0.82     | 0.8150   | 0.1476 | 0.0489 |
| 3.5636        | 75.0  | 975  | 4.2579          | 0.82     | 0.2955     | 1.6361 | 0.82     | 0.8150   | 0.1361 | 0.0491 |
| 3.5636        | 76.0  | 988  | 4.2638          | 0.82     | 0.2960     | 1.6368 | 0.82     | 0.8150   | 0.1483 | 0.0490 |
| 3.1969        | 77.0  | 1001 | 4.2653          | 0.82     | 0.2956     | 1.6950 | 0.82     | 0.8150   | 0.1509 | 0.0487 |
| 3.1969        | 78.0  | 1014 | 4.2708          | 0.82     | 0.2965     | 1.6365 | 0.82     | 0.8150   | 0.1398 | 0.0490 |
| 3.1969        | 79.0  | 1027 | 4.2761          | 0.82     | 0.2968     | 1.6400 | 0.82     | 0.8150   | 0.1399 | 0.0490 |
| 3.1969        | 80.0  | 1040 | 4.2792          | 0.82     | 0.2969     | 1.6381 | 0.82     | 0.8150   | 0.1425 | 0.0490 |
| 3.1969        | 81.0  | 1053 | 4.2801          | 0.82     | 0.2963     | 1.6949 | 0.82     | 0.8148   | 0.1477 | 0.0487 |
| 3.1969        | 82.0  | 1066 | 4.2841          | 0.82     | 0.2968     | 1.6459 | 0.82     | 0.8150   | 0.1425 | 0.0488 |
| 3.1969        | 83.0  | 1079 | 4.2864          | 0.82     | 0.2968     | 1.6378 | 0.82     | 0.8150   | 0.1421 | 0.0489 |
| 3.1969        | 84.0  | 1092 | 4.2918          | 0.82     | 0.2973     | 1.6398 | 0.82     | 0.8150   | 0.1373 | 0.0491 |
| 3.1969        | 85.0  | 1105 | 4.2930          | 0.82     | 0.2970     | 1.6408 | 0.82     | 0.8150   | 0.1486 | 0.0490 |
| 3.1969        | 86.0  | 1118 | 4.2956          | 0.82     | 0.2973     | 1.6420 | 0.82     | 0.8150   | 0.1427 | 0.0489 |
| 3.1969        | 87.0  | 1131 | 4.2988          | 0.82     | 0.2976     | 1.6390 | 0.82     | 0.8150   | 0.1374 | 0.0491 |
| 3.1969        | 88.0  | 1144 | 4.2995          | 0.82     | 0.2974     | 1.6509 | 0.82     | 0.8150   | 0.1427 | 0.0489 |
| 3.1969        | 89.0  | 1157 | 4.3026          | 0.82     | 0.2976     | 1.6418 | 0.82     | 0.8150   | 0.1375 | 0.0490 |
| 3.1969        | 90.0  | 1170 | 4.3028          | 0.82     | 0.2974     | 1.6445 | 0.82     | 0.8150   | 0.1453 | 0.0488 |
| 3.1969        | 91.0  | 1183 | 4.3054          | 0.82     | 0.2976     | 1.6443 | 0.82     | 0.8150   | 0.1402 | 0.0488 |
| 3.1969        | 92.0  | 1196 | 4.3060          | 0.82     | 0.2975     | 1.6530 | 0.82     | 0.8150   | 0.1454 | 0.0488 |
| 3.1969        | 93.0  | 1209 | 4.3074          | 0.82     | 0.2975     | 1.6961 | 0.82     | 0.8150   | 0.1453 | 0.0488 |
| 3.1969        | 94.0  | 1222 | 4.3078          | 0.82     | 0.2975     | 1.6638 | 0.82     | 0.8150   | 0.1454 | 0.0488 |
| 3.1969        | 95.0  | 1235 | 4.3092          | 0.82     | 0.2976     | 1.6959 | 0.82     | 0.8150   | 0.1454 | 0.0488 |
| 3.1969        | 96.0  | 1248 | 4.3094          | 0.82     | 0.2976     | 1.6957 | 0.82     | 0.8150   | 0.1454 | 0.0488 |
| 3.1969        | 97.0  | 1261 | 4.3106          | 0.82     | 0.2977     | 1.6961 | 0.82     | 0.8150   | 0.1455 | 0.0489 |
| 3.1969        | 98.0  | 1274 | 4.3110          | 0.82     | 0.2977     | 1.6960 | 0.82     | 0.8150   | 0.1455 | 0.0488 |
| 3.1969        | 99.0  | 1287 | 4.3111          | 0.82     | 0.2977     | 1.6959 | 0.82     | 0.8150   | 0.1454 | 0.0488 |
| 3.1969        | 100.0 | 1300 | 4.3111          | 0.82     | 0.2977     | 1.6959 | 0.82     | 0.8150   | 0.1454 | 0.0488 |
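
For reference, the `Ece` column reports expected calibration error. A minimal sketch of the standard binned estimator, assuming equal-width confidence bins (not necessarily the exact implementation used to produce the numbers above):

```python
import numpy as np

def expected_calibration_error(confidences, predictions, labels, n_bins=10):
    """Binned ECE: bin-weighted average of |accuracy - mean confidence| over confidence bins."""
    confidences = np.asarray(confidences, dtype=float)
    predictions = np.asarray(predictions)
    labels = np.asarray(labels)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            bin_acc = (predictions[in_bin] == labels[in_bin]).mean()
            bin_conf = confidences[in_bin].mean()
            ece += in_bin.mean() * abs(bin_acc - bin_conf)
    return ece
```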

### Framework versions