---
tags:
- generated_from_trainer
---


# dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_kd

This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the small_rvl_cdip-NK1000 dataset (the dataset name was not recorded by the Trainer; judging from the model name, this is a subset of RVL-CDIP). It achieves the following results on the evaluation set (final epoch, taken from the training results table below):

- Loss: 0.5198
- Accuracy: 0.833
- Brier Loss: 0.2560
- NLL: 1.1465
- F1 Micro: 0.833
- F1 Macro: 0.8328
- ECE: 0.0719
- AURC: 0.0425

## Model description

More information needed

## Intended uses & limitations

More information needed
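
No usage guidance was provided by the author. As a minimal, hedged sketch, the checkpoint should be loadable with the standard Hugging Face Transformers image-classification API; the repository id below is a placeholder and must be replaced with the actual Hub id or local checkpoint path.

```python
# Minimal inference sketch (assumed API usage; repository id is a placeholder).
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

MODEL_ID = "dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_kd"  # placeholder: substitute the real repo id or path

processor = AutoImageProcessor.from_pretrained(MODEL_ID)
model = AutoModelForImageClassification.from_pretrained(MODEL_ID)
model.eval()

# Classify a single document image.
image = Image.open("document_page.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```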

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
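
The hyperparameter list was not captured when this card was generated. Purely as an illustration of how such a run is typically configured with `TrainingArguments`, the sketch below uses placeholder values; only the epoch count and logging interval can be read off the results table, everything else is an assumption.

```python
# Illustrative configuration sketch only: values marked "placeholder" are NOT
# the hyperparameters actually used for this model.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_kd",
    num_train_epochs=50,             # matches the 50 epochs in the results table
    logging_steps=500,               # training loss above is logged every 500 steps
    evaluation_strategy="epoch",     # the table reports one evaluation per epoch
    save_strategy="epoch",
    per_device_train_batch_size=16,  # placeholder
    learning_rate=5e-5,              # placeholder
    weight_decay=0.0,                # placeholder
)
```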

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 1.0 | 125 | 1.0816 | 0.622 | 0.5119 | 2.5026 | 0.622 | 0.6194 | 0.0740 | 0.1657 |
| No log | 2.0 | 250 | 0.8028 | 0.715 | 0.3936 | 2.1454 | 0.715 | 0.7158 | 0.0651 | 0.1017 |
| No log | 3.0 | 375 | 0.7104 | 0.7505 | 0.3455 | 2.0393 | 0.7505 | 0.7464 | 0.0456 | 0.0765 |
| 0.9841 | 4.0 | 500 | 0.6747 | 0.7682 | 0.3267 | 1.9784 | 0.7682 | 0.7703 | 0.0455 | 0.0682 |
| 0.9841 | 5.0 | 625 | 0.6619 | 0.7782 | 0.3169 | 1.9299 | 0.7782 | 0.7752 | 0.0391 | 0.0649 |
| 0.9841 | 6.0 | 750 | 0.6416 | 0.7897 | 0.3058 | 1.8240 | 0.7897 | 0.7923 | 0.0483 | 0.0683 |
| 0.9841 | 7.0 | 875 | 0.6481 | 0.786 | 0.3016 | 1.8855 | 0.786 | 0.7855 | 0.0501 | 0.0640 |
| 0.259 | 8.0 | 1000 | 0.6273 | 0.7963 | 0.2970 | 1.7135 | 0.7963 | 0.7970 | 0.0454 | 0.0633 |
| 0.259 | 9.0 | 1125 | 0.6484 | 0.7927 | 0.3044 | 1.7079 | 0.7927 | 0.7911 | 0.0601 | 0.0647 |
| 0.259 | 10.0 | 1250 | 0.6504 | 0.7925 | 0.3046 | 1.8241 | 0.7925 | 0.7931 | 0.0577 | 0.0674 |
| 0.259 | 11.0 | 1375 | 0.6137 | 0.7975 | 0.2914 | 1.6742 | 0.7975 | 0.7996 | 0.0567 | 0.0675 |
| 0.133 | 12.0 | 1500 | 0.6092 | 0.7993 | 0.2928 | 1.6077 | 0.7993 | 0.8023 | 0.0600 | 0.0654 |
| 0.133 | 13.0 | 1625 | 0.5905 | 0.805 | 0.2842 | 1.5790 | 0.805 | 0.8074 | 0.0589 | 0.0623 |
| 0.133 | 14.0 | 1750 | 0.5794 | 0.8077 | 0.2797 | 1.4947 | 0.8077 | 0.8090 | 0.0533 | 0.0579 |
| 0.133 | 15.0 | 1875 | 0.5683 | 0.8075 | 0.2777 | 1.4518 | 0.8075 | 0.8076 | 0.0594 | 0.0565 |
| 0.1032 | 16.0 | 2000 | 0.5762 | 0.8125 | 0.2794 | 1.3998 | 0.8125 | 0.8146 | 0.0633 | 0.0551 |
| 0.1032 | 17.0 | 2125 | 0.5529 | 0.8115 | 0.2748 | 1.3595 | 0.8115 | 0.8126 | 0.0638 | 0.0519 |
| 0.1032 | 18.0 | 2250 | 0.5669 | 0.8133 | 0.2759 | 1.3803 | 0.8133 | 0.8143 | 0.0603 | 0.0547 |
| 0.1032 | 19.0 | 2375 | 0.5549 | 0.8177 | 0.2716 | 1.3258 | 0.8178 | 0.8186 | 0.0625 | 0.0527 |
| 0.0832 | 20.0 | 2500 | 0.5576 | 0.8147 | 0.2737 | 1.3814 | 0.8148 | 0.8183 | 0.0627 | 0.0513 |
| 0.0832 | 21.0 | 2625 | 0.5336 | 0.8247 | 0.2609 | 1.2941 | 0.8247 | 0.8243 | 0.0626 | 0.0476 |
| 0.0832 | 22.0 | 2750 | 0.5276 | 0.8257 | 0.2595 | 1.2491 | 0.8257 | 0.8262 | 0.0633 | 0.0455 |
| 0.0832 | 23.0 | 2875 | 0.5313 | 0.8193 | 0.2603 | 1.2685 | 0.8193 | 0.8198 | 0.0618 | 0.0466 |
| 0.0715 | 24.0 | 3000 | 0.5208 | 0.826 | 0.2575 | 1.2280 | 0.826 | 0.8266 | 0.0644 | 0.0468 |
| 0.0715 | 25.0 | 3125 | 0.5205 | 0.8233 | 0.2591 | 1.2235 | 0.8233 | 0.8235 | 0.0615 | 0.0459 |
| 0.0715 | 26.0 | 3250 | 0.5067 | 0.8293 | 0.2536 | 1.2028 | 0.8293 | 0.8298 | 0.0630 | 0.0433 |
| 0.0715 | 27.0 | 3375 | 0.5207 | 0.8245 | 0.2591 | 1.2148 | 0.8245 | 0.8256 | 0.0692 | 0.0449 |
| 0.0647 | 28.0 | 3500 | 0.5197 | 0.824 | 0.2596 | 1.1765 | 0.824 | 0.8242 | 0.0690 | 0.0469 |
| 0.0647 | 29.0 | 3625 | 0.5086 | 0.8315 | 0.2531 | 1.1762 | 0.8315 | 0.8319 | 0.0704 | 0.0428 |
| 0.0647 | 30.0 | 3750 | 0.5025 | 0.8313 | 0.2509 | 1.1560 | 0.8313 | 0.8314 | 0.0687 | 0.0439 |
| 0.0647 | 31.0 | 3875 | 0.5073 | 0.832 | 0.2527 | 1.1743 | 0.832 | 0.8323 | 0.0662 | 0.0426 |
| 0.0618 | 32.0 | 4000 | 0.5068 | 0.8303 | 0.2526 | 1.1644 | 0.8303 | 0.8304 | 0.0679 | 0.0422 |
| 0.0618 | 33.0 | 4125 | 0.5086 | 0.8325 | 0.2526 | 1.1658 | 0.8325 | 0.8326 | 0.0671 | 0.0415 |
| 0.0618 | 34.0 | 4250 | 0.5114 | 0.833 | 0.2540 | 1.1694 | 0.833 | 0.8326 | 0.0649 | 0.0440 |
| 0.0618 | 35.0 | 4375 | 0.5104 | 0.8305 | 0.2541 | 1.1399 | 0.8305 | 0.8309 | 0.0666 | 0.0426 |
| 0.0601 | 36.0 | 4500 | 0.5122 | 0.8307 | 0.2547 | 1.1755 | 0.8308 | 0.8309 | 0.0689 | 0.0435 |
| 0.0601 | 37.0 | 4625 | 0.5122 | 0.8323 | 0.2543 | 1.1448 | 0.8323 | 0.8326 | 0.0698 | 0.0429 |
| 0.0601 | 38.0 | 4750 | 0.5144 | 0.8307 | 0.2554 | 1.1444 | 0.8308 | 0.8308 | 0.0699 | 0.0414 |
| 0.0601 | 39.0 | 4875 | 0.5155 | 0.8307 | 0.2553 | 1.1524 | 0.8308 | 0.8308 | 0.0722 | 0.0430 |
| 0.0593 | 40.0 | 5000 | 0.5132 | 0.8315 | 0.2543 | 1.1554 | 0.8315 | 0.8318 | 0.0721 | 0.0423 |
| 0.0593 | 41.0 | 5125 | 0.5153 | 0.8335 | 0.2551 | 1.1557 | 0.8335 | 0.8332 | 0.0700 | 0.0423 |
| 0.0593 | 42.0 | 5250 | 0.5141 | 0.8313 | 0.2545 | 1.1530 | 0.8313 | 0.8314 | 0.0728 | 0.0419 |
| 0.0593 | 43.0 | 5375 | 0.5159 | 0.8313 | 0.2551 | 1.1434 | 0.8313 | 0.8312 | 0.0756 | 0.0425 |
| 0.0587 | 44.0 | 5500 | 0.5164 | 0.833 | 0.2548 | 1.1469 | 0.833 | 0.8329 | 0.0688 | 0.0428 |
| 0.0587 | 45.0 | 5625 | 0.5170 | 0.8325 | 0.2553 | 1.1486 | 0.8325 | 0.8324 | 0.0723 | 0.0426 |
| 0.0587 | 46.0 | 5750 | 0.5188 | 0.8325 | 0.2559 | 1.1478 | 0.8325 | 0.8324 | 0.0731 | 0.0423 |
| 0.0587 | 47.0 | 5875 | 0.5188 | 0.8325 | 0.2557 | 1.1515 | 0.8325 | 0.8323 | 0.0702 | 0.0424 |
| 0.0583 | 48.0 | 6000 | 0.5195 | 0.8327 | 0.2559 | 1.1477 | 0.8327 | 0.8325 | 0.0702 | 0.0427 |
| 0.0583 | 49.0 | 6125 | 0.5194 | 0.8325 | 0.2559 | 1.1464 | 0.8325 | 0.8324 | 0.0713 | 0.0426 |
| 0.0583 | 50.0 | 6250 | 0.5198 | 0.833 | 0.2560 | 1.1465 | 0.833 | 0.8328 | 0.0719 | 0.0425 |
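
Besides accuracy and F1, the table tracks probabilistic-quality metrics (Brier loss, NLL, ECE). For reference, a small NumPy sketch of how these are commonly computed from softmax probabilities is given below; it is an illustrative implementation, not necessarily the exact code used to produce the numbers above.

```python
import numpy as np

def brier_loss(probs, labels):
    """Multi-class Brier score: mean squared error between probabilities and one-hot labels."""
    one_hot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - one_hot) ** 2, axis=1))

def nll(probs, labels, eps=1e-12):
    """Average negative log-likelihood of the true class."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def ece(probs, labels, n_bins=15):
    """Expected calibration error over equal-width confidence bins."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    error = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            error += in_bin.mean() * abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
    return error
```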

### Framework versions