---
tags:
- generated_from_trainer
---


# dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_kd

This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset. It achieves the results reported in the training results table below on the evaluation set.

## Model description

More information needed
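The `_kd` suffix in the model name suggests this ViT-tiny student was trained with knowledge distillation from a larger DiT teacher, although the card does not document the recipe. As a hedged sketch only (the actual loss, temperature, and weighting used in training are not recorded here), the standard softened-logit distillation loss of Hinton et al. can be written in plain Python as:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between the softened teacher and student
    distributions, scaled by T^2 so gradients keep the same magnitude
    as the hard-label loss."""
    p = softmax(teacher_logits, temperature)  # teacher (target)
    q = softmax(student_logits, temperature)  # student
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Identical logits give zero loss; diverging logits give a positive loss.
zero = distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
positive = distillation_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0])
```

In practice this term is usually mixed with the ordinary cross-entropy on the ground-truth labels; the mixing weight here is unknown.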

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll    | F1 Micro | F1 Macro | Ece    | Aurc   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 1.0   | 125  | 1.2844          | 0.5403   | 0.5889     | 3.0582 | 0.5403   | 0.5275   | 0.0742 | 0.2209 |
| No log        | 2.0   | 250  | 0.9687          | 0.655    | 0.4587     | 2.4358 | 0.655    | 0.6414   | 0.0559 | 0.1296 |
| No log        | 3.0   | 375  | 0.8401          | 0.7063   | 0.4019     | 2.2308 | 0.7063   | 0.7008   | 0.0588 | 0.0990 |
| 1.234         | 4.0   | 500  | 0.8080          | 0.7145   | 0.3874     | 2.1628 | 0.7145   | 0.7163   | 0.0487 | 0.0951 |
| 1.234         | 5.0   | 625  | 0.7772          | 0.7238   | 0.3755     | 2.0380 | 0.7237   | 0.7167   | 0.0421 | 0.0914 |
| 1.234         | 6.0   | 750  | 0.7530          | 0.7498   | 0.3484     | 2.1346 | 0.7498   | 0.7464   | 0.0477 | 0.0774 |
| 1.234         | 7.0   | 875  | 0.7034          | 0.7652   | 0.3267     | 2.0596 | 0.7652   | 0.7664   | 0.0467 | 0.0678 |
| 0.3976        | 8.0   | 1000 | 0.7390          | 0.7715   | 0.3350     | 2.0568 | 0.7715   | 0.7704   | 0.0448 | 0.0763 |
| 0.3976        | 9.0   | 1125 | 0.7019          | 0.7762   | 0.3209     | 2.0168 | 0.7762   | 0.7768   | 0.0556 | 0.0769 |
| 0.3976        | 10.0  | 1250 | 0.7318          | 0.7668   | 0.3346     | 2.1148 | 0.7668   | 0.7699   | 0.0529 | 0.0792 |
| 0.3976        | 11.0  | 1375 | 0.7083          | 0.7782   | 0.3213     | 2.0671 | 0.7782   | 0.7775   | 0.0452 | 0.0756 |
| 0.1591        | 12.0  | 1500 | 0.7535          | 0.7668   | 0.3424     | 2.1407 | 0.7668   | 0.7636   | 0.0564 | 0.0845 |
| 0.1591        | 13.0  | 1625 | 0.7117          | 0.775    | 0.3288     | 2.0935 | 0.775    | 0.7766   | 0.0525 | 0.0785 |
| 0.1591        | 14.0  | 1750 | 0.6421          | 0.785    | 0.3039     | 1.9939 | 0.785    | 0.7860   | 0.0512 | 0.0643 |
| 0.1591        | 15.0  | 1875 | 0.6475          | 0.7865   | 0.3050     | 1.9301 | 0.7865   | 0.7867   | 0.0552 | 0.0636 |
| 0.1125        | 16.0  | 2000 | 0.6477          | 0.7893   | 0.3064     | 1.9442 | 0.7893   | 0.7920   | 0.0556 | 0.0684 |
| 0.1125        | 17.0  | 2125 | 0.6509          | 0.7883   | 0.3113     | 1.8957 | 0.7883   | 0.7907   | 0.0498 | 0.0710 |
| 0.1125        | 18.0  | 2250 | 0.6291          | 0.7925   | 0.3038     | 1.8697 | 0.7925   | 0.7963   | 0.0512 | 0.0677 |
| 0.1125        | 19.0  | 2375 | 0.6279          | 0.7963   | 0.2992     | 1.8155 | 0.7963   | 0.7950   | 0.0478 | 0.0647 |
| 0.095         | 20.0  | 2500 | 0.6246          | 0.7937   | 0.3008     | 1.7925 | 0.7937   | 0.7946   | 0.0595 | 0.0659 |
| 0.095         | 21.0  | 2625 | 0.6149          | 0.7953   | 0.2962     | 1.8237 | 0.7953   | 0.7951   | 0.0547 | 0.0590 |
| 0.095         | 22.0  | 2750 | 0.6196          | 0.7953   | 0.3000     | 1.8031 | 0.7953   | 0.7969   | 0.0567 | 0.0643 |
| 0.095         | 23.0  | 2875 | 0.6023          | 0.798    | 0.2932     | 1.7663 | 0.798    | 0.7983   | 0.0497 | 0.0616 |
| 0.0829        | 24.0  | 3000 | 0.6107          | 0.7943   | 0.2951     | 1.7755 | 0.7943   | 0.7958   | 0.0564 | 0.0581 |
| 0.0829        | 25.0  | 3125 | 0.5986          | 0.8015   | 0.2930     | 1.7243 | 0.8015   | 0.8027   | 0.0565 | 0.0574 |
| 0.0829        | 26.0  | 3250 | 0.5899          | 0.8005   | 0.2886     | 1.7304 | 0.8005   | 0.8021   | 0.0546 | 0.0560 |
| 0.0829        | 27.0  | 3375 | 0.5836          | 0.8023   | 0.2846     | 1.6865 | 0.8023   | 0.8024   | 0.0479 | 0.0561 |
| 0.074         | 28.0  | 3500 | 0.5824          | 0.8047   | 0.2850     | 1.6817 | 0.8047   | 0.8060   | 0.0524 | 0.0559 |
| 0.074         | 29.0  | 3625 | 0.5760          | 0.8063   | 0.2822     | 1.6505 | 0.8062   | 0.8065   | 0.0500 | 0.0546 |
| 0.074         | 30.0  | 3750 | 0.5819          | 0.8065   | 0.2843     | 1.6667 | 0.8065   | 0.8079   | 0.0563 | 0.0544 |
| 0.074         | 31.0  | 3875 | 0.5800          | 0.8045   | 0.2841     | 1.6658 | 0.8045   | 0.8059   | 0.0511 | 0.0548 |
| 0.0668        | 32.0  | 4000 | 0.5828          | 0.8053   | 0.2841     | 1.6883 | 0.8053   | 0.8054   | 0.0559 | 0.0547 |
| 0.0668        | 33.0  | 4125 | 0.5802          | 0.8037   | 0.2838     | 1.6669 | 0.8037   | 0.8038   | 0.0572 | 0.0545 |
| 0.0668        | 34.0  | 4250 | 0.5772          | 0.8067   | 0.2821     | 1.6588 | 0.8067   | 0.8083   | 0.0520 | 0.0525 |
| 0.0668        | 35.0  | 4375 | 0.5745          | 0.807    | 0.2812     | 1.6524 | 0.807    | 0.8072   | 0.0528 | 0.0528 |
| 0.0631        | 36.0  | 4500 | 0.5770          | 0.8063   | 0.2826     | 1.6433 | 0.8062   | 0.8071   | 0.0559 | 0.0528 |
| 0.0631        | 37.0  | 4625 | 0.5782          | 0.8007   | 0.2837     | 1.5953 | 0.8007   | 0.8021   | 0.0581 | 0.0541 |
| 0.0631        | 38.0  | 4750 | 0.5780          | 0.8047   | 0.2829     | 1.6275 | 0.8047   | 0.8052   | 0.0540 | 0.0521 |
| 0.0631        | 39.0  | 4875 | 0.5759          | 0.8055   | 0.2817     | 1.6162 | 0.8055   | 0.8065   | 0.0528 | 0.0529 |
| 0.0612        | 40.0  | 5000 | 0.5770          | 0.8047   | 0.2825     | 1.6131 | 0.8047   | 0.8051   | 0.0575 | 0.0524 |
| 0.0612        | 41.0  | 5125 | 0.5771          | 0.8043   | 0.2819     | 1.6015 | 0.8043   | 0.8048   | 0.0562 | 0.0519 |
| 0.0612        | 42.0  | 5250 | 0.5776          | 0.8043   | 0.2825     | 1.6152 | 0.8043   | 0.8047   | 0.0566 | 0.0527 |
| 0.0612        | 43.0  | 5375 | 0.5793          | 0.8057   | 0.2830     | 1.6196 | 0.8057   | 0.8065   | 0.0538 | 0.0527 |
| 0.06          | 44.0  | 5500 | 0.5801          | 0.8053   | 0.2835     | 1.6183 | 0.8053   | 0.8060   | 0.0618 | 0.0527 |
| 0.06          | 45.0  | 5625 | 0.5800          | 0.805    | 0.2831     | 1.6057 | 0.805    | 0.8055   | 0.0568 | 0.0530 |
| 0.06          | 46.0  | 5750 | 0.5812          | 0.805    | 0.2836     | 1.6034 | 0.805    | 0.8056   | 0.0577 | 0.0529 |
| 0.06          | 47.0  | 5875 | 0.5809          | 0.805    | 0.2834     | 1.6164 | 0.805    | 0.8056   | 0.0580 | 0.0526 |
| 0.0593        | 48.0  | 6000 | 0.5810          | 0.8057   | 0.2834     | 1.6108 | 0.8057   | 0.8064   | 0.0617 | 0.0525 |
| 0.0593        | 49.0  | 6125 | 0.5812          | 0.8053   | 0.2836     | 1.6140 | 0.8053   | 0.8058   | 0.0570 | 0.0527 |
| 0.0593        | 50.0  | 6250 | 0.5815          | 0.8055   | 0.2836     | 1.6135 | 0.8055   | 0.8061   | 0.0597 | 0.0526 |
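Alongside accuracy and the F1 scores, the table tracks calibration metrics (Brier loss, ECE). As a reference for how two of these are typically computed (the exact binning and averaging used by this training script are assumptions, not documented here), a minimal plain-Python sketch:

```python
def brier_score(probs, labels):
    """Mean multiclass Brier score: squared distance between the
    predicted distribution and the one-hot label, averaged over samples."""
    total = 0.0
    for p, y in zip(probs, labels):
        total += sum((pk - (1.0 if k == y else 0.0)) ** 2
                     for k, pk in enumerate(p))
    return total / len(probs)

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: bucket predictions by top-class confidence, then average
    |accuracy - confidence| over buckets, weighted by bucket size."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        conf = max(p)
        pred = p.index(conf)
        idx = min(int(conf * n_bins), n_bins - 1)  # conf == 1.0 goes in the last bin
        bins[idx].append((conf, pred == y))
    n = len(probs)
    ece = 0.0
    for b in bins:
        if not b:
            continue
        avg_conf = sum(c for c, _ in b) / len(b)
        acc = sum(1 for _, ok in b if ok) / len(b)
        ece += len(b) / n * abs(acc - avg_conf)
    return ece
```

A fully confident, correct prediction contributes zero to both metrics; a confident mistake inflates both, which is why ECE can worsen in later epochs even as accuracy plateaus.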

### Framework versions