# dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_simkd
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224). The dataset was not recorded by the Trainer; judging from the model name, the student was distilled via SimKD from a DiT-base teacher fine-tuned on RVL-CDIP, using an RVL-CDIP-derived set (`tiny_rvl_cdip`, NK1000). It achieves the following results on the evaluation set (an inference sketch follows the metrics):
- Loss: 0.0914
- Accuracy: 0.8157
- Brier Loss: 0.5152
- NLL: 1.5529
- F1 Micro: 0.8157
- F1 Macro: 0.8167
- ECE: 0.4370
- AURC: 0.1010
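As a minimal usage sketch (hedged: the repository id below is inferred from the model name and may need to be prefixed with the actual Hub namespace, and `document.png` is a placeholder input):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repo id; substitute the checkpoint's actual Hub path.
model_id = "dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_simkd"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("document.png").convert("RGB")  # placeholder document image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring class index back to its label string.
print(model.config.id2label[logits.argmax(-1).item()])
```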
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a rough `TrainingArguments` equivalent follows the list):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
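These settings map roughly onto the 🤗 `TrainingArguments` below. This is a hedged sketch: `output_dir` is a placeholder, and the SimKD distillation loss would live in a custom `Trainer` subclass that is not shown here.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_simkd",
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8, as listed above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```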
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
0.149 | 1.0 | 1000 | 0.1482 | 0.1598 | 0.9334 | 6.4910 | 0.1598 | 0.1128 | 0.0991 | 0.7250 |
0.1278 | 2.0 | 2000 | 0.1248 | 0.5295 | 0.8391 | 3.1275 | 0.5295 | 0.4951 | 0.4011 | 0.2355 |
0.1156 | 3.0 | 3000 | 0.1146 | 0.6388 | 0.7532 | 2.4008 | 0.6388 | 0.6238 | 0.4480 | 0.1552 |
0.1094 | 4.0 | 4000 | 0.1089 | 0.7085 | 0.7069 | 2.1449 | 0.7085 | 0.7039 | 0.4850 | 0.1225 |
0.105 | 5.0 | 5000 | 0.1063 | 0.7252 | 0.6970 | 2.0868 | 0.7252 | 0.7291 | 0.4969 | 0.1087 |
0.1011 | 6.0 | 6000 | 0.1038 | 0.7472 | 0.6562 | 2.1759 | 0.7472 | 0.7478 | 0.4844 | 0.1095 |
0.097 | 7.0 | 7000 | 0.1043 | 0.7415 | 0.6509 | 2.5078 | 0.7415 | 0.7404 | 0.4668 | 0.1364 |
0.0946 | 8.0 | 8000 | 0.1022 | 0.7508 | 0.6416 | 2.1930 | 0.7508 | 0.7550 | 0.4714 | 0.1300 |
0.0916 | 9.0 | 9000 | 0.0995 | 0.7642 | 0.6271 | 1.9398 | 0.7642 | 0.7691 | 0.4791 | 0.1086 |
0.0901 | 10.0 | 10000 | 0.1013 | 0.747 | 0.6277 | 2.3724 | 0.747 | 0.7538 | 0.4538 | 0.1227 |
0.0881 | 11.0 | 11000 | 0.0991 | 0.7752 | 0.6037 | 1.9848 | 0.7752 | 0.7784 | 0.4696 | 0.1054 |
0.0868 | 12.0 | 12000 | 0.0983 | 0.7738 | 0.6074 | 2.0011 | 0.7738 | 0.7757 | 0.4741 | 0.0996 |
0.0855 | 13.0 | 13000 | 0.0977 | 0.7833 | 0.5864 | 1.9790 | 0.7833 | 0.7868 | 0.4633 | 0.1068 |
0.0845 | 14.0 | 14000 | 0.0986 | 0.782 | 0.5928 | 2.0415 | 0.782 | 0.7847 | 0.4645 | 0.1158 |
0.083 | 15.0 | 15000 | 0.0974 | 0.78 | 0.5793 | 2.0235 | 0.78 | 0.7857 | 0.4455 | 0.1243 |
0.0821 | 16.0 | 16000 | 0.0975 | 0.7823 | 0.5776 | 2.0363 | 0.7823 | 0.7859 | 0.4462 | 0.1238 |
0.0811 | 17.0 | 17000 | 0.0962 | 0.7883 | 0.5667 | 2.0085 | 0.7883 | 0.7907 | 0.4474 | 0.1108 |
0.0803 | 18.0 | 18000 | 0.0969 | 0.7833 | 0.5720 | 2.0028 | 0.7833 | 0.7840 | 0.4421 | 0.1276 |
0.0801 | 19.0 | 19000 | 0.0962 | 0.7823 | 0.5727 | 1.9412 | 0.7823 | 0.7847 | 0.4447 | 0.1182 |
0.0794 | 20.0 | 20000 | 0.0961 | 0.7847 | 0.5681 | 1.9442 | 0.7847 | 0.7851 | 0.4449 | 0.1121 |
0.0786 | 21.0 | 21000 | 0.0993 | 0.7612 | 0.5748 | 2.2878 | 0.7612 | 0.7627 | 0.4088 | 0.1494 |
0.0776 | 22.0 | 22000 | 0.0947 | 0.797 | 0.5491 | 1.8933 | 0.797 | 0.7986 | 0.4379 | 0.1211 |
0.0771 | 23.0 | 23000 | 0.0955 | 0.7893 | 0.5564 | 1.8974 | 0.7893 | 0.7918 | 0.4391 | 0.1124 |
0.0772 | 24.0 | 24000 | 0.0956 | 0.788 | 0.5524 | 1.9541 | 0.788 | 0.7898 | 0.4309 | 0.1166 |
0.0768 | 25.0 | 25000 | 0.0970 | 0.7748 | 0.5568 | 2.0627 | 0.7748 | 0.7776 | 0.4152 | 0.1264 |
0.0765 | 26.0 | 26000 | 0.0939 | 0.7975 | 0.5448 | 1.7874 | 0.7975 | 0.7996 | 0.4397 | 0.1086 |
0.0759 | 27.0 | 27000 | 0.0944 | 0.797 | 0.5425 | 1.8354 | 0.797 | 0.7982 | 0.4328 | 0.1185 |
0.0755 | 28.0 | 28000 | 0.0938 | 0.7993 | 0.5399 | 1.6911 | 0.7993 | 0.7993 | 0.4391 | 0.1025 |
0.0754 | 29.0 | 29000 | 0.0945 | 0.797 | 0.5387 | 1.8083 | 0.797 | 0.7980 | 0.4323 | 0.1117 |
0.075 | 30.0 | 30000 | 0.0941 | 0.8005 | 0.5353 | 1.7803 | 0.8005 | 0.8020 | 0.4318 | 0.1128 |
0.0745 | 31.0 | 31000 | 0.0928 | 0.805 | 0.5282 | 1.6621 | 0.805 | 0.8070 | 0.4338 | 0.1107 |
0.0747 | 32.0 | 32000 | 0.0935 | 0.806 | 0.5316 | 1.6745 | 0.806 | 0.8066 | 0.4368 | 0.1111 |
0.0743 | 33.0 | 33000 | 0.0928 | 0.8095 | 0.5288 | 1.7115 | 0.8095 | 0.8096 | 0.4401 | 0.1045 |
0.074 | 34.0 | 34000 | 0.0927 | 0.8063 | 0.5286 | 1.6801 | 0.8062 | 0.8064 | 0.4378 | 0.1001 |
0.0734 | 35.0 | 35000 | 0.0925 | 0.8083 | 0.5260 | 1.6524 | 0.8083 | 0.8102 | 0.4364 | 0.1066 |
0.0734 | 36.0 | 36000 | 0.0924 | 0.8087 | 0.5252 | 1.6727 | 0.8087 | 0.8106 | 0.4352 | 0.1077 |
0.0733 | 37.0 | 37000 | 0.0920 | 0.8133 | 0.5215 | 1.6062 | 0.8133 | 0.8147 | 0.4399 | 0.1000 |
0.0733 | 38.0 | 38000 | 0.0924 | 0.8083 | 0.5243 | 1.6319 | 0.8083 | 0.8100 | 0.4343 | 0.1063 |
0.0732 | 39.0 | 39000 | 0.0921 | 0.8105 | 0.5222 | 1.5823 | 0.8105 | 0.8106 | 0.4363 | 0.1034 |
0.073 | 40.0 | 40000 | 0.0917 | 0.8157 | 0.5203 | 1.5771 | 0.8157 | 0.8163 | 0.4414 | 0.1014 |
0.0728 | 41.0 | 41000 | 0.0916 | 0.8153 | 0.5192 | 1.5726 | 0.8153 | 0.8163 | 0.4395 | 0.1033 |
0.0729 | 42.0 | 42000 | 0.0916 | 0.8133 | 0.5188 | 1.5495 | 0.8133 | 0.8145 | 0.4392 | 0.1026 |
0.0726 | 43.0 | 43000 | 0.0917 | 0.816 | 0.5185 | 1.5969 | 0.816 | 0.8169 | 0.4395 | 0.1054 |
0.0728 | 44.0 | 44000 | 0.0914 | 0.8163 | 0.5164 | 1.5257 | 0.8163 | 0.8167 | 0.4388 | 0.1023 |
0.0725 | 45.0 | 45000 | 0.0914 | 0.8153 | 0.5165 | 1.5699 | 0.8153 | 0.8161 | 0.4386 | 0.1012 |
0.0723 | 46.0 | 46000 | 0.0915 | 0.816 | 0.5160 | 1.5653 | 0.816 | 0.8171 | 0.4386 | 0.1008 |
0.0723 | 47.0 | 47000 | 0.0914 | 0.8155 | 0.5159 | 1.5478 | 0.8155 | 0.8165 | 0.4380 | 0.0997 |
0.0721 | 48.0 | 48000 | 0.0914 | 0.816 | 0.5156 | 1.5579 | 0.816 | 0.8169 | 0.4379 | 0.1006 |
0.0725 | 49.0 | 49000 | 0.0914 | 0.8155 | 0.5153 | 1.5636 | 0.8155 | 0.8165 | 0.4369 | 0.1009 |
0.0721 | 50.0 | 50000 | 0.0914 | 0.8157 | 0.5152 | 1.5529 | 0.8157 | 0.8167 | 0.4370 | 0.1010 |
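The Brier Loss, NLL, ECE, and AURC columns measure probability quality and calibration rather than raw accuracy. As a hedged sketch of how Brier score and ECE can be computed from softmax outputs (assuming `probs` is an `(N, C)` array of predicted probabilities and `labels` an `(N,)` array of class ids; normalization conventions vary across implementations, so this may not reproduce the exact numbers above):

```python
import numpy as np

def brier_score(probs, labels):
    """Mean squared error between predicted probabilities and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=15):
    """Weighted average gap between confidence and accuracy over confidence bins."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            # Bin weight (fraction of samples) times |accuracy - confidence|.
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return ece
```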
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2