# vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_hint
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an RVL-CDIP subset (rvl_cdip-NK1000, per the model name). It achieves the following results on the evaluation set:
- Loss: 73.8137
- Accuracy: 0.8137
- Brier Loss: 0.3252
- NLL: 2.0673
- F1 Micro: 0.8137
- F1 Macro: 0.8140
- ECE: 0.1539
- AURC: 0.0483
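As a quick-start illustration, the checkpoint can be loaded for document-image classification as sketched below. The hub repo id and the input file name are placeholders inferred from the model name, not confirmed by this card:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id, assumed from the model name above; substitute the real hub path.
MODEL_ID = "<org>/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_hint"

processor = AutoImageProcessor.from_pretrained(MODEL_ID)
model = AutoModelForImageClassification.from_pretrained(MODEL_ID)
model.eval()

# Classify a single scanned-document page (hypothetical file name).
image = Image.open("document_page.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```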
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
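For reference, a minimal sketch of `TrainingArguments` matching the list above; the `output_dir` and any option not listed on this card are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_hint",  # placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    # "Adam with betas=(0.9,0.999) and epsilon=1e-08" maps to these fields
    # (the Trainer's default optimizer is AdamW with these settings).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```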
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 1.0   | 250   | 76.1878         | 0.5783   | 0.5528     | 2.6021 | 0.5783   | 0.5765   | 0.0527 | 0.2026 |
| 76.4250       | 2.0   | 500   | 75.1954         | 0.6558   | 0.4561     | 2.2844 | 0.6558   | 0.6549   | 0.0488 | 0.1337 |
| 76.4250       | 3.0   | 750   | 74.7574         | 0.7160   | 0.3935     | 2.2465 | 0.7160   | 0.7170   | 0.0489 | 0.0983 |
| 74.5686       | 4.0   | 1000  | 74.5759         | 0.7265   | 0.3815     | 2.1845 | 0.7265   | 0.7306   | 0.0445 | 0.0951 |
| 74.5686       | 5.0   | 1250  | 74.4539         | 0.7245   | 0.3774     | 2.2022 | 0.7245   | 0.7264   | 0.0560 | 0.0919 |
| 73.9702       | 6.0   | 1500  | 74.4498         | 0.7468   | 0.3680     | 2.1854 | 0.7468   | 0.7555   | 0.0829 | 0.0826 |
| 73.9702       | 7.0   | 1750  | 74.2701         | 0.7730   | 0.3350     | 2.1685 | 0.7730   | 0.7724   | 0.0855 | 0.0683 |
| 73.5091       | 8.0   | 2000  | 74.2610         | 0.7675   | 0.3548     | 2.1544 | 0.7675   | 0.7704   | 0.1155 | 0.0709 |
| 73.5091       | 9.0   | 2250  | 74.2621         | 0.7720   | 0.3501     | 2.2087 | 0.7720   | 0.7703   | 0.1242 | 0.0638 |
| 73.2311       | 10.0  | 2500  | 74.2978         | 0.7592   | 0.3768     | 2.1953 | 0.7592   | 0.7592   | 0.1462 | 0.0738 |
| 73.2311       | 11.0  | 2750  | 74.3242         | 0.7645   | 0.3803     | 2.1374 | 0.7645   | 0.7603   | 0.1528 | 0.0747 |
| 73.0554       | 12.0  | 3000  | 74.2177         | 0.7847   | 0.3545     | 2.1892 | 0.7847   | 0.7862   | 0.1411 | 0.0650 |
| 73.0554       | 13.0  | 3250  | 74.2360         | 0.7790   | 0.3598     | 2.1518 | 0.7790   | 0.7781   | 0.1513 | 0.0629 |
| 72.9294       | 14.0  | 3500  | 74.2339         | 0.7772   | 0.3684     | 2.1404 | 0.7773   | 0.7799   | 0.1583 | 0.0644 |
| 72.9294       | 15.0  | 3750  | 74.1185         | 0.7953   | 0.3416     | 2.1394 | 0.7953   | 0.7966   | 0.1436 | 0.0562 |
| 72.8246       | 16.0  | 4000  | 74.1754         | 0.7915   | 0.3498     | 2.1599 | 0.7915   | 0.7929   | 0.1525 | 0.0606 |
| 72.8246       | 17.0  | 4250  | 74.2033         | 0.7885   | 0.3559     | 2.2161 | 0.7885   | 0.7898   | 0.1558 | 0.0597 |
| 72.7339       | 18.0  | 4500  | 74.2018         | 0.7873   | 0.3640     | 2.1417 | 0.7873   | 0.7881   | 0.1590 | 0.0613 |
| 72.7339       | 19.0  | 4750  | 74.1204         | 0.7913   | 0.3517     | 2.1363 | 0.7913   | 0.7927   | 0.1553 | 0.0601 |
| 72.6572       | 20.0  | 5000  | 74.0625         | 0.7975   | 0.3431     | 2.1165 | 0.7975   | 0.7989   | 0.1530 | 0.0587 |
| 72.6572       | 21.0  | 5250  | 74.2249         | 0.7893   | 0.3609     | 2.1703 | 0.7893   | 0.7909   | 0.1663 | 0.0620 |
| 72.5815       | 22.0  | 5500  | 74.1181         | 0.8025   | 0.3400     | 2.1457 | 0.8025   | 0.8024   | 0.1531 | 0.0543 |
| 72.5815       | 23.0  | 5750  | 74.0536         | 0.8113   | 0.3293     | 2.1567 | 0.8113   | 0.8121   | 0.1489 | 0.0511 |
| 72.5166       | 24.0  | 6000  | 74.0110         | 0.8073   | 0.3345     | 2.1831 | 0.8073   | 0.8072   | 0.1487 | 0.0524 |
| 72.5166       | 25.0  | 6250  | 74.1061         | 0.8005   | 0.3424     | 2.1431 | 0.8005   | 0.8013   | 0.1573 | 0.0576 |
| 72.4615       | 26.0  | 6500  | 74.0349         | 0.8013   | 0.3399     | 2.1286 | 0.8013   | 0.7997   | 0.1565 | 0.0548 |
| 72.4615       | 27.0  | 6750  | 74.0363         | 0.8050   | 0.3416     | 2.1198 | 0.8050   | 0.8057   | 0.1551 | 0.0573 |
| 72.4072       | 28.0  | 7000  | 74.0054         | 0.8107   | 0.3322     | 2.1186 | 0.8108   | 0.8104   | 0.1495 | 0.0528 |
| 72.4072       | 29.0  | 7250  | 74.0448         | 0.8043   | 0.3429     | 2.0845 | 0.8043   | 0.8058   | 0.1560 | 0.0563 |
| 72.3615       | 30.0  | 7500  | 73.9915         | 0.8050   | 0.3376     | 2.1142 | 0.8050   | 0.8059   | 0.1571 | 0.0527 |
| 72.3615       | 31.0  | 7750  | 73.9340         | 0.8100   | 0.3284     | 2.0976 | 0.8100   | 0.8101   | 0.1516 | 0.0500 |
| 72.3206       | 32.0  | 8000  | 73.9701         | 0.8140   | 0.3264     | 2.1364 | 0.8140   | 0.8139   | 0.1488 | 0.0534 |
| 72.3206       | 33.0  | 8250  | 73.8978         | 0.8115   | 0.3287     | 2.1375 | 0.8115   | 0.8110   | 0.1517 | 0.0487 |
| 72.2890       | 34.0  | 8500  | 73.8993         | 0.8175   | 0.3185     | 2.0686 | 0.8175   | 0.8196   | 0.1443 | 0.0505 |
| 72.2890       | 35.0  | 8750  | 73.8655         | 0.8140   | 0.3231     | 2.0881 | 0.8140   | 0.8149   | 0.1504 | 0.0488 |
| 72.2572       | 36.0  | 9000  | 73.8631         | 0.8153   | 0.3190     | 2.0729 | 0.8153   | 0.8158   | 0.1479 | 0.0489 |
| 72.2572       | 37.0  | 9250  | 73.8671         | 0.8163   | 0.3200     | 2.1224 | 0.8163   | 0.8154   | 0.1504 | 0.0486 |
| 72.2292       | 38.0  | 9500  | 73.8828         | 0.8155   | 0.3259     | 2.0859 | 0.8155   | 0.8151   | 0.1502 | 0.0476 |
| 72.2292       | 39.0  | 9750  | 73.8538         | 0.8115   | 0.3296     | 2.0611 | 0.8115   | 0.8119   | 0.1541 | 0.0493 |
| 72.2054       | 40.0  | 10000 | 73.8624         | 0.8115   | 0.3260     | 2.0991 | 0.8115   | 0.8113   | 0.1547 | 0.0481 |
| 72.2054       | 41.0  | 10250 | 73.8335         | 0.8190   | 0.3199     | 2.0802 | 0.8190   | 0.8189   | 0.1468 | 0.0479 |
| 72.1861       | 42.0  | 10500 | 73.8582         | 0.8123   | 0.3314     | 2.0555 | 0.8123   | 0.8130   | 0.1548 | 0.0490 |
| 72.1861       | 43.0  | 10750 | 73.8290         | 0.8153   | 0.3235     | 2.0956 | 0.8153   | 0.8158   | 0.1514 | 0.0480 |
| 72.1705       | 44.0  | 11000 | 73.8210         | 0.8107   | 0.3291     | 2.0636 | 0.8108   | 0.8112   | 0.1570 | 0.0489 |
| 72.1705       | 45.0  | 11250 | 73.8179         | 0.8143   | 0.3260     | 2.0835 | 0.8143   | 0.8148   | 0.1534 | 0.0474 |
| 72.1588       | 46.0  | 11500 | 73.8054         | 0.8117   | 0.3239     | 2.0814 | 0.8117   | 0.8122   | 0.1553 | 0.0479 |
| 72.1588       | 47.0  | 11750 | 73.8085         | 0.8137   | 0.3251     | 2.0705 | 0.8137   | 0.8138   | 0.1536 | 0.0485 |
| 72.1506       | 48.0  | 12000 | 73.8144         | 0.8140   | 0.3254     | 2.0702 | 0.8140   | 0.8142   | 0.1534 | 0.0483 |
| 72.1506       | 49.0  | 12250 | 73.8181         | 0.8137   | 0.3252     | 2.0666 | 0.8137   | 0.8141   | 0.1539 | 0.0483 |
| 72.1460       | 50.0  | 12500 | 73.8137         | 0.8137   | 0.3252     | 2.0673 | 0.8137   | 0.8140   | 0.1539 | 0.0483 |
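The Brier Loss and ECE columns are calibration metrics rather than stock `Trainer` outputs. Below is a minimal NumPy sketch of the usual definitions; this is my formulation and may differ in binning details from the script that produced this card:

```python
import numpy as np

def brier_score(probs: np.ndarray, labels: np.ndarray) -> float:
    """Multi-class Brier score: mean squared error between the predicted
    probability vector (N, C) and the one-hot encoding of the true label."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray,
                               n_bins: int = 15) -> float:
    """ECE: bin predictions by confidence, then average the gap between
    per-bin accuracy and per-bin mean confidence, weighted by bin size."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return float(ece)
```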
### Framework versions
- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2