---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# cdip-small_rvl_cdip-NK1000_kd_test

This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. Its results on the evaluation set are reported per epoch in the Training results table below.
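Since this is a ViT image-classification checkpoint, it can presumably be loaded with the standard `transformers` image-classification API. The snippet below is a minimal sketch; the model id and image path are placeholder assumptions, not values from this card.

```python
# Minimal inference sketch; the model id and image path below are placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "path/to/cdip-small_rvl_cdip-NK1000_kd_test"  # assumed local path or repo id

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("document_page.png").convert("RGB")  # any input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

print(model.config.id2label[logits.argmax(-1).item()])
```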

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The specific hyperparameter values used during training are not listed in this card; the sketch below only illustrates how such a configuration is typically expressed.
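As an illustrative sketch only: hyperparameters of this kind are usually supplied to the Hugging Face `Trainer` via `TrainingArguments`. Every value below is an assumed placeholder rather than the configuration of this run, except the epoch count, which matches the 50 epochs in the results table.

```python
# Illustrative only: every value is an assumed placeholder, not the actual
# configuration of this run (except num_train_epochs, which matches the table).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="cdip-small_rvl_cdip-NK1000_kd_test",  # placeholder output directory
    learning_rate=5e-5,               # placeholder
    per_device_train_batch_size=32,   # placeholder
    per_device_eval_batch_size=32,    # placeholder
    num_train_epochs=50,              # consistent with the 50 epochs reported below
    eval_strategy="epoch",            # the table reports one evaluation per epoch
                                      # (use evaluation_strategy on older transformers)
    save_strategy="epoch",            # placeholder
    seed=42,                          # placeholder
)
```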

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Brier Loss | Nll    | F1 Micro | F1 Macro | Ece    | Aurc   |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| 1.7536        | 1.0   | 667   | 0.9652          | 0.6695   | 0.4474     | 2.2965 | 0.6695   | 0.6595   | 0.0494 | 0.1257 |
| 0.8802        | 2.0   | 1334  | 0.7683          | 0.7195   | 0.3806     | 2.0303 | 0.7195   | 0.7116   | 0.0473 | 0.0920 |
| 0.5767        | 3.0   | 2001  | 0.6276          | 0.7698   | 0.3253     | 1.9446 | 0.7698   | 0.7711   | 0.0436 | 0.0684 |
| 0.4263        | 4.0   | 2668  | 0.6095          | 0.7785   | 0.3110     | 1.9810 | 0.7785   | 0.7810   | 0.0474 | 0.0624 |
| 0.3987        | 5.0   | 3335  | 0.5608          | 0.791    | 0.2939     | 1.8539 | 0.791    | 0.7918   | 0.0504 | 0.0557 |
| 0.3179        | 6.0   | 4002  | 0.6057          | 0.7935   | 0.3027     | 1.8778 | 0.7935   | 0.7940   | 0.0811 | 0.0548 |
| 0.2428        | 7.0   | 4669  | 0.5828          | 0.8043   | 0.2905     | 1.8616 | 0.8043   | 0.8050   | 0.0662 | 0.0520 |
| 0.2094        | 8.0   | 5336  | 0.5812          | 0.7957   | 0.2973     | 1.8459 | 0.7957   | 0.8019   | 0.0783 | 0.0532 |
| 0.1715        | 9.0   | 6003  | 0.6152          | 0.7987   | 0.2993     | 1.9533 | 0.7987   | 0.7998   | 0.0723 | 0.0539 |
| 0.1508        | 10.0  | 6670  | 0.5442          | 0.808    | 0.2820     | 1.8159 | 0.808    | 0.8097   | 0.0836 | 0.0476 |
| 0.1434        | 11.0  | 7337  | 0.4881          | 0.828    | 0.2549     | 1.6938 | 0.828    | 0.8286   | 0.0610 | 0.0410 |
| 0.1267        | 12.0  | 8004  | 0.4720          | 0.8365   | 0.2465     | 1.6878 | 0.8365   | 0.8360   | 0.0576 | 0.0400 |
| 0.115         | 13.0  | 8671  | 0.4648          | 0.8335   | 0.2482     | 1.6871 | 0.8335   | 0.8353   | 0.0630 | 0.0387 |
| 0.1112        | 14.0  | 9338  | 0.4777          | 0.8317   | 0.2509     | 1.6393 | 0.8317   | 0.8312   | 0.0614 | 0.0418 |
| 0.1002        | 15.0  | 10005 | 0.4684          | 0.8333   | 0.2484     | 1.6054 | 0.8333   | 0.8335   | 0.0657 | 0.0392 |
| 0.0944        | 16.0  | 10672 | 0.4693          | 0.8365   | 0.2480     | 1.6381 | 0.8365   | 0.8366   | 0.0658 | 0.0383 |
| 0.0934        | 17.0  | 11339 | 0.4534          | 0.8323   | 0.2465     | 1.6420 | 0.8323   | 0.8343   | 0.0561 | 0.0373 |
| 0.0835        | 18.0  | 12006 | 0.4512          | 0.8357   | 0.2435     | 1.6301 | 0.8357   | 0.8367   | 0.0575 | 0.0372 |
| 0.08          | 19.0  | 12673 | 0.4345          | 0.838    | 0.2394     | 1.6382 | 0.838    | 0.8398   | 0.0562 | 0.0366 |
| 0.0819        | 20.0  | 13340 | 0.4356          | 0.838    | 0.2374     | 1.5973 | 0.838    | 0.8384   | 0.0588 | 0.0364 |
| 0.0709        | 21.0  | 14007 | 0.4484          | 0.8415   | 0.2368     | 1.6231 | 0.8415   | 0.8411   | 0.0595 | 0.0368 |
| 0.0691        | 22.0  | 14674 | 0.4194          | 0.8495   | 0.2287     | 1.5968 | 0.8495   | 0.8505   | 0.0531 | 0.0335 |
| 0.068         | 23.0  | 15341 | 0.4308          | 0.8413   | 0.2346     | 1.5599 | 0.8413   | 0.8410   | 0.0542 | 0.0360 |
| 0.0641        | 24.0  | 16008 | 0.4209          | 0.8405   | 0.2336     | 1.5539 | 0.8405   | 0.8422   | 0.0590 | 0.0339 |
| 0.0617        | 25.0  | 16675 | 0.4181          | 0.841    | 0.2352     | 1.5735 | 0.841    | 0.8435   | 0.0568 | 0.0356 |
| 0.0633        | 26.0  | 17342 | 0.4193          | 0.8508   | 0.2286     | 1.5299 | 0.8508   | 0.8510   | 0.0650 | 0.0348 |
| 0.0569        | 27.0  | 18009 | 0.4065          | 0.8468   | 0.2278     | 1.5267 | 0.8468   | 0.8479   | 0.0546 | 0.0332 |
| 0.0571        | 28.0  | 18676 | 0.4109          | 0.8498   | 0.2255     | 1.5147 | 0.8498   | 0.8499   | 0.0590 | 0.0331 |
| 0.0543        | 29.0  | 19343 | 0.4026          | 0.8482   | 0.2250     | 1.5187 | 0.8482   | 0.8498   | 0.0623 | 0.0327 |
| 0.0543        | 30.0  | 20010 | 0.4124          | 0.847    | 0.2293     | 1.5125 | 0.847    | 0.8473   | 0.0605 | 0.0330 |
| 0.0536        | 31.0  | 20677 | 0.4022          | 0.851    | 0.2238     | 1.5100 | 0.851    | 0.8527   | 0.0594 | 0.0323 |
| 0.0522        | 32.0  | 21344 | 0.4120          | 0.8475   | 0.2290     | 1.5044 | 0.8475   | 0.8483   | 0.0633 | 0.0327 |
| 0.0493        | 33.0  | 22011 | 0.3990          | 0.8492   | 0.2258     | 1.5197 | 0.8492   | 0.8503   | 0.0589 | 0.0318 |
| 0.0512        | 34.0  | 22678 | 0.3983          | 0.85     | 0.2251     | 1.4644 | 0.85     | 0.8503   | 0.0597 | 0.0319 |
| 0.0517        | 35.0  | 23345 | 0.3969          | 0.8465   | 0.2257     | 1.4814 | 0.8465   | 0.8479   | 0.0630 | 0.0309 |
| 0.0477        | 36.0  | 24012 | 0.3939          | 0.8528   | 0.2237     | 1.4797 | 0.8528   | 0.8531   | 0.0604 | 0.0316 |
| 0.0482        | 37.0  | 24679 | 0.3934          | 0.852    | 0.2218     | 1.4595 | 0.852    | 0.8527   | 0.0613 | 0.0316 |
| 0.0481        | 38.0  | 25346 | 0.3930          | 0.8532   | 0.2217     | 1.4561 | 0.8532   | 0.8544   | 0.0593 | 0.0306 |
| 0.0477        | 39.0  | 26013 | 0.3875          | 0.8512   | 0.2202     | 1.4610 | 0.8512   | 0.8523   | 0.0609 | 0.0310 |
| 0.048         | 40.0  | 26680 | 0.3900          | 0.8538   | 0.2202     | 1.4541 | 0.8537   | 0.8546   | 0.0629 | 0.0307 |
| 0.0448        | 41.0  | 27347 | 0.3901          | 0.8525   | 0.2221     | 1.4519 | 0.8525   | 0.8532   | 0.0621 | 0.0308 |
| 0.0454        | 42.0  | 28014 | 0.3858          | 0.851    | 0.2186     | 1.4554 | 0.851    | 0.8519   | 0.0633 | 0.0298 |
| 0.0464        | 43.0  | 28681 | 0.3861          | 0.8528   | 0.2197     | 1.4516 | 0.8528   | 0.8535   | 0.0618 | 0.0307 |
| 0.0444        | 44.0  | 29348 | 0.3824          | 0.8548   | 0.2176     | 1.4288 | 0.8547   | 0.8557   | 0.0607 | 0.0299 |
| 0.0461        | 45.0  | 30015 | 0.3833          | 0.8555   | 0.2181     | 1.4330 | 0.8555   | 0.8566   | 0.0606 | 0.0302 |
| 0.0442        | 46.0  | 30682 | 0.3830          | 0.8552   | 0.2174     | 1.4358 | 0.8552   | 0.8560   | 0.0604 | 0.0302 |
| 0.0456        | 47.0  | 31349 | 0.3797          | 0.8552   | 0.2173     | 1.4264 | 0.8552   | 0.8560   | 0.0596 | 0.0297 |
| 0.0447        | 48.0  | 32016 | 0.3811          | 0.8558   | 0.2176     | 1.4273 | 0.8558   | 0.8566   | 0.0595 | 0.0300 |
| 0.0439        | 49.0  | 32683 | 0.3814          | 0.856    | 0.2176     | 1.4252 | 0.856    | 0.8568   | 0.0600 | 0.0300 |
| 0.0437        | 50.0  | 33350 | 0.3813          | 0.8558   | 0.2176     | 1.4251 | 0.8558   | 0.8566   | 0.0597 | 0.0299 |
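Alongside accuracy and F1, the table tracks calibration-oriented metrics (Brier loss, ECE). The NumPy sketch below shows one common way such quantities are computed from predicted class probabilities; it is an illustration under standard definitions, not the exact evaluation code behind this card.

```python
import numpy as np

def brier_loss(probs: np.ndarray, labels: np.ndarray) -> float:
    """Mean squared error between predicted probabilities and one-hot labels."""
    one_hot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - one_hot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray, n_bins: int = 15) -> float:
    """Weighted average gap between confidence and accuracy over confidence bins."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return float(ece)
```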

### Framework versions