# 18-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset. It achieves the following results on the evaluation set:
- Loss: 0.6385
- Accuracy: 0.795
- Brier Loss: 0.4484
- NLL (negative log-likelihood): 0.9250
- F1 Micro: 0.795
- F1 Macro: 0.7709
- ECE (expected calibration error): 0.4225
- AURC (area under the risk-coverage curve): 0.0567
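The calibration-oriented metrics above (Brier loss, ECE) are less common than accuracy. A minimal sketch of how they are typically computed is shown below; the bin count for ECE is an assumption, since the card does not state what the evaluation code used.

```python
def brier_loss(probs, labels):
    """Multiclass Brier loss: mean squared error between the predicted
    probability vector and the one-hot encoding of the true label."""
    total = 0.0
    for p, y in zip(probs, labels):
        total += sum((pk - (1.0 if k == y else 0.0)) ** 2 for k, pk in enumerate(p))
    return total / len(probs)


def expected_calibration_error(confidences, predictions, labels, n_bins=10):
    """ECE: bin samples by predicted confidence, then take the weighted
    average of |bin accuracy - bin mean confidence|."""
    totals = [0] * n_bins
    correct = [0.0] * n_bins
    conf_sum = [0.0] * n_bins
    for c, p, y in zip(confidences, predictions, labels):
        b = min(int(c * n_bins), n_bins - 1)  # confidence 1.0 falls in the last bin
        totals[b] += 1
        correct[b] += float(p == y)
        conf_sum[b] += c
    n = len(confidences)
    return sum((t / n) * abs(correct[i] / t - conf_sum[i] / t)
               for i, t in enumerate(totals) if t)
```

An ECE of 0.4225 alongside 0.795 accuracy indicates the model's confidences are poorly calibrated despite reasonable predictive performance.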
## Model description
More information needed
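The model name encodes a knowledge-distillation setup: `CEKD` (cross-entropy plus distillation), temperature `t2.5`, and mixing weight `a0.5`. A common form of that combined objective is sketched below; the exact loss used in this training run is not documented in the card, so treat this as an assumption based on the name.

```python
import math


def softmax(logits, t=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / t) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]


def ce_kd_loss(student_logits, teacher_logits, label, t=2.5, alpha=0.5):
    """alpha * cross-entropy(student, label)
    + (1 - alpha) * t^2 * KL(teacher_t || student_t),
    where the t^2 factor keeps gradient magnitudes comparable across temperatures."""
    p_student = softmax(student_logits)
    ce = -math.log(p_student[label])
    ps_t = softmax(student_logits, t)
    pt_t = softmax(teacher_logits, t)
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(pt_t, ps_t))
    return alpha * ce + (1 - alpha) * t * t * kl
```

When the student already matches the teacher, the KL term vanishes and only the weighted cross-entropy remains.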
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
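With 7 steps per epoch and 100 epochs (700 total steps, per the results table below), `lr_scheduler_type: linear` with `warmup_ratio: 0.1` implies roughly the schedule sketched here. This is the standard linear warmup/decay rule, not code from the training run:

```python
def linear_warmup_linear_decay(step, total_steps=700, warmup_ratio=0.1, base_lr=1e-4):
    """Learning rate at a given step: linear warmup from 0 to base_lr over
    the first warmup_ratio fraction of training (70 steps here), then linear
    decay back to 0 at the final step."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

The learning rate therefore peaks at 1e-4 around epoch 10 and shrinks steadily afterwards, which is consistent with the validation metrics plateauing in the later epochs of the table.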
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:---:|:--------:|:--------:|:---:|:----:|
No log | 1.0 | 7 | 1.8736 | 0.105 | 1.0144 | 8.6059 | 0.1050 | 0.0844 | 0.3169 | 0.8967 |
No log | 2.0 | 14 | 1.2559 | 0.155 | 0.8899 | 7.1587 | 0.155 | 0.1259 | 0.2459 | 0.7824 |
No log | 3.0 | 21 | 1.0441 | 0.33 | 0.8123 | 5.3633 | 0.33 | 0.2575 | 0.2995 | 0.5173 |
No log | 4.0 | 28 | 0.9169 | 0.525 | 0.6852 | 3.4671 | 0.525 | 0.4253 | 0.3387 | 0.2892 |
No log | 5.0 | 35 | 0.8589 | 0.615 | 0.6269 | 3.1119 | 0.615 | 0.5500 | 0.3683 | 0.2124 |
No log | 6.0 | 42 | 0.7954 | 0.675 | 0.5756 | 2.2578 | 0.675 | 0.5752 | 0.3626 | 0.1550 |
No log | 7.0 | 49 | 0.7664 | 0.685 | 0.5143 | 1.8811 | 0.685 | 0.6073 | 0.3254 | 0.1390 |
No log | 8.0 | 56 | 0.7305 | 0.76 | 0.4895 | 1.5449 | 0.76 | 0.6768 | 0.3695 | 0.1016 |
No log | 9.0 | 63 | 0.7056 | 0.765 | 0.4721 | 1.3575 | 0.765 | 0.6991 | 0.3828 | 0.0927 |
No log | 10.0 | 70 | 0.6961 | 0.77 | 0.4380 | 1.2662 | 0.7700 | 0.7509 | 0.3549 | 0.0803 |
No log | 11.0 | 77 | 0.6772 | 0.81 | 0.4508 | 1.3169 | 0.81 | 0.7915 | 0.4175 | 0.0629 |
No log | 12.0 | 84 | 0.6766 | 0.785 | 0.4491 | 1.2979 | 0.785 | 0.7650 | 0.3839 | 0.0800 |
No log | 13.0 | 91 | 0.6754 | 0.785 | 0.4382 | 1.2395 | 0.785 | 0.7794 | 0.3609 | 0.0689 |
No log | 14.0 | 98 | 0.6768 | 0.8 | 0.4472 | 1.2218 | 0.8000 | 0.7837 | 0.3910 | 0.0640 |
No log | 15.0 | 105 | 0.6793 | 0.81 | 0.4663 | 1.2698 | 0.81 | 0.7856 | 0.4293 | 0.0672 |
No log | 16.0 | 112 | 0.6784 | 0.795 | 0.4726 | 1.3043 | 0.795 | 0.7728 | 0.4232 | 0.0669 |
No log | 17.0 | 119 | 0.6638 | 0.805 | 0.4372 | 1.2746 | 0.805 | 0.7747 | 0.3956 | 0.0677 |
No log | 18.0 | 126 | 0.6588 | 0.8 | 0.4297 | 1.4466 | 0.8000 | 0.7762 | 0.3866 | 0.0686 |
No log | 19.0 | 133 | 0.6588 | 0.81 | 0.4588 | 1.2093 | 0.81 | 0.7912 | 0.4029 | 0.0702 |
No log | 20.0 | 140 | 0.6587 | 0.81 | 0.4534 | 1.0697 | 0.81 | 0.7980 | 0.4197 | 0.0641 |
No log | 21.0 | 147 | 0.6527 | 0.815 | 0.4529 | 1.1527 | 0.815 | 0.7942 | 0.4196 | 0.0598 |
No log | 22.0 | 154 | 0.6608 | 0.78 | 0.4559 | 1.2039 | 0.78 | 0.7581 | 0.3612 | 0.0725 |
No log | 23.0 | 161 | 0.6558 | 0.8 | 0.4547 | 1.0687 | 0.8000 | 0.7644 | 0.3964 | 0.0584 |
No log | 24.0 | 168 | 0.6584 | 0.8 | 0.4491 | 1.2869 | 0.8000 | 0.7735 | 0.3810 | 0.0687 |
No log | 25.0 | 175 | 0.6493 | 0.805 | 0.4497 | 0.9981 | 0.805 | 0.7887 | 0.4162 | 0.0570 |
No log | 26.0 | 182 | 0.6425 | 0.795 | 0.4424 | 1.1317 | 0.795 | 0.7790 | 0.3974 | 0.0596 |
No log | 27.0 | 189 | 0.6518 | 0.8 | 0.4552 | 0.9743 | 0.8000 | 0.7715 | 0.4122 | 0.0592 |
No log | 28.0 | 196 | 0.6526 | 0.805 | 0.4630 | 1.1343 | 0.805 | 0.7941 | 0.4171 | 0.0672 |
No log | 29.0 | 203 | 0.6515 | 0.8 | 0.4531 | 1.0062 | 0.8000 | 0.7681 | 0.3970 | 0.0566 |
No log | 30.0 | 210 | 0.6459 | 0.795 | 0.4534 | 1.0893 | 0.795 | 0.7853 | 0.3972 | 0.0600 |
No log | 31.0 | 217 | 0.6423 | 0.81 | 0.4483 | 0.9035 | 0.81 | 0.7927 | 0.4297 | 0.0536 |
No log | 32.0 | 224 | 0.6454 | 0.8 | 0.4517 | 1.1025 | 0.8000 | 0.7688 | 0.3923 | 0.0599 |
No log | 33.0 | 231 | 0.6417 | 0.805 | 0.4476 | 0.9658 | 0.805 | 0.7767 | 0.4136 | 0.0563 |
No log | 34.0 | 238 | 0.6399 | 0.815 | 0.4462 | 0.8565 | 0.815 | 0.7940 | 0.4234 | 0.0550 |
No log | 35.0 | 245 | 0.6430 | 0.81 | 0.4505 | 1.0491 | 0.81 | 0.7855 | 0.4279 | 0.0629 |
No log | 36.0 | 252 | 0.6440 | 0.815 | 0.4481 | 1.0288 | 0.815 | 0.7813 | 0.4132 | 0.0539 |
No log | 37.0 | 259 | 0.6396 | 0.82 | 0.4493 | 0.9477 | 0.82 | 0.8125 | 0.4266 | 0.0525 |
No log | 38.0 | 266 | 0.6410 | 0.815 | 0.4462 | 1.0462 | 0.815 | 0.7971 | 0.4157 | 0.0522 |
No log | 39.0 | 273 | 0.6360 | 0.8 | 0.4399 | 0.9645 | 0.8000 | 0.7779 | 0.3974 | 0.0566 |
No log | 40.0 | 280 | 0.6376 | 0.805 | 0.4412 | 0.8777 | 0.805 | 0.7772 | 0.4104 | 0.0544 |
No log | 41.0 | 287 | 0.6411 | 0.795 | 0.4475 | 0.9240 | 0.795 | 0.7780 | 0.4062 | 0.0583 |
No log | 42.0 | 294 | 0.6398 | 0.795 | 0.4509 | 0.9279 | 0.795 | 0.7650 | 0.4068 | 0.0577 |
No log | 43.0 | 301 | 0.6430 | 0.79 | 0.4567 | 0.9279 | 0.79 | 0.7683 | 0.4073 | 0.0590 |
No log | 44.0 | 308 | 0.6401 | 0.8 | 0.4495 | 0.9915 | 0.8000 | 0.7744 | 0.4200 | 0.0565 |
No log | 45.0 | 315 | 0.6364 | 0.795 | 0.4448 | 0.9245 | 0.795 | 0.7729 | 0.4115 | 0.0568 |
No log | 46.0 | 322 | 0.6391 | 0.79 | 0.4472 | 1.0060 | 0.79 | 0.7633 | 0.4044 | 0.0561 |
No log | 47.0 | 329 | 0.6376 | 0.795 | 0.4470 | 0.9530 | 0.795 | 0.7693 | 0.3989 | 0.0578 |
No log | 48.0 | 336 | 0.6383 | 0.8 | 0.4476 | 0.9992 | 0.8000 | 0.7804 | 0.4084 | 0.0579 |
No log | 49.0 | 343 | 0.6353 | 0.8 | 0.4424 | 0.8500 | 0.8000 | 0.7756 | 0.4055 | 0.0546 |
No log | 50.0 | 350 | 0.6381 | 0.795 | 0.4470 | 0.9931 | 0.795 | 0.7691 | 0.4170 | 0.0573 |
No log | 51.0 | 357 | 0.6374 | 0.795 | 0.4477 | 0.9729 | 0.795 | 0.7630 | 0.4076 | 0.0563 |
No log | 52.0 | 364 | 0.6377 | 0.8 | 0.4481 | 0.9846 | 0.8000 | 0.7759 | 0.4212 | 0.0555 |
No log | 53.0 | 371 | 0.6378 | 0.795 | 0.4485 | 0.9379 | 0.795 | 0.7733 | 0.4052 | 0.0565 |
No log | 54.0 | 378 | 0.6385 | 0.79 | 0.4477 | 0.9900 | 0.79 | 0.7684 | 0.4165 | 0.0571 |
No log | 55.0 | 385 | 0.6371 | 0.81 | 0.4466 | 0.9178 | 0.81 | 0.7867 | 0.4149 | 0.0546 |
No log | 56.0 | 392 | 0.6373 | 0.795 | 0.4460 | 0.9254 | 0.795 | 0.7692 | 0.4081 | 0.0568 |
No log | 57.0 | 399 | 0.6376 | 0.79 | 0.4476 | 0.9194 | 0.79 | 0.7596 | 0.3996 | 0.0568 |
No log | 58.0 | 406 | 0.6380 | 0.79 | 0.4477 | 0.9259 | 0.79 | 0.7619 | 0.4024 | 0.0575 |
No log | 59.0 | 413 | 0.6377 | 0.8 | 0.4474 | 0.9100 | 0.8000 | 0.7806 | 0.4096 | 0.0569 |
No log | 60.0 | 420 | 0.6378 | 0.8 | 0.4481 | 0.9189 | 0.8000 | 0.7806 | 0.4076 | 0.0566 |
No log | 61.0 | 427 | 0.6378 | 0.795 | 0.4478 | 0.9860 | 0.795 | 0.7709 | 0.3994 | 0.0566 |
No log | 62.0 | 434 | 0.6380 | 0.795 | 0.4480 | 0.9189 | 0.795 | 0.7692 | 0.4070 | 0.0564 |
No log | 63.0 | 441 | 0.6381 | 0.8 | 0.4482 | 0.9195 | 0.8000 | 0.7806 | 0.4047 | 0.0568 |
No log | 64.0 | 448 | 0.6379 | 0.8 | 0.4480 | 0.9223 | 0.8000 | 0.7806 | 0.4224 | 0.0563 |
No log | 65.0 | 455 | 0.6382 | 0.8 | 0.4481 | 0.9196 | 0.8000 | 0.7806 | 0.4113 | 0.0569 |
No log | 66.0 | 462 | 0.6381 | 0.8 | 0.4484 | 0.9200 | 0.8000 | 0.7806 | 0.4308 | 0.0566 |
No log | 67.0 | 469 | 0.6379 | 0.8 | 0.4479 | 0.9198 | 0.8000 | 0.7806 | 0.4186 | 0.0566 |
No log | 68.0 | 476 | 0.6378 | 0.8 | 0.4476 | 0.9167 | 0.8000 | 0.7806 | 0.4166 | 0.0569 |
No log | 69.0 | 483 | 0.6380 | 0.8 | 0.4481 | 0.9179 | 0.8000 | 0.7806 | 0.4254 | 0.0566 |
No log | 70.0 | 490 | 0.6384 | 0.795 | 0.4486 | 0.9225 | 0.795 | 0.7709 | 0.4158 | 0.0566 |
No log | 71.0 | 497 | 0.6380 | 0.795 | 0.4476 | 0.9211 | 0.795 | 0.7709 | 0.4215 | 0.0568 |
0.5133 | 72.0 | 504 | 0.6381 | 0.795 | 0.4480 | 0.9232 | 0.795 | 0.7709 | 0.4151 | 0.0566 |
0.5133 | 73.0 | 511 | 0.6380 | 0.795 | 0.4479 | 0.9242 | 0.795 | 0.7709 | 0.4218 | 0.0564 |
0.5133 | 74.0 | 518 | 0.6380 | 0.795 | 0.4478 | 0.9231 | 0.795 | 0.7709 | 0.4151 | 0.0566 |
0.5133 | 75.0 | 525 | 0.6382 | 0.795 | 0.4484 | 0.9245 | 0.795 | 0.7709 | 0.4156 | 0.0565 |
0.5133 | 76.0 | 532 | 0.6382 | 0.795 | 0.4481 | 0.9216 | 0.795 | 0.7709 | 0.4153 | 0.0567 |
0.5133 | 77.0 | 539 | 0.6382 | 0.795 | 0.4481 | 0.9231 | 0.795 | 0.7709 | 0.4222 | 0.0567 |
0.5133 | 78.0 | 546 | 0.6382 | 0.795 | 0.4481 | 0.9210 | 0.795 | 0.7709 | 0.4220 | 0.0565 |
0.5133 | 79.0 | 553 | 0.6382 | 0.795 | 0.4480 | 0.9220 | 0.795 | 0.7709 | 0.4220 | 0.0565 |
0.5133 | 80.0 | 560 | 0.6384 | 0.795 | 0.4484 | 0.9220 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
0.5133 | 81.0 | 567 | 0.6383 | 0.795 | 0.4483 | 0.9218 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
0.5133 | 82.0 | 574 | 0.6382 | 0.795 | 0.4480 | 0.9220 | 0.795 | 0.7709 | 0.4221 | 0.0568 |
0.5133 | 83.0 | 581 | 0.6384 | 0.795 | 0.4484 | 0.9240 | 0.795 | 0.7709 | 0.4157 | 0.0566 |
0.5133 | 84.0 | 588 | 0.6384 | 0.795 | 0.4484 | 0.9262 | 0.795 | 0.7709 | 0.4224 | 0.0566 |
0.5133 | 85.0 | 595 | 0.6382 | 0.795 | 0.4481 | 0.9235 | 0.795 | 0.7709 | 0.4221 | 0.0566 |
0.5133 | 86.0 | 602 | 0.6384 | 0.795 | 0.4484 | 0.9236 | 0.795 | 0.7709 | 0.4225 | 0.0566 |
0.5133 | 87.0 | 609 | 0.6384 | 0.795 | 0.4484 | 0.9235 | 0.795 | 0.7709 | 0.4225 | 0.0567 |
0.5133 | 88.0 | 616 | 0.6384 | 0.795 | 0.4483 | 0.9250 | 0.795 | 0.7709 | 0.4224 | 0.0566 |
0.5133 | 89.0 | 623 | 0.6384 | 0.795 | 0.4483 | 0.9244 | 0.795 | 0.7709 | 0.4223 | 0.0567 |
0.5133 | 90.0 | 630 | 0.6384 | 0.795 | 0.4483 | 0.9251 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
0.5133 | 91.0 | 637 | 0.6384 | 0.795 | 0.4484 | 0.9246 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
0.5133 | 92.0 | 644 | 0.6384 | 0.795 | 0.4484 | 0.9256 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
0.5133 | 93.0 | 651 | 0.6385 | 0.795 | 0.4484 | 0.9252 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
0.5133 | 94.0 | 658 | 0.6384 | 0.795 | 0.4484 | 0.9245 | 0.795 | 0.7709 | 0.4223 | 0.0565 |
0.5133 | 95.0 | 665 | 0.6385 | 0.795 | 0.4484 | 0.9254 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
0.5133 | 96.0 | 672 | 0.6384 | 0.795 | 0.4484 | 0.9242 | 0.795 | 0.7709 | 0.4225 | 0.0566 |
0.5133 | 97.0 | 679 | 0.6384 | 0.795 | 0.4484 | 0.9242 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
0.5133 | 98.0 | 686 | 0.6385 | 0.795 | 0.4484 | 0.9249 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
0.5133 | 99.0 | 693 | 0.6385 | 0.795 | 0.4484 | 0.9252 | 0.795 | 0.7709 | 0.4224 | 0.0566 |
0.5133 | 100.0 | 700 | 0.6385 | 0.795 | 0.4484 | 0.9250 | 0.795 | 0.7709 | 0.4225 | 0.0567 |
### Framework versions
- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2