<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->
# 60-tiny_tobacco3482
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set:
- Loss: 0.1098
- Accuracy: 0.79
- Brier Loss: 0.4604
- NLL: 0.9058
- F1 Micro: 0.79
- F1 Macro: 0.7539
- ECE: 0.4083
- AURC: 0.0644
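Brier loss, NLL, ECE, and AURC are calibration-oriented metrics rather than plain accuracy. As a rough sketch (not the exact evaluation code used for this card), the multiclass Brier score and a binned Expected Calibration Error can be computed from softmax outputs like this:

```python
def brier_score(probs, labels, n_classes):
    """Multiclass Brier score: mean squared error between the predicted
    probability vector and the one-hot encoding of the true label."""
    total = 0.0
    for p, y in zip(probs, labels):
        total += sum((p[k] - (1.0 if k == y else 0.0)) ** 2 for k in range(n_classes))
    return total / len(probs)

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: bin predictions by top-class confidence, then take the
    bin-size-weighted average gap between confidence and accuracy."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        conf = max(p)
        pred = p.index(conf)
        idx = min(int(conf * n_bins), n_bins - 1)
        bins[idx].append((conf, pred == y))
    ece, n = 0.0, len(probs)
    for b in bins:
        if b:
            avg_conf = sum(c for c, _ in b) / len(b)
            acc = sum(hit for _, hit in b) / len(b)
            ece += (len(b) / n) * abs(avg_conf - acc)
    return ece
```

An ECE around 0.41 alongside 0.79 accuracy suggests the model's confidence scores may be substantially miscalibrated, which would be consistent with the relatively high Brier loss.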
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
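With 7 optimizer steps per epoch (as shown in the results table) and 100 epochs, training runs for 700 steps, so `lr_scheduler_warmup_ratio: 0.1` corresponds to roughly 70 warmup steps. A minimal sketch of the resulting linear-warmup/linear-decay learning-rate multiplier (an illustration of `lr_scheduler_type: linear`, not the exact Transformers implementation):

```python
def linear_lr_multiplier(step, total_steps, warmup_ratio=0.1):
    """Linear warmup from 0 to 1 over the first warmup_ratio of training,
    then linear decay from 1 back to 0 at the final step."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# The peak learning rate (0.0001) is reached at the end of warmup:
peak_lr = 1e-4
lr_at_step_70 = peak_lr * linear_lr_multiplier(70, 700)  # = 1e-4
```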
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
No log | 1.0 | 7 | 1.9988 | 0.16 | 1.0265 | 8.4697 | 0.16 | 0.0543 | 0.3527 | 0.8493 |
No log | 2.0 | 14 | 0.9241 | 0.1 | 0.9095 | 7.2634 | 0.1000 | 0.0853 | 0.2311 | 0.8707 |
No log | 3.0 | 21 | 0.5325 | 0.29 | 0.8525 | 5.4845 | 0.29 | 0.1287 | 0.2809 | 0.6344 |
No log | 4.0 | 28 | 0.4050 | 0.36 | 0.7891 | 3.8464 | 0.36 | 0.2714 | 0.2948 | 0.4598 |
No log | 5.0 | 35 | 0.3282 | 0.495 | 0.7189 | 2.5837 | 0.495 | 0.3741 | 0.3317 | 0.3367 |
No log | 6.0 | 42 | 0.2722 | 0.585 | 0.6827 | 2.2625 | 0.585 | 0.4818 | 0.3751 | 0.2241 |
No log | 7.0 | 49 | 0.2426 | 0.605 | 0.6424 | 2.0945 | 0.605 | 0.4641 | 0.3516 | 0.2034 |
No log | 8.0 | 56 | 0.2223 | 0.685 | 0.6171 | 1.9679 | 0.685 | 0.5711 | 0.4148 | 0.1545 |
No log | 9.0 | 63 | 0.1939 | 0.67 | 0.5684 | 1.8320 | 0.67 | 0.5495 | 0.3460 | 0.1438 |
No log | 10.0 | 70 | 0.1950 | 0.695 | 0.5454 | 1.3666 | 0.695 | 0.5750 | 0.3667 | 0.1276 |
No log | 11.0 | 77 | 0.1838 | 0.69 | 0.5436 | 1.4378 | 0.69 | 0.5745 | 0.3846 | 0.1162 |
No log | 12.0 | 84 | 0.1787 | 0.735 | 0.5138 | 1.4550 | 0.735 | 0.6490 | 0.3893 | 0.0949 |
No log | 13.0 | 91 | 0.1682 | 0.74 | 0.5209 | 1.6406 | 0.74 | 0.6452 | 0.3904 | 0.1052 |
No log | 14.0 | 98 | 0.1734 | 0.75 | 0.5420 | 1.4280 | 0.75 | 0.6656 | 0.4440 | 0.0864 |
No log | 15.0 | 105 | 0.1404 | 0.725 | 0.4961 | 1.3478 | 0.7250 | 0.6343 | 0.3816 | 0.0895 |
No log | 16.0 | 112 | 0.1439 | 0.76 | 0.4709 | 1.4366 | 0.76 | 0.6679 | 0.3721 | 0.0754 |
No log | 17.0 | 119 | 0.1356 | 0.74 | 0.4745 | 1.3609 | 0.74 | 0.6596 | 0.3643 | 0.0868 |
No log | 18.0 | 126 | 0.1373 | 0.75 | 0.4760 | 1.4421 | 0.75 | 0.6703 | 0.3773 | 0.0783 |
No log | 19.0 | 133 | 0.1352 | 0.765 | 0.4851 | 1.1693 | 0.765 | 0.6803 | 0.3952 | 0.0693 |
No log | 20.0 | 140 | 0.1323 | 0.765 | 0.4838 | 1.0026 | 0.765 | 0.6844 | 0.3866 | 0.0741 |
No log | 21.0 | 147 | 0.1334 | 0.785 | 0.4713 | 1.1479 | 0.785 | 0.7430 | 0.4105 | 0.0737 |
No log | 22.0 | 154 | 0.1267 | 0.775 | 0.4706 | 1.1082 | 0.775 | 0.7355 | 0.3838 | 0.0727 |
No log | 23.0 | 161 | 0.1279 | 0.77 | 0.4598 | 1.0869 | 0.7700 | 0.7254 | 0.3846 | 0.0732 |
No log | 24.0 | 168 | 0.1229 | 0.805 | 0.4838 | 1.0060 | 0.805 | 0.7635 | 0.4268 | 0.0625 |
No log | 25.0 | 175 | 0.1250 | 0.79 | 0.4740 | 0.9769 | 0.79 | 0.7462 | 0.3898 | 0.0684 |
No log | 26.0 | 182 | 0.1371 | 0.795 | 0.4784 | 1.1316 | 0.795 | 0.7641 | 0.4246 | 0.0732 |
No log | 27.0 | 189 | 0.1230 | 0.77 | 0.4625 | 0.9606 | 0.7700 | 0.7185 | 0.3816 | 0.0712 |
No log | 28.0 | 196 | 0.1161 | 0.775 | 0.4661 | 0.9889 | 0.775 | 0.7375 | 0.3925 | 0.0658 |
No log | 29.0 | 203 | 0.1194 | 0.775 | 0.4688 | 1.0280 | 0.775 | 0.7320 | 0.4087 | 0.0709 |
No log | 30.0 | 210 | 0.1211 | 0.795 | 0.4680 | 1.0785 | 0.795 | 0.7677 | 0.4168 | 0.0671 |
No log | 31.0 | 217 | 0.1208 | 0.79 | 0.4629 | 0.9986 | 0.79 | 0.7536 | 0.3892 | 0.0658 |
No log | 32.0 | 224 | 0.1194 | 0.77 | 0.4588 | 0.9202 | 0.7700 | 0.7313 | 0.3791 | 0.0679 |
No log | 33.0 | 231 | 0.1167 | 0.795 | 0.4567 | 0.9374 | 0.795 | 0.7602 | 0.3852 | 0.0668 |
No log | 34.0 | 238 | 0.1205 | 0.77 | 0.4653 | 0.9700 | 0.7700 | 0.7291 | 0.3829 | 0.0721 |
No log | 35.0 | 245 | 0.1179 | 0.77 | 0.4616 | 0.9313 | 0.7700 | 0.7366 | 0.3797 | 0.0724 |
No log | 36.0 | 252 | 0.1155 | 0.78 | 0.4566 | 0.9870 | 0.78 | 0.7391 | 0.3718 | 0.0661 |
No log | 37.0 | 259 | 0.1151 | 0.785 | 0.4614 | 0.8936 | 0.785 | 0.7455 | 0.4010 | 0.0684 |
No log | 38.0 | 266 | 0.1126 | 0.78 | 0.4588 | 0.9190 | 0.78 | 0.7406 | 0.3874 | 0.0669 |
No log | 39.0 | 273 | 0.1139 | 0.78 | 0.4637 | 0.9150 | 0.78 | 0.7408 | 0.3874 | 0.0708 |
No log | 40.0 | 280 | 0.1138 | 0.785 | 0.4650 | 0.9096 | 0.785 | 0.7500 | 0.4002 | 0.0680 |
No log | 41.0 | 287 | 0.1139 | 0.79 | 0.4644 | 0.9092 | 0.79 | 0.7590 | 0.4034 | 0.0668 |
No log | 42.0 | 294 | 0.1140 | 0.79 | 0.4670 | 0.9062 | 0.79 | 0.7494 | 0.4042 | 0.0665 |
No log | 43.0 | 301 | 0.1126 | 0.785 | 0.4623 | 0.8470 | 0.785 | 0.7502 | 0.4079 | 0.0660 |
No log | 44.0 | 308 | 0.1146 | 0.775 | 0.4651 | 0.9065 | 0.775 | 0.7314 | 0.3907 | 0.0721 |
No log | 45.0 | 315 | 0.1122 | 0.785 | 0.4643 | 0.8626 | 0.785 | 0.7443 | 0.3991 | 0.0639 |
No log | 46.0 | 322 | 0.1109 | 0.795 | 0.4644 | 0.9087 | 0.795 | 0.7631 | 0.4017 | 0.0629 |
No log | 47.0 | 329 | 0.1116 | 0.79 | 0.4640 | 0.8473 | 0.79 | 0.7584 | 0.3959 | 0.0634 |
No log | 48.0 | 336 | 0.1147 | 0.78 | 0.4662 | 0.8717 | 0.78 | 0.7467 | 0.3859 | 0.0677 |
No log | 49.0 | 343 | 0.1154 | 0.765 | 0.4586 | 1.0035 | 0.765 | 0.7366 | 0.3826 | 0.0764 |
No log | 50.0 | 350 | 0.1112 | 0.79 | 0.4582 | 0.9230 | 0.79 | 0.7525 | 0.3854 | 0.0672 |
No log | 51.0 | 357 | 0.1104 | 0.79 | 0.4633 | 0.9120 | 0.79 | 0.7598 | 0.4000 | 0.0667 |
No log | 52.0 | 364 | 0.1115 | 0.79 | 0.4641 | 0.8550 | 0.79 | 0.7603 | 0.3914 | 0.0672 |
No log | 53.0 | 371 | 0.1150 | 0.77 | 0.4613 | 0.9215 | 0.7700 | 0.7333 | 0.3882 | 0.0733 |
No log | 54.0 | 378 | 0.1100 | 0.8 | 0.4596 | 0.9149 | 0.8000 | 0.7610 | 0.4055 | 0.0657 |
No log | 55.0 | 385 | 0.1094 | 0.785 | 0.4613 | 0.9060 | 0.785 | 0.7506 | 0.3956 | 0.0664 |
No log | 56.0 | 392 | 0.1087 | 0.785 | 0.4607 | 0.9068 | 0.785 | 0.7498 | 0.3984 | 0.0649 |
No log | 57.0 | 399 | 0.1094 | 0.785 | 0.4630 | 0.8993 | 0.785 | 0.7491 | 0.3943 | 0.0674 |
No log | 58.0 | 406 | 0.1100 | 0.805 | 0.4627 | 0.9130 | 0.805 | 0.7693 | 0.4018 | 0.0637 |
No log | 59.0 | 413 | 0.1103 | 0.795 | 0.4619 | 0.8483 | 0.795 | 0.7618 | 0.3992 | 0.0632 |
No log | 60.0 | 420 | 0.1093 | 0.79 | 0.4631 | 0.9007 | 0.79 | 0.7539 | 0.3936 | 0.0647 |
No log | 61.0 | 427 | 0.1095 | 0.79 | 0.4594 | 0.9073 | 0.79 | 0.7539 | 0.4129 | 0.0654 |
No log | 62.0 | 434 | 0.1092 | 0.79 | 0.4591 | 0.9087 | 0.79 | 0.7539 | 0.3956 | 0.0638 |
No log | 63.0 | 441 | 0.1096 | 0.79 | 0.4611 | 0.9075 | 0.79 | 0.7539 | 0.4088 | 0.0654 |
No log | 64.0 | 448 | 0.1093 | 0.79 | 0.4610 | 0.9041 | 0.79 | 0.7539 | 0.3953 | 0.0650 |
No log | 65.0 | 455 | 0.1092 | 0.79 | 0.4602 | 0.9049 | 0.79 | 0.7539 | 0.3845 | 0.0642 |
No log | 66.0 | 462 | 0.1092 | 0.79 | 0.4605 | 0.9027 | 0.79 | 0.7539 | 0.3870 | 0.0646 |
No log | 67.0 | 469 | 0.1094 | 0.79 | 0.4610 | 0.9047 | 0.79 | 0.7539 | 0.3967 | 0.0643 |
No log | 68.0 | 476 | 0.1094 | 0.79 | 0.4602 | 0.9053 | 0.79 | 0.7539 | 0.3956 | 0.0645 |
No log | 69.0 | 483 | 0.1094 | 0.79 | 0.4599 | 0.9054 | 0.79 | 0.7539 | 0.3950 | 0.0646 |
No log | 70.0 | 490 | 0.1095 | 0.79 | 0.4609 | 0.9036 | 0.79 | 0.7539 | 0.4054 | 0.0647 |
No log | 71.0 | 497 | 0.1095 | 0.79 | 0.4601 | 0.9066 | 0.79 | 0.7539 | 0.3937 | 0.0646 |
0.1361 | 72.0 | 504 | 0.1095 | 0.79 | 0.4602 | 0.9045 | 0.79 | 0.7539 | 0.3958 | 0.0644 |
0.1361 | 73.0 | 511 | 0.1095 | 0.79 | 0.4605 | 0.9064 | 0.79 | 0.7539 | 0.3900 | 0.0645 |
0.1361 | 74.0 | 518 | 0.1095 | 0.79 | 0.4606 | 0.9037 | 0.79 | 0.7539 | 0.4116 | 0.0642 |
0.1361 | 75.0 | 525 | 0.1095 | 0.79 | 0.4606 | 0.9072 | 0.79 | 0.7539 | 0.4079 | 0.0646 |
0.1361 | 76.0 | 532 | 0.1096 | 0.79 | 0.4604 | 0.9066 | 0.79 | 0.7539 | 0.4017 | 0.0644 |
0.1361 | 77.0 | 539 | 0.1095 | 0.79 | 0.4603 | 0.9062 | 0.79 | 0.7539 | 0.4014 | 0.0646 |
0.1361 | 78.0 | 546 | 0.1096 | 0.79 | 0.4600 | 0.9053 | 0.79 | 0.7539 | 0.3957 | 0.0644 |
0.1361 | 79.0 | 553 | 0.1096 | 0.79 | 0.4606 | 0.9056 | 0.79 | 0.7539 | 0.3986 | 0.0645 |
0.1361 | 80.0 | 560 | 0.1097 | 0.79 | 0.4602 | 0.9059 | 0.79 | 0.7539 | 0.4023 | 0.0647 |
0.1361 | 81.0 | 567 | 0.1096 | 0.79 | 0.4604 | 0.9056 | 0.79 | 0.7539 | 0.4042 | 0.0645 |
0.1361 | 82.0 | 574 | 0.1097 | 0.79 | 0.4603 | 0.9058 | 0.79 | 0.7539 | 0.4082 | 0.0646 |
0.1361 | 83.0 | 581 | 0.1097 | 0.79 | 0.4606 | 0.9066 | 0.79 | 0.7539 | 0.4085 | 0.0645 |
0.1361 | 84.0 | 588 | 0.1097 | 0.79 | 0.4603 | 0.9060 | 0.79 | 0.7539 | 0.4040 | 0.0645 |
0.1361 | 85.0 | 595 | 0.1097 | 0.79 | 0.4606 | 0.9059 | 0.79 | 0.7539 | 0.3949 | 0.0645 |
0.1361 | 86.0 | 602 | 0.1097 | 0.79 | 0.4603 | 0.9059 | 0.79 | 0.7539 | 0.4040 | 0.0645 |
0.1361 | 87.0 | 609 | 0.1097 | 0.79 | 0.4605 | 0.9051 | 0.79 | 0.7539 | 0.4025 | 0.0644 |
0.1361 | 88.0 | 616 | 0.1097 | 0.79 | 0.4605 | 0.9055 | 0.79 | 0.7539 | 0.3962 | 0.0643 |
0.1361 | 89.0 | 623 | 0.1097 | 0.79 | 0.4603 | 0.9056 | 0.79 | 0.7539 | 0.4040 | 0.0643 |
0.1361 | 90.0 | 630 | 0.1098 | 0.79 | 0.4604 | 0.9051 | 0.79 | 0.7539 | 0.3962 | 0.0643 |
0.1361 | 91.0 | 637 | 0.1098 | 0.79 | 0.4604 | 0.9064 | 0.79 | 0.7539 | 0.4041 | 0.0644 |
0.1361 | 92.0 | 644 | 0.1098 | 0.79 | 0.4605 | 0.9055 | 0.79 | 0.7539 | 0.4004 | 0.0644 |
0.1361 | 93.0 | 651 | 0.1098 | 0.79 | 0.4605 | 0.9059 | 0.79 | 0.7539 | 0.4042 | 0.0644 |
0.1361 | 94.0 | 658 | 0.1098 | 0.79 | 0.4603 | 0.9059 | 0.79 | 0.7539 | 0.4094 | 0.0643 |
0.1361 | 95.0 | 665 | 0.1098 | 0.79 | 0.4605 | 0.9056 | 0.79 | 0.7539 | 0.4138 | 0.0645 |
0.1361 | 96.0 | 672 | 0.1098 | 0.79 | 0.4604 | 0.9059 | 0.79 | 0.7539 | 0.4095 | 0.0643 |
0.1361 | 97.0 | 679 | 0.1098 | 0.79 | 0.4604 | 0.9057 | 0.79 | 0.7539 | 0.4137 | 0.0643 |
0.1361 | 98.0 | 686 | 0.1098 | 0.79 | 0.4604 | 0.9059 | 0.79 | 0.7539 | 0.4096 | 0.0643 |
0.1361 | 99.0 | 693 | 0.1098 | 0.79 | 0.4604 | 0.9059 | 0.79 | 0.7539 | 0.4137 | 0.0644 |
0.1361 | 100.0 | 700 | 0.1098 | 0.79 | 0.4604 | 0.9058 | 0.79 | 0.7539 | 0.4083 | 0.0644 |
### Framework versions
- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
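A sketch of recreating a compatible environment with pip (hypothetical commands; the `1.13.1.post200` build tag is conda-style, so the matching PyTorch build may need to come from conda rather than the nearest pip wheel):

```shell
# Pin the library versions listed above
pip install "transformers==4.26.1" "datasets==2.9.0" "tokenizers==0.13.2"
# Nearest pip wheel; the exact 1.13.1.post200 build was likely installed via conda
pip install "torch==1.13.1"
```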