# vit-base-patch16-224-Trial006-007-008-YEL_STEM1
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.0821
- Accuracy: 0.9758
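
As a sketch of how this checkpoint could be used for inference (the repository id below is an assumption based on the model name; substitute the actual hub path or local checkpoint directory):

```python
# Hypothetical hub id for this checkpoint; adjust to the actual repository path.
MODEL_ID = "vit-base-patch16-224-Trial006-007-008-YEL_STEM1"

def classify_image(image_path: str, model_id: str = MODEL_ID):
    """Classify a single image with the fine-tuned ViT and return a list of
    {"label": ..., "score": ...} dicts, highest score first."""
    # Deferred import so the module loads even without transformers installed.
    from transformers import pipeline

    classifier = pipeline("image-classification", model=model_id)
    return classifier(image_path)
```
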
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 60
- eval_batch_size: 60
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 240
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
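
The hyperparameters above map onto a `TrainingArguments` configuration roughly as follows (a sketch, not the exact training script; the `output_dir` name is an assumption):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-Trial006-007-008-YEL_STEM1",  # assumed name
    learning_rate=5e-5,
    per_device_train_batch_size=60,
    per_device_eval_batch_size=60,
    seed=42,
    gradient_accumulation_steps=4,  # effective batch size: 60 * 4 = 240
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```
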
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7024        | 0.96  | 6    | 0.6971          | 0.5394   |
| 0.6702        | 1.92  | 12   | 0.6017          | 0.7455   |
| 0.5632        | 2.88  | 18   | 0.5534          | 0.7212   |
| 0.5239        | 4.0   | 25   | 0.4925          | 0.7455   |
| 0.4836        | 4.96  | 31   | 0.3925          | 0.8424   |
| 0.4564        | 5.92  | 37   | 0.4079          | 0.8182   |
| 0.5014        | 6.88  | 43   | 0.3821          | 0.8182   |
| 0.4027        | 8.0   | 50   | 0.2385          | 0.8909   |
| 0.3362        | 8.96  | 56   | 0.2489          | 0.8909   |
| 0.3615        | 9.92  | 62   | 0.2407          | 0.8970   |
| 0.2995        | 10.88 | 68   | 0.1514          | 0.9515   |
| 0.3112        | 12.0  | 75   | 0.1521          | 0.9515   |
| 0.328         | 12.96 | 81   | 0.1392          | 0.9576   |
| 0.3237        | 13.92 | 87   | 0.1252          | 0.9515   |
| 0.2535        | 14.88 | 93   | 0.1140          | 0.9576   |
| 0.2831        | 16.0  | 100  | 0.1292          | 0.9394   |
| 0.2868        | 16.96 | 106  | 0.1462          | 0.9273   |
| 0.2551        | 17.92 | 112  | 0.1176          | 0.9515   |
| 0.2793        | 18.88 | 118  | 0.1179          | 0.9515   |
| 0.2068        | 20.0  | 125  | 0.1068          | 0.9576   |
| 0.2553        | 20.96 | 131  | 0.0945          | 0.9697   |
| 0.2028        | 21.92 | 137  | 0.1020          | 0.9636   |
| 0.2227        | 22.88 | 143  | 0.1013          | 0.9576   |
| 0.2644        | 24.0  | 150  | 0.0980          | 0.9515   |
| 0.2228        | 24.96 | 156  | 0.1439          | 0.9394   |
| 0.2206        | 25.92 | 162  | 0.1079          | 0.9455   |
| 0.2258        | 26.88 | 168  | 0.0933          | 0.9636   |
| 0.2645        | 28.0  | 175  | 0.0821          | 0.9758   |
| 0.2056        | 28.96 | 181  | 0.0843          | 0.9697   |
| 0.2358        | 29.92 | 187  | 0.0750          | 0.9758   |
| 0.1906        | 30.88 | 193  | 0.0794          | 0.9697   |
| 0.1891        | 32.0  | 200  | 0.0755          | 0.9758   |
| 0.2124        | 32.96 | 206  | 0.0834          | 0.9515   |
| 0.1882        | 33.92 | 212  | 0.0855          | 0.9576   |
| 0.2064        | 34.88 | 218  | 0.0811          | 0.9636   |
| 0.2554        | 36.0  | 225  | 0.0845          | 0.9697   |
| 0.2363        | 36.96 | 231  | 0.0872          | 0.9697   |
| 0.2206        | 37.92 | 237  | 0.0895          | 0.9697   |
| 0.1868        | 38.88 | 243  | 0.0904          | 0.9697   |
| 0.1973        | 40.0  | 250  | 0.0877          | 0.9697   |
| 0.2084        | 40.96 | 256  | 0.0824          | 0.9636   |
| 0.2419        | 41.92 | 262  | 0.0826          | 0.9636   |
| 0.2466        | 42.88 | 268  | 0.0833          | 0.9636   |
| 0.1778        | 44.0  | 275  | 0.0803          | 0.9697   |
| 0.2065        | 44.96 | 281  | 0.0811          | 0.9697   |
| 0.2225        | 45.92 | 287  | 0.0806          | 0.9697   |
| 0.2209        | 46.88 | 293  | 0.0812          | 0.9697   |
| 0.2244        | 48.0  | 300  | 0.0824          | 0.9636   |
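
The step counts in the table follow directly from the effective batch size: with a per-device batch of 60 and 4 gradient-accumulation steps, each optimizer step consumes 240 images, giving 6.25 optimizer steps per epoch (e.g. step 300 at epoch 48.0). This implies a training set of roughly 1,500 images, a figure inferred from the table rather than stated in the card:

```python
# Hyperparameters from the card.
train_batch_size = 60
gradient_accumulation_steps = 4

# Effective (total) train batch size per optimizer step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
assert total_train_batch_size == 240

# The table reaches step 300 at epoch 48.0, i.e. 6.25 optimizer steps per epoch,
# which implies roughly 6.25 * 240 = 1500 training images (inferred, not stated).
steps_per_epoch = 300 / 48
approx_train_images = steps_per_epoch * total_train_batch_size
print(total_train_batch_size, round(approx_train_images))
```
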
### Framework versions
- Transformers 4.30.0.dev0
- Pytorch 1.12.1
- Datasets 2.12.0
- Tokenizers 0.13.1