# plant-seedlings-freeze-0-6-aug-3-all-train
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.2362
- Accuracy: 0.9454
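For reference, a minimal inference sketch (the Hub repo id below is a placeholder for wherever this checkpoint is hosted, not a confirmed path):

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual Hub path of this checkpoint.
classifier = pipeline(
    "image-classification",
    model="your-username/plant-seedlings-freeze-0-6-aug-3-all-train",
)

predictions = classifier("path/to/seedling.jpg")
print(predictions)  # top predicted classes with scores
```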
## Model description
No description was provided. Judging by the repository name, this appears to be a ViT-Base (patch 16, 224×224 input) classifier fine-tuned for plant-seedling classification, with the lower encoder layers frozen ("freeze-0-6") and a data-augmentation recipe applied ("aug-3"); these readings of the name are assumptions, not documented facts.
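A minimal sketch of the layer freezing the name suggests, assuming "0-6" means encoder layers 0 through 6 inclusive and an assumed 12-class label space:

```python
from transformers import AutoModelForImageClassification

# num_labels is an assumption; ignore_mismatched_sizes swaps the
# 1000-class ImageNet head for a freshly initialized one.
model = AutoModelForImageClassification.from_pretrained(
    "google/vit-base-patch16-224",
    num_labels=12,
    ignore_mismatched_sizes=True,
)

# Freeze the patch/position embeddings and encoder layers 0-6 (assumed inclusive);
# the remaining layers and the new classification head stay trainable.
for param in model.vit.embeddings.parameters():
    param.requires_grad = False
for layer in model.vit.encoder.layer[:7]:
    for param in layer.parameters():
        param.requires_grad = False
```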
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed beyond what the summary records: training used a local dataset loaded with the Hugging Face Datasets `imagefolder` builder.
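As a rough sketch (directory path assumed), an `imagefolder` dataset is loaded from a one-subdirectory-per-class layout like so:

```python
from datasets import load_dataset

# Assumed layout: data_dir/train/<class_name>/*.png (path is a placeholder).
dataset = load_dataset("imagefolder", data_dir="path/to/plant-seedlings")
print(dataset["train"].features)  # includes a `label` ClassLabel inferred from folder names
```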
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
- mixed_precision_training: Native AMP
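A minimal sketch of how these settings map onto `transformers.TrainingArguments`; the output directory is a placeholder, and the 100-step evaluation cadence is assumed from the results table below:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="plant-seedlings-freeze-0-6-aug-3-all-train",  # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=25,
    fp16=True,                      # "Native AMP" mixed precision
    evaluation_strategy="steps",    # assumed: the table evaluates every 100 steps
    eval_steps=100,
    logging_steps=100,
)
```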
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
0.1589 | 0.25 | 100 | 0.3561 | 0.9141 |
0.1629 | 0.49 | 200 | 0.3792 | 0.8932 |
0.1222 | 0.74 | 300 | 0.3749 | 0.8975 |
0.1363 | 0.98 | 400 | 0.3021 | 0.9122 |
0.0699 | 1.23 | 500 | 0.4802 | 0.8883 |
0.0759 | 1.47 | 600 | 0.3129 | 0.9214 |
0.2157 | 1.72 | 700 | 0.3362 | 0.9159 |
0.1148 | 1.97 | 800 | 0.2898 | 0.9122 |
0.3137 | 2.21 | 900 | 0.4267 | 0.8969 |
0.072 | 2.46 | 1000 | 0.3180 | 0.9141 |
0.0775 | 2.7 | 1100 | 0.3856 | 0.9067 |
0.1019 | 2.95 | 1200 | 0.3182 | 0.9153 |
0.1119 | 3.19 | 1300 | 0.4458 | 0.8944 |
0.1342 | 3.44 | 1400 | 0.4718 | 0.8889 |
0.1658 | 3.69 | 1500 | 0.3697 | 0.9012 |
0.1609 | 3.93 | 1600 | 0.4079 | 0.9024 |
0.1223 | 4.18 | 1700 | 0.3688 | 0.9147 |
0.1821 | 4.42 | 1800 | 0.3392 | 0.9116 |
0.0901 | 4.67 | 1900 | 0.3726 | 0.8969 |
0.0857 | 4.91 | 2000 | 0.3158 | 0.9196 |
0.1245 | 5.16 | 2100 | 0.3503 | 0.9122 |
0.133 | 5.41 | 2200 | 0.3712 | 0.9134 |
0.171 | 5.65 | 2300 | 0.3543 | 0.9067 |
0.1222 | 5.9 | 2400 | 0.3031 | 0.9227 |
0.1504 | 6.14 | 2500 | 0.3356 | 0.9085 |
0.0889 | 6.39 | 2600 | 0.3695 | 0.9116 |
0.0185 | 6.63 | 2700 | 0.3509 | 0.9141 |
0.1201 | 6.88 | 2800 | 0.3330 | 0.9177 |
0.0766 | 7.13 | 2900 | 0.2718 | 0.9251 |
0.0998 | 7.37 | 3000 | 0.3471 | 0.9233 |
0.1654 | 7.62 | 3100 | 0.3285 | 0.9196 |
0.0529 | 7.86 | 3200 | 0.3394 | 0.9190 |
0.1199 | 8.11 | 3300 | 0.2968 | 0.9294 |
0.0338 | 8.35 | 3400 | 0.2784 | 0.9251 |
0.124 | 8.6 | 3500 | 0.3099 | 0.9251 |
0.0581 | 8.85 | 3600 | 0.3372 | 0.9263 |
0.1776 | 9.09 | 3700 | 0.3580 | 0.9134 |
0.1598 | 9.34 | 3800 | 0.3158 | 0.9196 |
0.1122 | 9.58 | 3900 | 0.3369 | 0.9190 |
0.0808 | 9.83 | 4000 | 0.3259 | 0.9368 |
0.1086 | 10.07 | 4100 | 0.3691 | 0.9190 |
0.0197 | 10.32 | 4200 | 0.3101 | 0.9355 |
0.065 | 10.57 | 4300 | 0.3479 | 0.9227 |
0.1183 | 10.81 | 4400 | 0.3281 | 0.9319 |
0.044 | 11.06 | 4500 | 0.4357 | 0.9134 |
0.1021 | 11.3 | 4600 | 0.3211 | 0.9337 |
0.0615 | 11.55 | 4700 | 0.2947 | 0.9398 |
0.0664 | 11.79 | 4800 | 0.4421 | 0.9184 |
0.092 | 12.04 | 4900 | 0.3333 | 0.9202 |
0.1544 | 12.29 | 5000 | 0.3062 | 0.9245 |
0.1324 | 12.53 | 5100 | 0.2756 | 0.9294 |
0.1132 | 12.78 | 5200 | 0.2570 | 0.9362 |
0.0899 | 13.02 | 5300 | 0.2486 | 0.9386 |
0.0712 | 13.27 | 5400 | 0.2878 | 0.9306 |
0.0411 | 13.51 | 5500 | 0.2663 | 0.9368 |
0.0559 | 13.76 | 5600 | 0.2751 | 0.9355 |
0.0928 | 14.0 | 5700 | 0.3093 | 0.9269 |
0.0504 | 14.25 | 5800 | 0.2954 | 0.9319 |
0.0995 | 14.5 | 5900 | 0.2636 | 0.9337 |
0.1139 | 14.74 | 6000 | 0.2827 | 0.9349 |
0.0992 | 14.99 | 6100 | 0.2662 | 0.9368 |
0.1519 | 15.23 | 6200 | 0.2720 | 0.9398 |
0.0192 | 15.48 | 6300 | 0.3252 | 0.9269 |
0.0592 | 15.72 | 6400 | 0.3382 | 0.9263 |
0.0382 | 15.97 | 6500 | 0.2710 | 0.9349 |
0.0723 | 16.22 | 6600 | 0.2671 | 0.9374 |
0.0073 | 16.46 | 6700 | 0.3451 | 0.9263 |
0.1796 | 16.71 | 6800 | 0.3196 | 0.9196 |
0.0919 | 16.95 | 6900 | 0.2464 | 0.9337 |
0.0739 | 17.2 | 7000 | 0.2258 | 0.9392 |
0.0468 | 17.44 | 7100 | 0.2483 | 0.9411 |
0.145 | 17.69 | 7200 | 0.2639 | 0.9312 |
0.0243 | 17.94 | 7300 | 0.2574 | 0.9362 |
0.0648 | 18.18 | 7400 | 0.2554 | 0.9331 |
0.0508 | 18.43 | 7500 | 0.2554 | 0.9374 |
0.0475 | 18.67 | 7600 | 0.2915 | 0.9337 |
0.0708 | 18.92 | 7700 | 0.2801 | 0.9300 |
0.1476 | 19.16 | 7800 | 0.2479 | 0.9411 |
0.1535 | 19.41 | 7900 | 0.2412 | 0.9411 |
0.0873 | 19.66 | 8000 | 0.2544 | 0.9398 |
0.0416 | 19.9 | 8100 | 0.2334 | 0.9423 |
0.1157 | 20.15 | 8200 | 0.2059 | 0.9540 |
0.039 | 20.39 | 8300 | 0.2601 | 0.9362 |
0.0223 | 20.64 | 8400 | 0.2234 | 0.9484 |
0.0779 | 20.88 | 8500 | 0.2468 | 0.9405 |
0.0604 | 21.13 | 8600 | 0.2334 | 0.9374 |
0.1206 | 21.38 | 8700 | 0.2504 | 0.9398 |
0.0738 | 21.62 | 8800 | 0.2505 | 0.9398 |
0.0438 | 21.87 | 8900 | 0.2148 | 0.9472 |
0.0689 | 22.11 | 9000 | 0.2286 | 0.9435 |
0.0505 | 22.36 | 9100 | 0.1956 | 0.9472 |
0.0581 | 22.6 | 9200 | 0.2104 | 0.9484 |
0.1575 | 22.85 | 9300 | 0.2309 | 0.9441 |
0.048 | 23.1 | 9400 | 0.2685 | 0.9398 |
0.0784 | 23.34 | 9500 | 0.2329 | 0.9454 |
0.0771 | 23.59 | 9600 | 0.2294 | 0.9466 |
0.0545 | 23.83 | 9700 | 0.2037 | 0.9484 |
0.0481 | 24.08 | 9800 | 0.1994 | 0.9540 |
0.0663 | 24.32 | 9900 | 0.1993 | 0.9490 |
0.0921 | 24.57 | 10000 | 0.2204 | 0.9521 |
0.0939 | 24.82 | 10100 | 0.2362 | 0.9454 |
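The headline metrics at the top of this card (loss 0.2362, accuracy 0.9454) are those of the final evaluation at step 10100; the lowest validation loss of the run (0.1956, with accuracy 0.9472) occurs earlier, at step 9100.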
### Framework versions
- Transformers 4.28.1
- PyTorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3