# plant-seedlings-model-resnet-152-2
This model is a fine-tuned version of [microsoft/resnet-152](https://huggingface.co/microsoft/resnet-152) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.2242
- Accuracy: 0.9381
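
The following is a minimal inference sketch for this checkpoint. The Hub repository id (`your-username/plant-seedlings-model-resnet-152-2`) and the image path are placeholders, since the card does not state where the model is published.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id; replace with the actual Hub id (or local path) of this checkpoint.
model_id = "your-username/plant-seedlings-model-resnet-152-2"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("seedling.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

print(model.config.id2label[logits.argmax(-1).item()])
```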
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
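
The card only states that an `imagefolder`-style dataset was used. A typical loading call is sketched below; the directory path is a placeholder, and the class layout is assumed to follow the standard one-sub-folder-per-class convention.

```python
from datasets import load_dataset

# Placeholder path; the actual data directory is not documented in this card.
# The imagefolder builder expects one sub-directory per class, e.g. data_dir/<class_name>/*.png.
dataset = load_dataset("imagefolder", data_dir="path/to/plant_seedlings", split="train")
print(dataset.features["label"].names)  # class names inferred from folder names
```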
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows this list):
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
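
For reference, the settings above roughly correspond to the `TrainingArguments` below. The output directory and anything not listed (e.g. logging and evaluation cadence) are assumptions, not values recovered from the original run.

```python
from transformers import TrainingArguments

# Sketch only: output_dir and any unlisted options are placeholders,
# not settings taken from the original training run.
training_args = TrainingArguments(
    output_dir="plant-seedlings-model-resnet-152-2",
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,  # "Native AMP" mixed-precision training
    # Adam betas (0.9, 0.999) and epsilon 1e-08 match the TrainingArguments defaults.
)
```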
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
2.0105 | 0.2 | 100 | 1.8953 | 0.4062 |
0.995 | 0.39 | 200 | 1.0372 | 0.6685 |
0.9354 | 0.59 | 300 | 0.7713 | 0.7461 |
0.6444 | 0.79 | 400 | 0.6037 | 0.8026 |
0.6477 | 0.98 | 500 | 0.5981 | 0.7991 |
0.6551 | 1.18 | 600 | 0.5224 | 0.8310 |
0.6466 | 1.38 | 700 | 0.5216 | 0.8222 |
0.4006 | 1.57 | 800 | 0.4244 | 0.8541 |
0.4484 | 1.77 | 900 | 0.4513 | 0.8566 |
0.5155 | 1.96 | 1000 | 0.4071 | 0.8649 |
0.518 | 2.16 | 1100 | 0.4155 | 0.8679 |
0.3762 | 2.36 | 1200 | 0.4152 | 0.8733 |
0.5409 | 2.55 | 1300 | 0.4038 | 0.8728 |
0.3184 | 2.75 | 1400 | 0.3683 | 0.8777 |
0.3861 | 2.95 | 1500 | 0.3675 | 0.8811 |
0.4824 | 3.14 | 1600 | 0.4404 | 0.8595 |
0.2793 | 3.34 | 1700 | 0.3696 | 0.8816 |
0.4095 | 3.54 | 1800 | 0.3102 | 0.8939 |
0.4151 | 3.73 | 1900 | 0.3558 | 0.8875 |
0.4036 | 3.93 | 2000 | 0.3215 | 0.8998 |
0.3547 | 4.13 | 2100 | 0.3511 | 0.8885 |
0.3071 | 4.32 | 2200 | 0.3376 | 0.8885 |
0.3448 | 4.52 | 2300 | 0.3807 | 0.8743 |
0.3574 | 4.72 | 2400 | 0.2826 | 0.9106 |
0.4435 | 4.91 | 2500 | 0.3275 | 0.9013 |
0.2811 | 5.11 | 2600 | 0.3285 | 0.9003 |
0.3514 | 5.3 | 2700 | 0.3562 | 0.8949 |
0.2323 | 5.5 | 2800 | 0.3023 | 0.9037 |
0.3736 | 5.7 | 2900 | 0.3012 | 0.8998 |
0.2659 | 5.89 | 3000 | 0.3243 | 0.8964 |
0.3934 | 6.09 | 3100 | 0.3007 | 0.9042 |
0.1951 | 6.29 | 3200 | 0.2643 | 0.9204 |
0.2882 | 6.48 | 3300 | 0.2816 | 0.9175 |
0.1887 | 6.68 | 3400 | 0.2669 | 0.9165 |
0.3612 | 6.88 | 3500 | 0.3215 | 0.8993 |
0.1423 | 7.07 | 3600 | 0.2684 | 0.9170 |
0.2935 | 7.27 | 3700 | 0.2826 | 0.9072 |
0.1549 | 7.47 | 3800 | 0.2783 | 0.9072 |
0.2678 | 7.66 | 3900 | 0.2535 | 0.9140 |
0.1954 | 7.86 | 4000 | 0.2578 | 0.9136 |
0.2319 | 8.06 | 4100 | 0.2595 | 0.9106 |
0.2016 | 8.25 | 4200 | 0.2671 | 0.9160 |
0.284 | 8.45 | 4300 | 0.2688 | 0.9136 |
0.1635 | 8.64 | 4400 | 0.3101 | 0.9111 |
0.2609 | 8.84 | 4500 | 0.2990 | 0.9145 |
0.1826 | 9.04 | 4600 | 0.2630 | 0.9077 |
0.2091 | 9.23 | 4700 | 0.2712 | 0.9180 |
0.1217 | 9.43 | 4800 | 0.2550 | 0.9126 |
0.198 | 9.63 | 4900 | 0.2648 | 0.9140 |
0.2123 | 9.82 | 5000 | 0.2819 | 0.9116 |
0.1399 | 10.02 | 5100 | 0.2690 | 0.9165 |
0.2429 | 10.22 | 5200 | 0.2685 | 0.9194 |
0.1376 | 10.41 | 5300 | 0.2930 | 0.9091 |
0.192 | 10.61 | 5400 | 0.3042 | 0.9101 |
0.1872 | 10.81 | 5500 | 0.2693 | 0.9160 |
0.1629 | 11.0 | 5600 | 0.2563 | 0.9185 |
0.2487 | 11.2 | 5700 | 0.2476 | 0.9258 |
0.242 | 11.39 | 5800 | 0.2407 | 0.9283 |
0.166 | 11.59 | 5900 | 0.2382 | 0.9317 |
0.1181 | 11.79 | 6000 | 0.2576 | 0.9140 |
0.1407 | 11.98 | 6100 | 0.2520 | 0.9268 |
0.1931 | 12.18 | 6200 | 0.2634 | 0.9204 |
0.1064 | 12.38 | 6300 | 0.2655 | 0.9219 |
0.1261 | 12.57 | 6400 | 0.2569 | 0.9209 |
0.1978 | 12.77 | 6500 | 0.2801 | 0.9131 |
0.2031 | 12.97 | 6600 | 0.2541 | 0.9190 |
0.1245 | 13.16 | 6700 | 0.2331 | 0.9249 |
0.2824 | 13.36 | 6800 | 0.2573 | 0.9199 |
0.1302 | 13.56 | 6900 | 0.2452 | 0.9219 |
0.0825 | 13.75 | 7000 | 0.2384 | 0.9258 |
0.1491 | 13.95 | 7100 | 0.2373 | 0.9303 |
0.1859 | 14.15 | 7200 | 0.2623 | 0.9253 |
0.2094 | 14.34 | 7300 | 0.2308 | 0.9303 |
0.14 | 14.54 | 7400 | 0.2377 | 0.9298 |
0.1836 | 14.73 | 7500 | 0.2389 | 0.9268 |
0.1347 | 14.93 | 7600 | 0.2205 | 0.9327 |
0.0747 | 15.13 | 7700 | 0.2375 | 0.9288 |
0.1448 | 15.32 | 7800 | 0.2277 | 0.9342 |
0.0885 | 15.52 | 7900 | 0.2560 | 0.9219 |
0.0975 | 15.72 | 8000 | 0.2082 | 0.9293 |
0.1185 | 15.91 | 8100 | 0.2561 | 0.9214 |
0.1544 | 16.11 | 8200 | 0.2599 | 0.9283 |
0.0959 | 16.31 | 8300 | 0.2418 | 0.9263 |
0.0835 | 16.5 | 8400 | 0.2521 | 0.9352 |
0.0846 | 16.7 | 8500 | 0.2258 | 0.9347 |
0.1255 | 16.9 | 8600 | 0.2170 | 0.9342 |
0.1116 | 17.09 | 8700 | 0.2462 | 0.9288 |
0.1331 | 17.29 | 8800 | 0.2123 | 0.9420 |
0.0895 | 17.49 | 8900 | 0.2513 | 0.9293 |
0.1628 | 17.68 | 9000 | 0.2223 | 0.9283 |
0.2152 | 17.88 | 9100 | 0.2144 | 0.9396 |
0.1074 | 18.07 | 9200 | 0.2295 | 0.9376 |
0.1888 | 18.27 | 9300 | 0.2557 | 0.9337 |
0.1014 | 18.47 | 9400 | 0.2007 | 0.9411 |
0.0341 | 18.66 | 9500 | 0.2289 | 0.9371 |
0.0365 | 18.86 | 9600 | 0.2434 | 0.9337 |
0.1099 | 19.06 | 9700 | 0.2222 | 0.9337 |
0.1303 | 19.25 | 9800 | 0.2208 | 0.9317 |
0.1209 | 19.45 | 9900 | 0.2151 | 0.9401 |
0.2119 | 19.65 | 10000 | 0.2209 | 0.9376 |
0.0734 | 19.84 | 10100 | 0.2242 | 0.9381 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3