# ViT_LFW_Model4

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unspecified dataset. It achieves the following results on the evaluation set (an inference sketch follows the list):
- Loss: 0.1287
- Accuracy: 0.9705
- Precision: 0.9054
- Recall: 0.9583
- F1: 0.8838
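
For reference, inference with this checkpoint follows the standard `transformers` image-classification pattern. A minimal sketch, assuming a hypothetical repo id `your-username/ViT_LFW_Model4` (substitute the actual checkpoint path):

```python
import torch
from PIL import Image
from transformers import ViTForImageClassification, ViTImageProcessor

# Hypothetical repo id -- replace with the actual checkpoint path or Hub id.
MODEL_ID = "your-username/ViT_LFW_Model4"

processor = ViTImageProcessor.from_pretrained(MODEL_ID)
model = ViTForImageClassification.from_pretrained(MODEL_ID)
model.eval()

# Any RGB image works; the processor resizes/normalizes to the 224x224 ViT input.
image = Image.open("face.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```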

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
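
These settings map directly onto `transformers.TrainingArguments`. A minimal reconstruction sketch; the output directory, evaluation/save cadence, and best-checkpoint loading are assumptions inferred from the results table, not stated in the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ViT_LFW_Model4",       # assumed name
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="steps",       # assumed: the table logs metrics every 100 steps
    eval_steps=100,
    save_strategy="steps",
    save_steps=100,
    load_best_model_at_end=True,       # assumed: the summary reports best-checkpoint metrics
    metric_for_best_model="eval_loss",
)
```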

### Training results

Validation loss reaches its minimum of 0.1287 at step 1500 (epoch 6.22), the checkpoint whose metrics are reported in the summary above; the remaining metrics plateau from roughly epoch 7 onward. A sketch of the metric computation follows the table.

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:--:|
3.4756 | 0.41 | 100 | 2.8779 | 0.6015 | 0.8461 | 0.3406 | 0.2698 |
2.6524 | 0.83 | 200 | 1.8112 | 0.7749 | 0.8298 | 0.5915 | 0.5064 |
1.6994 | 1.24 | 300 | 1.1829 | 0.8450 | 0.8065 | 0.7112 | 0.6160 |
1.3097 | 1.66 | 400 | 0.6849 | 0.9225 | 0.8808 | 0.8486 | 0.7908 |
0.5976 | 2.07 | 500 | 0.4778 | 0.9336 | 0.9015 | 0.8803 | 0.8293 |
0.412 | 2.49 | 600 | 0.4110 | 0.9299 | 0.8555 | 0.8988 | 0.8000 |
0.3165 | 2.9 | 700 | 0.3295 | 0.9262 | 0.8108 | 0.8787 | 0.7350 |
0.1537 | 3.32 | 800 | 0.2427 | 0.9520 | 0.8792 | 0.9333 | 0.8405 |
0.087 | 3.73 | 900 | 0.2373 | 0.9520 | 0.8989 | 0.9308 | 0.8562 |
0.0728 | 4.15 | 1000 | 0.2068 | 0.9483 | 0.8815 | 0.9264 | 0.8297 |
0.0305 | 4.56 | 1100 | 0.1759 | 0.9557 | 0.8692 | 0.9391 | 0.8279 |
0.0277 | 4.98 | 1200 | 0.1879 | 0.9446 | 0.8328 | 0.9197 | 0.7856 |
0.0126 | 5.39 | 1300 | 0.1759 | 0.9594 | 0.8700 | 0.9333 | 0.8193 |
0.0137 | 5.81 | 1400 | 0.1595 | 0.9631 | 0.8771 | 0.9440 | 0.8396 |
0.0083 | 6.22 | 1500 | 0.1287 | 0.9705 | 0.9054 | 0.9583 | 0.8838 |
0.0078 | 6.64 | 1600 | 0.1295 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0064 | 7.05 | 1700 | 0.1322 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0062 | 7.47 | 1800 | 0.1299 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0053 | 7.88 | 1900 | 0.1307 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0049 | 8.3 | 2000 | 0.1295 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0041 | 8.71 | 2100 | 0.1302 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0036 | 9.13 | 2200 | 0.1310 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0037 | 9.54 | 2300 | 0.1311 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0028 | 9.96 | 2400 | 0.1301 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0031 | 10.37 | 2500 | 0.1308 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0026 | 10.79 | 2600 | 0.1304 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0023 | 11.2 | 2700 | 0.1299 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0024 | 11.62 | 2800 | 0.1315 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0022 | 12.03 | 2900 | 0.1321 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.002 | 12.45 | 3000 | 0.1321 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.002 | 12.86 | 3100 | 0.1332 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0017 | 13.28 | 3200 | 0.1327 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0016 | 13.69 | 3300 | 0.1328 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0015 | 14.11 | 3400 | 0.1336 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0015 | 14.52 | 3500 | 0.1343 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0015 | 14.94 | 3600 | 0.1345 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0014 | 15.35 | 3700 | 0.1344 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0013 | 15.77 | 3800 | 0.1354 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0013 | 16.18 | 3900 | 0.1357 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0012 | 16.6 | 4000 | 0.1365 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0011 | 17.01 | 4100 | 0.1357 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.001 | 17.43 | 4200 | 0.1361 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.001 | 17.84 | 4300 | 0.1364 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.001 | 18.26 | 4400 | 0.1379 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.001 | 18.67 | 4500 | 0.1375 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0009 | 19.09 | 4600 | 0.1374 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0009 | 19.5 | 4700 | 0.1374 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0009 | 19.92 | 4800 | 0.1382 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0008 | 20.33 | 4900 | 0.1385 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0007 | 20.75 | 5000 | 0.1389 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0007 | 21.16 | 5100 | 0.1391 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0007 | 21.58 | 5200 | 0.1392 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0007 | 21.99 | 5300 | 0.1397 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0007 | 22.41 | 5400 | 0.1401 | 0.9668 | 0.8910 | 0.9511 | 0.8592 |
0.0007 | 22.82 | 5500 | 0.1404 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0006 | 23.24 | 5600 | 0.1404 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0006 | 23.65 | 5700 | 0.1402 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0006 | 24.07 | 5800 | 0.1411 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0006 | 24.48 | 5900 | 0.1411 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0006 | 24.9 | 6000 | 0.1413 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0005 | 25.31 | 6100 | 0.1418 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0006 | 25.73 | 6200 | 0.1420 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0005 | 26.14 | 6300 | 0.1421 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0005 | 26.56 | 6400 | 0.1423 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0005 | 26.97 | 6500 | 0.1424 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0004 | 27.39 | 6600 | 0.1428 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0005 | 27.8 | 6700 | 0.1429 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0005 | 28.22 | 6800 | 0.1428 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0005 | 28.63 | 6900 | 0.1430 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0005 | 29.05 | 7000 | 0.1430 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0005 | 29.46 | 7100 | 0.1430 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
0.0005 | 29.88 | 7200 | 0.1430 | 0.9705 | 0.8963 | 0.9550 | 0.8666 |
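
The per-evaluation metrics above are consistent with a `compute_metrics` callback passed to the `Trainer`. A minimal sketch using scikit-learn; macro averaging is an assumption (it would explain F1 falling below both precision and recall when per-class scores vary):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Trainer metrics callback; the averaging method is an assumption."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, predictions, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, predictions),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```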

### Framework versions

- Transformers 4.30.2
- PyTorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.13.3