# vit-huge-HAM-10000-sharpened-patch-14
This model is a fine-tuned version of [google/vit-huge-patch14-224-in21k](https://huggingface.co/google/vit-huge-patch14-224-in21k) on the [ahishamm/HAM_db_sharpened](https://huggingface.co/datasets/ahishamm/HAM_db_sharpened) dataset. It achieves the following results on the evaluation set:
- Loss: 0.4411
- Accuracy: 0.8554
- Recall: 0.8554
- F1: 0.8554
- Precision: 0.8554
## Model description

This checkpoint adapts ViT-Huge (14×14 patches, 224×224 input resolution, pre-trained on ImageNet-21k) to dermatoscopic skin-lesion classification using a sharpened variant of the HAM10000 dataset.
## Intended uses & limitations

The model is intended for research on skin-lesion image classification. It has not been clinically validated and must not be used for medical diagnosis. Performance on images captured under conditions that differ from HAM10000's dermatoscopic protocol is unknown.
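A minimal inference sketch is shown below. It assumes the fine-tuned weights are published under the hypothetical Hub id `ahishamm/vit-huge-HAM-10000-sharpened-patch-14`; substitute the actual repo id or a local checkpoint path as needed.

```python
import torch
from PIL import Image
from transformers import ViTForImageClassification, ViTImageProcessor

# Hypothetical repo id; replace with the real Hub id or a local path.
model_id = "ahishamm/vit-huge-HAM-10000-sharpened-patch-14"

processor = ViTImageProcessor.from_pretrained(model_id)
model = ViTForImageClassification.from_pretrained(model_id)

image = Image.open("lesion.jpg")  # a dermatoscopic image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```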
## Training and evaluation data

Training and evaluation use the ahishamm/HAM_db_sharpened dataset, a sharpened variant of HAM10000, a collection of dermatoscopic images of common pigmented skin lesions spanning seven diagnostic categories.
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list for how they map onto `TrainingArguments`):
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
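As a hedged illustration, these settings correspond roughly to the following `TrainingArguments`; the output directory and the evaluation/saving cadence are assumptions not recorded in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./vit-huge-HAM-10000-sharpened-patch-14",  # assumed
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=4,
    lr_scheduler_type="linear",
    # The optimizer line in the card (Adam, betas=(0.9, 0.999), eps=1e-8)
    # matches the Trainer defaults, so no explicit optimizer args are needed.
    evaluation_strategy="steps",   # assumed from the 100-step eval cadence above
    eval_steps=100,                # assumed
    save_steps=100,                # assumed
    load_best_model_at_end=True,   # assumed from the step-1000 headline metrics
)
```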
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Recall | F1     | Precision |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:---------:|
| 0.6177        | 0.2   | 100  | 0.7082          | 0.7591   | 0.7591 | 0.7591 | 0.7591    |
| 0.6848        | 0.4   | 200  | 0.6570          | 0.7631   | 0.7631 | 0.7631 | 0.7631    |
| 0.622         | 0.6   | 300  | 0.5880          | 0.7920   | 0.7920 | 0.7920 | 0.7920    |
| 0.5887        | 0.8   | 400  | 0.5599          | 0.7965   | 0.7965 | 0.7965 | 0.7965    |
| 0.4812        | 1.0   | 500  | 0.5364          | 0.8010   | 0.8010 | 0.8010 | 0.8010    |
| 0.4013        | 1.2   | 600  | 0.4874          | 0.8249   | 0.8249 | 0.8249 | 0.8249    |
| 0.3987        | 1.4   | 700  | 0.4533          | 0.8354   | 0.8354 | 0.8354 | 0.8354    |
| 0.4118        | 1.6   | 800  | 0.4540          | 0.8424   | 0.8424 | 0.8424 | 0.8424    |
| 0.3272        | 1.8   | 900  | 0.4536          | 0.8254   | 0.8254 | 0.8254 | 0.8254    |
| 0.3318        | 2.0   | 1000 | 0.4411          | 0.8554   | 0.8554 | 0.8554 | 0.8554    |
| 0.0859        | 2.2   | 1100 | 0.4641          | 0.8519   | 0.8519 | 0.8519 | 0.8519    |
| 0.1026        | 2.4   | 1200 | 0.4692          | 0.8554   | 0.8554 | 0.8554 | 0.8554    |
| 0.0934        | 2.59  | 1300 | 0.4555          | 0.8474   | 0.8474 | 0.8474 | 0.8474    |
| 0.1084        | 2.79  | 1400 | 0.5017          | 0.8454   | 0.8454 | 0.8454 | 0.8454    |
| 0.0603        | 2.99  | 1500 | 0.4803          | 0.8599   | 0.8599 | 0.8599 | 0.8599    |
| 0.013         | 3.19  | 1600 | 0.4905          | 0.8633   | 0.8633 | 0.8633 | 0.8633    |
| 0.0585        | 3.39  | 1700 | 0.5305          | 0.8678   | 0.8678 | 0.8678 | 0.8678    |
| 0.0322        | 3.59  | 1800 | 0.5342          | 0.8648   | 0.8648 | 0.8648 | 0.8648    |
| 0.0086        | 3.79  | 1900 | 0.5134          | 0.8668   | 0.8668 | 0.8668 | 0.8668    |
| 0.0275        | 3.99  | 2000 | 0.5136          | 0.8693   | 0.8693 | 0.8693 | 0.8693    |
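The headline metrics at the top of this card match the step-1000 row, which has the lowest validation loss, consistent with best-checkpoint selection by loss. Accuracy, recall, F1, and precision are identical at every step, which is what micro-averaged multi-class metrics produce; the card does not record the averaging strategy, so the following `compute_metrics` sketch treats micro averaging as an assumption.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Micro-averaged metrics (an assumption). For single-label multi-class
    data, micro precision, recall, and F1 all equal accuracy, matching the
    identical columns in the table above."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="micro"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "recall": recall,
        "f1": f1,
        "precision": precision,
    }
```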
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3