# vit-base_tobacco
This model is a fine-tuned version of [jordyvl/vit-base_tobacco](https://huggingface.co/jordyvl/vit-base_tobacco) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.7324
- Accuracy: 0.8
- Brier Loss: 0.3049
- NLL: 1.3070
- F1 Micro: 0.8000
- F1 Macro: 0.7733
- ECE: 0.2124
- AURC: 0.0840
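
Brier loss, NLL, ECE, and AURC are calibration-oriented metrics that are less familiar than accuracy. As a point of reference, here is a minimal NumPy sketch of how Brier loss and ECE are commonly computed from predicted class probabilities; it illustrates the standard definitions and is not necessarily the exact implementation used to produce the numbers above.

```python
import numpy as np

def brier_loss(probs, labels):
    """Multi-class Brier score: mean over samples of the squared error
    between the predicted distribution and the one-hot label.
    (Definitions vary; some variants also average over classes.)"""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: bin predictions by confidence, then average the gap between
    per-bin accuracy and per-bin mean confidence, weighted by bin size."""
    confidences = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - confidences[in_bin].mean())
    return float(ece)
```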
## Model description
More information needed
## Intended uses & limitations
More information needed
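
Until the card is filled in, the sketch below shows one plausible way to run the model for image classification with `transformers`; the repo id and input filename are placeholders, not values confirmed by this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id: substitute the actual location of this fine-tuned checkpoint.
model_id = "jordyvl/vit-base_tobacco"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

# Placeholder input; the model name suggests scanned-document classification.
image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```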
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 40
- eval_batch_size: 40
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 640
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
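
For orientation, these settings map onto `transformers.TrainingArguments` roughly as sketched below; note that the effective batch size of 640 is train_batch_size × gradient_accumulation_steps = 40 × 16. The `output_dir` is a placeholder, and the actual training script is not part of this card.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters only; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="vit-base_tobacco",
    learning_rate=2e-5,
    per_device_train_batch_size=40,  # train_batch_size
    per_device_eval_batch_size=40,   # eval_batch_size
    gradient_accumulation_steps=16,  # effective train batch size: 40 * 16 = 640
    seed=42,
    adam_beta1=0.9,                  # Adam betas and epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```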
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:---:|:--------:|:--------:|:---:|:----:|
No log | 0.8 | 1 | 0.7434 | 0.815 | 0.3073 | 1.1863 | 0.815 | 0.7942 | 0.2217 | 0.0720 |
No log | 1.6 | 2 | 0.7569 | 0.81 | 0.3117 | 1.2131 | 0.81 | 0.7893 | 0.2153 | 0.0800 |
No log | 2.4 | 3 | 0.7491 | 0.82 | 0.3107 | 1.2631 | 0.82 | 0.8063 | 0.2311 | 0.0777 |
No log | 4.0 | 5 | 0.7489 | 0.795 | 0.3088 | 1.1544 | 0.795 | 0.7766 | 0.2427 | 0.0730 |
No log | 4.8 | 6 | 0.7658 | 0.81 | 0.3171 | 1.3766 | 0.81 | 0.7983 | 0.2434 | 0.0822 |
No log | 5.6 | 7 | 0.7496 | 0.815 | 0.3097 | 1.1920 | 0.815 | 0.8014 | 0.2434 | 0.0848 |
No log | 6.4 | 8 | 0.7468 | 0.8 | 0.3090 | 1.0732 | 0.8000 | 0.7750 | 0.2195 | 0.0774 |
No log | 8.0 | 10 | 0.7563 | 0.815 | 0.3131 | 1.3472 | 0.815 | 0.8082 | 0.2255 | 0.0741 |
No log | 8.8 | 11 | 0.7548 | 0.81 | 0.3116 | 1.2016 | 0.81 | 0.7930 | 0.2496 | 0.0868 |
No log | 9.6 | 12 | 0.7395 | 0.805 | 0.3071 | 1.1664 | 0.805 | 0.7841 | 0.2432 | 0.0772 |
No log | 10.4 | 13 | 0.7296 | 0.82 | 0.3018 | 1.1776 | 0.82 | 0.8078 | 0.2214 | 0.0676 |
No log | 12.0 | 15 | 0.7515 | 0.815 | 0.3104 | 1.2034 | 0.815 | 0.7987 | 0.2307 | 0.0835 |
No log | 12.8 | 16 | 0.7350 | 0.81 | 0.3053 | 1.1762 | 0.81 | 0.7978 | 0.2196 | 0.0747 |
No log | 13.6 | 17 | 0.7281 | 0.805 | 0.3023 | 1.1664 | 0.805 | 0.7841 | 0.2144 | 0.0718 |
No log | 14.4 | 18 | 0.7395 | 0.81 | 0.3064 | 1.1750 | 0.81 | 0.7871 | 0.2306 | 0.0778 |
No log | 16.0 | 20 | 0.7427 | 0.81 | 0.3076 | 1.2637 | 0.81 | 0.7986 | 0.2194 | 0.0808 |
No log | 16.8 | 21 | 0.7337 | 0.81 | 0.3044 | 1.2447 | 0.81 | 0.7948 | 0.2321 | 0.0743 |
No log | 17.6 | 22 | 0.7340 | 0.805 | 0.3050 | 1.1681 | 0.805 | 0.7841 | 0.2307 | 0.0743 |
No log | 18.4 | 23 | 0.7338 | 0.805 | 0.3047 | 1.1708 | 0.805 | 0.7841 | 0.2290 | 0.0759 |
No log | 20.0 | 25 | 0.7390 | 0.815 | 0.3058 | 1.2551 | 0.815 | 0.7984 | 0.2489 | 0.0818 |
No log | 20.8 | 26 | 0.7390 | 0.815 | 0.3063 | 1.1894 | 0.815 | 0.7984 | 0.2294 | 0.0818 |
No log | 21.6 | 27 | 0.7349 | 0.805 | 0.3054 | 1.1714 | 0.805 | 0.7847 | 0.2011 | 0.0791 |
No log | 22.4 | 28 | 0.7308 | 0.81 | 0.3037 | 1.1694 | 0.81 | 0.7948 | 0.2128 | 0.0766 |
No log | 24.0 | 30 | 0.7353 | 0.81 | 0.3051 | 1.1852 | 0.81 | 0.7956 | 0.2282 | 0.0794 |
No log | 24.8 | 31 | 0.7378 | 0.81 | 0.3062 | 1.1870 | 0.81 | 0.7956 | 0.2293 | 0.0819 |
No log | 25.6 | 32 | 0.7356 | 0.81 | 0.3054 | 1.1863 | 0.81 | 0.7956 | 0.2287 | 0.0817 |
No log | 26.4 | 33 | 0.7309 | 0.81 | 0.3037 | 1.1801 | 0.81 | 0.7954 | 0.2209 | 0.0795 |
No log | 28.0 | 35 | 0.7336 | 0.805 | 0.3050 | 1.1733 | 0.805 | 0.7850 | 0.2082 | 0.0789 |
No log | 28.8 | 36 | 0.7334 | 0.81 | 0.3045 | 1.1799 | 0.81 | 0.7956 | 0.2207 | 0.0797 |
No log | 29.6 | 37 | 0.7320 | 0.81 | 0.3040 | 1.2447 | 0.81 | 0.7956 | 0.2279 | 0.0804 |
No log | 30.4 | 38 | 0.7328 | 0.81 | 0.3045 | 1.2473 | 0.81 | 0.7956 | 0.2154 | 0.0812 |
No log | 32.0 | 40 | 0.7322 | 0.805 | 0.3044 | 1.1796 | 0.805 | 0.7850 | 0.2384 | 0.0804 |
No log | 32.8 | 41 | 0.7318 | 0.81 | 0.3045 | 1.1792 | 0.81 | 0.7954 | 0.2291 | 0.0794 |
No log | 33.6 | 42 | 0.7302 | 0.81 | 0.3034 | 1.2401 | 0.81 | 0.7954 | 0.2086 | 0.0794 |
No log | 34.4 | 43 | 0.7311 | 0.805 | 0.3036 | 1.2424 | 0.805 | 0.7850 | 0.2278 | 0.0804 |
No log | 36.0 | 45 | 0.7323 | 0.805 | 0.3043 | 1.1902 | 0.805 | 0.7850 | 0.2119 | 0.0816 |
No log | 36.8 | 46 | 0.7304 | 0.805 | 0.3034 | 1.2428 | 0.805 | 0.7850 | 0.2330 | 0.0807 |
No log | 37.6 | 47 | 0.7297 | 0.805 | 0.3032 | 1.2413 | 0.805 | 0.7850 | 0.2447 | 0.0801 |
No log | 38.4 | 48 | 0.7310 | 0.805 | 0.3039 | 1.2424 | 0.805 | 0.7850 | 0.2233 | 0.0802 |
No log | 40.0 | 50 | 0.7316 | 0.805 | 0.3040 | 1.2451 | 0.805 | 0.7850 | 0.2094 | 0.0809 |
No log | 40.8 | 51 | 0.7313 | 0.805 | 0.3041 | 1.2450 | 0.805 | 0.7850 | 0.2093 | 0.0810 |
No log | 41.6 | 52 | 0.7313 | 0.805 | 0.3041 | 1.2445 | 0.805 | 0.7850 | 0.2073 | 0.0814 |
No log | 42.4 | 53 | 0.7315 | 0.805 | 0.3040 | 1.2447 | 0.805 | 0.7850 | 0.2198 | 0.0821 |
No log | 44.0 | 55 | 0.7303 | 0.805 | 0.3034 | 1.2441 | 0.805 | 0.7850 | 0.2048 | 0.0813 |
No log | 44.8 | 56 | 0.7306 | 0.805 | 0.3038 | 1.2444 | 0.805 | 0.7850 | 0.1966 | 0.0809 |
No log | 45.6 | 57 | 0.7317 | 0.805 | 0.3043 | 1.2449 | 0.805 | 0.7850 | 0.1976 | 0.0821 |
No log | 46.4 | 58 | 0.7317 | 0.805 | 0.3041 | 1.2466 | 0.805 | 0.7850 | 0.2007 | 0.0822 |
No log | 48.0 | 60 | 0.7316 | 0.805 | 0.3041 | 1.2499 | 0.805 | 0.7850 | 0.2137 | 0.0820 |
No log | 48.8 | 61 | 0.7320 | 0.8 | 0.3043 | 1.2536 | 0.8000 | 0.7733 | 0.2081 | 0.0822 |
No log | 49.6 | 62 | 0.7319 | 0.805 | 0.3044 | 1.2494 | 0.805 | 0.7850 | 0.1998 | 0.0825 |
No log | 50.4 | 63 | 0.7326 | 0.805 | 0.3048 | 1.2476 | 0.805 | 0.7850 | 0.1936 | 0.0828 |
No log | 52.0 | 65 | 0.7313 | 0.8 | 0.3044 | 1.2495 | 0.8000 | 0.7733 | 0.2117 | 0.0822 |
No log | 52.8 | 66 | 0.7304 | 0.8 | 0.3039 | 1.2524 | 0.8000 | 0.7733 | 0.2009 | 0.0818 |
No log | 53.6 | 67 | 0.7306 | 0.8 | 0.3038 | 1.2505 | 0.8000 | 0.7733 | 0.2182 | 0.0818 |
No log | 54.4 | 68 | 0.7321 | 0.8 | 0.3044 | 1.2513 | 0.8000 | 0.7733 | 0.2185 | 0.0833 |
No log | 56.0 | 70 | 0.7326 | 0.8 | 0.3049 | 1.2519 | 0.8000 | 0.7733 | 0.2014 | 0.0833 |
No log | 56.8 | 71 | 0.7320 | 0.8 | 0.3047 | 1.2580 | 0.8000 | 0.7733 | 0.2175 | 0.0829 |
No log | 57.6 | 72 | 0.7313 | 0.8 | 0.3043 | 1.2571 | 0.8000 | 0.7733 | 0.2045 | 0.0828 |
No log | 58.4 | 73 | 0.7314 | 0.8 | 0.3043 | 1.3065 | 0.8000 | 0.7733 | 0.2038 | 0.0827 |
No log | 60.0 | 75 | 0.7322 | 0.8 | 0.3046 | 1.3081 | 0.8000 | 0.7733 | 0.2047 | 0.0840 |
No log | 60.8 | 76 | 0.7323 | 0.8 | 0.3047 | 1.3078 | 0.8000 | 0.7733 | 0.2053 | 0.0839 |
No log | 61.6 | 77 | 0.7322 | 0.8 | 0.3047 | 1.3070 | 0.8000 | 0.7733 | 0.2051 | 0.0837 |
No log | 62.4 | 78 | 0.7316 | 0.8 | 0.3045 | 1.3062 | 0.8000 | 0.7733 | 0.2145 | 0.0835 |
No log | 64.0 | 80 | 0.7315 | 0.8 | 0.3044 | 1.3063 | 0.8000 | 0.7733 | 0.2067 | 0.0836 |
No log | 64.8 | 81 | 0.7320 | 0.8 | 0.3047 | 1.3064 | 0.8000 | 0.7733 | 0.2041 | 0.0839 |
No log | 65.6 | 82 | 0.7323 | 0.8 | 0.3048 | 1.3070 | 0.8000 | 0.7733 | 0.2046 | 0.0839 |
No log | 66.4 | 83 | 0.7323 | 0.8 | 0.3048 | 1.3068 | 0.8000 | 0.7733 | 0.2045 | 0.0838 |
No log | 68.0 | 85 | 0.7320 | 0.8 | 0.3046 | 1.3068 | 0.8000 | 0.7733 | 0.2046 | 0.0840 |
No log | 68.8 | 86 | 0.7318 | 0.8 | 0.3045 | 1.3069 | 0.8000 | 0.7733 | 0.2114 | 0.0838 |
No log | 69.6 | 87 | 0.7316 | 0.8 | 0.3045 | 1.3066 | 0.8000 | 0.7733 | 0.2149 | 0.0836 |
No log | 70.4 | 88 | 0.7316 | 0.8 | 0.3045 | 1.3066 | 0.8000 | 0.7733 | 0.2244 | 0.0834 |
No log | 72.0 | 90 | 0.7321 | 0.8 | 0.3047 | 1.3069 | 0.8000 | 0.7733 | 0.2151 | 0.0837 |
No log | 72.8 | 91 | 0.7322 | 0.8 | 0.3048 | 1.3070 | 0.8000 | 0.7733 | 0.2151 | 0.0839 |
No log | 73.6 | 92 | 0.7322 | 0.8 | 0.3048 | 1.3070 | 0.8000 | 0.7733 | 0.2155 | 0.0840 |
No log | 74.4 | 93 | 0.7323 | 0.8 | 0.3048 | 1.3071 | 0.8000 | 0.7733 | 0.2129 | 0.0842 |
No log | 76.0 | 95 | 0.7324 | 0.8 | 0.3049 | 1.3071 | 0.8000 | 0.7733 | 0.2084 | 0.0841 |
No log | 76.8 | 96 | 0.7324 | 0.8 | 0.3049 | 1.3071 | 0.8000 | 0.7733 | 0.2141 | 0.0842 |
No log | 77.6 | 97 | 0.7324 | 0.8 | 0.3049 | 1.3070 | 0.8000 | 0.7733 | 0.2136 | 0.0841 |
No log | 78.4 | 98 | 0.7324 | 0.8 | 0.3049 | 1.3070 | 0.8000 | 0.7733 | 0.2136 | 0.0841 |
No log | 80.0 | 100 | 0.7324 | 0.8 | 0.3049 | 1.3070 | 0.8000 | 0.7733 | 0.2124 | 0.0840 |
### Framework versions
- Transformers 4.30.2
- PyTorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3