# distilbert-base-uncased-finetuned-sst-2-english-finetuned-Multi_classification
This model is a fine-tuned version of [distilbert-base-uncased-finetuned-sst-2-english](https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english) on an unspecified dataset. It achieves the following results on the evaluation set (these correspond to the epoch-2 checkpoint, which has the lowest validation loss; validation loss rises steadily in later epochs):
- Loss: 1.6930
- Accuracy: 0.2680
- Macro Averaged Precision: 0.2166
- Micro Averaged Precision: 0.2680
- Macro Averaged Recall: 0.2327
- Micro Averaged Recall: 0.2680
- Macro Averaged F1: 0.2116
- Micro Averaged F1: 0.2680
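A note on the metric columns: for single-label multiclass classification, micro-averaged precision, recall, and F1 each count every individual prediction once, so all three collapse to plain accuracy; that is why the micro-averaged values above (and in the table below) are identical to the accuracy column. A quick illustration with scikit-learn on synthetic labels (not this model's actual evaluation data):

```python
from math import isclose
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

# Synthetic 4-class labels, purely illustrative; NOT the card's eval data.
y_true = [0, 1, 2, 3, 0, 1, 2, 3, 0, 1]
y_pred = [0, 2, 2, 3, 1, 1, 0, 3, 0, 3]

acc = accuracy_score(y_true, y_pred)

# Micro-averaging pools all predictions into one count, so for
# single-label multiclass data it always equals accuracy.
for metric in (precision_score, recall_score, f1_score):
    assert isclose(metric(y_true, y_pred, average="micro"), acc)

# Macro-averaging weights each class equally, so it can differ from accuracy.
macro_f1 = f1_score(y_true, y_pred, average="macro")
print(acc, macro_f1)
```

This also means the macro-averaged columns are the more informative ones here, since they reveal per-class imbalance that the micro/accuracy numbers hide.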
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Macro Averaged Precision | Micro Averaged Precision | Macro Averaged Recall | Micro Averaged Recall | Macro Averaged F1 | Micro Averaged F1 |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
1.7458 | 1.0 | 635 | 1.7202 | 0.25 | 0.3757 | 0.25 | 0.2110 | 0.25 | 0.1614 | 0.25 |
1.6782 | 2.0 | 1270 | 1.6930 | 0.2680 | 0.2166 | 0.2680 | 0.2327 | 0.2680 | 0.2116 | 0.2680 |
1.5839 | 3.0 | 1905 | 1.8221 | 0.2430 | 0.2825 | 0.2430 | 0.2197 | 0.2430 | 0.2010 | 0.2430 |
1.217 | 4.0 | 2540 | 1.9127 | 0.2594 | 0.2682 | 0.2594 | 0.2418 | 0.2594 | 0.2450 | 0.2594 |
0.9503 | 5.0 | 3175 | 2.3028 | 0.2430 | 0.2555 | 0.2430 | 0.2231 | 0.2430 | 0.2179 | 0.2430 |
0.7459 | 6.0 | 3810 | 2.6594 | 0.25 | 0.2742 | 0.25 | 0.2328 | 0.25 | 0.2312 | 0.25 |
0.5819 | 7.0 | 4445 | 3.0386 | 0.2469 | 0.2698 | 0.2469 | 0.2312 | 0.2469 | 0.2319 | 0.2469 |
0.3212 | 8.0 | 5080 | 3.3728 | 0.2547 | 0.2844 | 0.2547 | 0.2379 | 0.2547 | 0.2413 | 0.2547 |
0.2428 | 9.0 | 5715 | 3.9017 | 0.2398 | 0.2627 | 0.2398 | 0.2220 | 0.2398 | 0.2173 | 0.2398 |
0.1962 | 10.0 | 6350 | 4.2271 | 0.2492 | 0.2717 | 0.2492 | 0.2352 | 0.2492 | 0.2378 | 0.2492 |
0.1515 | 11.0 | 6985 | 4.6681 | 0.2484 | 0.2570 | 0.2484 | 0.2314 | 0.2484 | 0.2304 | 0.2484 |
0.1085 | 12.0 | 7620 | 5.3864 | 0.25 | 0.2756 | 0.25 | 0.2359 | 0.25 | 0.2372 | 0.25 |
0.0996 | 13.0 | 8255 | 5.6783 | 0.2516 | 0.2624 | 0.2516 | 0.2331 | 0.2516 | 0.2304 | 0.2516 |
0.1114 | 14.0 | 8890 | 5.9540 | 0.2477 | 0.2734 | 0.2477 | 0.2338 | 0.2477 | 0.2367 | 0.2477 |
0.073 | 15.0 | 9525 | 6.5480 | 0.2492 | 0.2575 | 0.2492 | 0.2311 | 0.2492 | 0.2238 | 0.2492 |
0.065 | 16.0 | 10160 | 6.7862 | 0.2602 | 0.2740 | 0.2602 | 0.2438 | 0.2602 | 0.2434 | 0.2602 |
0.0636 | 17.0 | 10795 | 7.0422 | 0.2578 | 0.2806 | 0.2578 | 0.2359 | 0.2578 | 0.2354 | 0.2578 |
0.0662 | 18.0 | 11430 | 7.2309 | 0.2641 | 0.2856 | 0.2641 | 0.2526 | 0.2641 | 0.2588 | 0.2641 |
0.0582 | 19.0 | 12065 | 7.5682 | 0.2617 | 0.2797 | 0.2617 | 0.2435 | 0.2617 | 0.2462 | 0.2617 |
0.0541 | 20.0 | 12700 | 7.8098 | 0.2531 | 0.2711 | 0.2531 | 0.2344 | 0.2531 | 0.2338 | 0.2531 |
0.0511 | 21.0 | 13335 | 8.0497 | 0.2508 | 0.2710 | 0.2508 | 0.2287 | 0.2508 | 0.2217 | 0.2508 |
0.0436 | 22.0 | 13970 | 8.3609 | 0.2469 | 0.2458 | 0.2469 | 0.2249 | 0.2469 | 0.2201 | 0.2469 |
0.0492 | 23.0 | 14605 | 8.4803 | 0.2547 | 0.2565 | 0.2547 | 0.2327 | 0.2547 | 0.2284 | 0.2547 |
0.03 | 24.0 | 15240 | 8.7246 | 0.2578 | 0.2793 | 0.2578 | 0.2353 | 0.2578 | 0.2318 | 0.2578 |
0.0322 | 25.0 | 15875 | 9.1533 | 0.2555 | 0.2607 | 0.2555 | 0.2277 | 0.2555 | 0.2192 | 0.2555 |
0.0468 | 26.0 | 16510 | 8.7821 | 0.2477 | 0.2717 | 0.2477 | 0.2242 | 0.2477 | 0.2202 | 0.2477 |
0.0327 | 27.0 | 17145 | 8.7916 | 0.2516 | 0.2609 | 0.2516 | 0.2307 | 0.2516 | 0.2296 | 0.2516 |
0.027 | 28.0 | 17780 | 9.0334 | 0.2477 | 0.2569 | 0.2477 | 0.2240 | 0.2477 | 0.2211 | 0.2477 |
0.0284 | 29.0 | 18415 | 9.1966 | 0.2492 | 0.2712 | 0.2492 | 0.2316 | 0.2492 | 0.2283 | 0.2492 |
0.0251 | 30.0 | 19050 | 9.3470 | 0.2594 | 0.2696 | 0.2594 | 0.2353 | 0.2594 | 0.2291 | 0.2594 |
0.0295 | 31.0 | 19685 | 9.5033 | 0.2562 | 0.2557 | 0.2562 | 0.2307 | 0.2562 | 0.2237 | 0.2562 |
0.0198 | 32.0 | 20320 | 9.5799 | 0.2609 | 0.2829 | 0.2609 | 0.2374 | 0.2609 | 0.2336 | 0.2609 |
0.0198 | 33.0 | 20955 | 9.4022 | 0.2562 | 0.2787 | 0.2562 | 0.2389 | 0.2562 | 0.2403 | 0.2562 |
0.0164 | 34.0 | 21590 | 9.8864 | 0.2547 | 0.2666 | 0.2547 | 0.2323 | 0.2547 | 0.2280 | 0.2547 |
0.0142 | 35.0 | 22225 | 9.6852 | 0.2602 | 0.2793 | 0.2602 | 0.2403 | 0.2602 | 0.2413 | 0.2602 |
0.011 | 36.0 | 22860 | 9.7627 | 0.2672 | 0.2866 | 0.2672 | 0.2431 | 0.2672 | 0.2412 | 0.2672 |
0.0126 | 37.0 | 23495 | 9.8309 | 0.2633 | 0.2717 | 0.2633 | 0.2415 | 0.2633 | 0.2395 | 0.2633 |
0.0102 | 38.0 | 24130 | 10.0910 | 0.2508 | 0.2561 | 0.2508 | 0.2257 | 0.2508 | 0.2196 | 0.2508 |
0.0117 | 39.0 | 24765 | 10.0140 | 0.2680 | 0.2749 | 0.2680 | 0.2455 | 0.2680 | 0.2417 | 0.2680 |
0.013 | 40.0 | 25400 | 9.9480 | 0.2687 | 0.2801 | 0.2687 | 0.2511 | 0.2687 | 0.2475 | 0.2687 |
0.0057 | 41.0 | 26035 | 10.0697 | 0.2602 | 0.2633 | 0.2602 | 0.2376 | 0.2602 | 0.2356 | 0.2602 |
0.0057 | 42.0 | 26670 | 10.0187 | 0.2555 | 0.2733 | 0.2555 | 0.2354 | 0.2555 | 0.2358 | 0.2555 |
0.0048 | 43.0 | 27305 | 10.1409 | 0.2578 | 0.2654 | 0.2578 | 0.2365 | 0.2578 | 0.2360 | 0.2578 |
0.0063 | 44.0 | 27940 | 10.1794 | 0.2578 | 0.2811 | 0.2578 | 0.2361 | 0.2578 | 0.2360 | 0.2578 |
0.0052 | 45.0 | 28575 | 10.3589 | 0.2508 | 0.2624 | 0.2508 | 0.2300 | 0.2508 | 0.2258 | 0.2508 |
0.0055 | 46.0 | 29210 | 10.2599 | 0.2672 | 0.2829 | 0.2672 | 0.2425 | 0.2672 | 0.2409 | 0.2672 |
0.0021 | 47.0 | 29845 | 10.3328 | 0.2656 | 0.2744 | 0.2656 | 0.2421 | 0.2656 | 0.2396 | 0.2656 |
0.0019 | 48.0 | 30480 | 10.3512 | 0.2586 | 0.2704 | 0.2586 | 0.2356 | 0.2586 | 0.2309 | 0.2586 |
0.004 | 49.0 | 31115 | 10.2833 | 0.2664 | 0.2725 | 0.2664 | 0.2432 | 0.2664 | 0.2409 | 0.2664 |
0.0014 | 50.0 | 31750 | 10.2776 | 0.2680 | 0.2765 | 0.2680 | 0.2444 | 0.2680 | 0.2423 | 0.2680 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.1+cu117
- Datasets 1.18.4
- Tokenizers 0.12.1