# TSE_roBERTa_5E

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:

- Loss: 0.2671
- Accuracy: 0.9533

## Model description

More information needed

## Intended uses & limitations

More information needed
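
The accuracy metric and the `roberta-base` backbone suggest this is a sequence-classification checkpoint, so a text-classification pipeline should work for inference. The sketch below is illustrative only: `your-username/TSE_roBERTa_5E` is a placeholder hub id, and the label names depend on the undocumented training data.

```python
from transformers import pipeline

# Placeholder hub id -- substitute the actual repository path.
classifier = pipeline("text-classification", model="your-username/TSE_roBERTa_5E")

result = classifier("Replace this with an input sentence.")
print(result)  # e.g. [{"label": "...", "score": 0.97}]; labels depend on the training data
```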

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
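
In `transformers` terms, these map onto `TrainingArguments` roughly as below. This is a minimal sketch under stated assumptions, not the original training script: the output directory, the `num_labels` value, and the `eval_steps=50` cadence (inferred from the results table) are assumptions, and the datasets are left out because the card does not document them.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

# Hyperparameters copied from this card; everything else is assumed.
training_args = TrainingArguments(
    output_dir="TSE_roBERTa_5E",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="steps",
    eval_steps=50,  # inferred from the 50-step rows in the results table
)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base",
    num_labels=2,  # assumption; the card does not state the label count
)

# Trainer(model=model, args=training_args, train_dataset=..., eval_dataset=...)
# is omitted because the training and evaluation data are undocumented.
```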

### Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
0.68 | 0.06 | 50 | 0.5879 | 0.9133 |
0.3596 | 0.12 | 100 | 0.3471 | 0.9 |
0.3019 | 0.17 | 150 | 0.2314 | 0.9333 |
0.2724 | 0.23 | 200 | 0.1860 | 0.9533 |
0.2641 | 0.29 | 250 | 0.2271 | 0.94 |
0.2941 | 0.35 | 300 | 0.1763 | 0.9467 |
0.2494 | 0.4 | 350 | 0.2019 | 0.94 |
0.221 | 0.46 | 400 | 0.2450 | 0.9533 |
0.2456 | 0.52 | 450 | 0.2298 | 0.9467 |
0.1705 | 0.58 | 500 | 0.2139 | 0.9533 |
0.1973 | 0.63 | 550 | 0.2810 | 0.9333 |
0.2348 | 0.69 | 600 | 0.2539 | 0.94 |
0.2561 | 0.75 | 650 | 0.2330 | 0.9333 |
0.2166 | 0.81 | 700 | 0.2083 | 0.9467 |
0.205 | 0.87 | 750 | 0.2768 | 0.92 |
0.2182 | 0.92 | 800 | 0.2182 | 0.94 |
0.2009 | 0.98 | 850 | 0.2534 | 0.94 |
0.1388 | 1.04 | 900 | 0.3099 | 0.9267 |
0.1208 | 1.1 | 950 | 0.2770 | 0.9467 |
0.1795 | 1.15 | 1000 | 0.2078 | 0.9467 |
0.1443 | 1.21 | 1050 | 0.1965 | 0.96 |
0.1519 | 1.27 | 1100 | 0.1918 | 0.9533 |
0.1653 | 1.33 | 1150 | 0.1850 | 0.96 |
0.1689 | 1.38 | 1200 | 0.2261 | 0.9467 |
0.1802 | 1.44 | 1250 | 0.2246 | 0.96 |
0.1894 | 1.5 | 1300 | 0.2026 | 0.96 |
0.219 | 1.56 | 1350 | 0.1598 | 0.96 |
0.1608 | 1.61 | 1400 | 0.1571 | 0.96 |
0.1976 | 1.67 | 1450 | 0.1699 | 0.9533 |
0.1987 | 1.73 | 1500 | 0.2173 | 0.9533 |
0.1503 | 1.79 | 1550 | 0.2097 | 0.9533 |
0.1293 | 1.85 | 1600 | 0.2316 | 0.9533 |
0.2267 | 1.9 | 1650 | 0.1664 | 0.9533 |
0.1833 | 1.96 | 1700 | 0.1829 | 0.9533 |
0.1991 | 2.02 | 1750 | 0.1854 | 0.96 |
0.0965 | 2.08 | 1800 | 0.2719 | 0.94 |
0.1869 | 2.13 | 1850 | 0.1759 | 0.9667 |
0.154 | 2.19 | 1900 | 0.2418 | 0.9533 |
0.1093 | 2.25 | 1950 | 0.2517 | 0.9533 |
0.1829 | 2.31 | 2000 | 0.2011 | 0.9667 |
0.1331 | 2.36 | 2050 | 0.2125 | 0.9667 |
0.1211 | 2.42 | 2100 | 0.2759 | 0.9533 |
0.1523 | 2.48 | 2150 | 0.2093 | 0.9533 |
0.1224 | 2.54 | 2200 | 0.2132 | 0.96 |
0.1205 | 2.6 | 2250 | 0.2117 | 0.96 |
0.1068 | 2.65 | 2300 | 0.2024 | 0.9667 |
0.1563 | 2.71 | 2350 | 0.1979 | 0.9533 |
0.1064 | 2.77 | 2400 | 0.2397 | 0.9533 |
0.1393 | 2.83 | 2450 | 0.2133 | 0.9533 |
0.0999 | 2.88 | 2500 | 0.2248 | 0.9533 |
0.1383 | 2.94 | 2550 | 0.2273 | 0.9467 |
0.1315 | 3.0 | 2600 | 0.2289 | 0.9467 |
0.095 | 3.06 | 2650 | 0.2668 | 0.9467 |
0.1249 | 3.11 | 2700 | 0.2345 | 0.96 |
0.0653 | 3.17 | 2750 | 0.2188 | 0.96 |
0.1102 | 3.23 | 2800 | 0.2601 | 0.9533 |
0.1118 | 3.29 | 2850 | 0.2241 | 0.9667 |
0.0746 | 3.34 | 2900 | 0.2306 | 0.96 |
0.0875 | 3.4 | 2950 | 0.2906 | 0.9467 |
0.0943 | 3.46 | 3000 | 0.2528 | 0.96 |
0.1253 | 3.52 | 3050 | 0.2503 | 0.9533 |
0.0971 | 3.58 | 3100 | 0.2182 | 0.96 |
0.0919 | 3.63 | 3150 | 0.2224 | 0.96 |
0.1053 | 3.69 | 3200 | 0.2114 | 0.9667 |
0.1041 | 3.75 | 3250 | 0.2055 | 0.9667 |
0.0836 | 3.81 | 3300 | 0.2196 | 0.96 |
0.0873 | 3.86 | 3350 | 0.2129 | 0.96 |
0.0725 | 3.92 | 3400 | 0.2352 | 0.9533 |
0.1187 | 3.98 | 3450 | 0.2114 | 0.96 |
0.108 | 4.04 | 3500 | 0.2233 | 0.96 |
0.0725 | 4.09 | 3550 | 0.2538 | 0.9533 |
0.0856 | 4.15 | 3600 | 0.2433 | 0.9533 |
0.0921 | 4.21 | 3650 | 0.2316 | 0.9533 |
0.0561 | 4.27 | 3700 | 0.2548 | 0.9533 |
0.0774 | 4.33 | 3750 | 0.2247 | 0.96 |
0.0508 | 4.38 | 3800 | 0.2389 | 0.96 |
0.1014 | 4.44 | 3850 | 0.2755 | 0.9533 |
0.0598 | 4.5 | 3900 | 0.2750 | 0.9533 |
0.0796 | 4.56 | 3950 | 0.2697 | 0.9533 |
0.0718 | 4.61 | 4000 | 0.2648 | 0.9533 |
0.0566 | 4.67 | 4050 | 0.2620 | 0.9533 |
0.0704 | 4.73 | 4100 | 0.2516 | 0.9533 |
0.0582 | 4.79 | 4150 | 0.2653 | 0.9533 |
0.1066 | 4.84 | 4200 | 0.2722 | 0.9467 |
0.0782 | 4.9 | 4250 | 0.2698 | 0.9533 |
0.0318 | 4.96 | 4300 | 0.2671 | 0.9533 |
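
The final row matches the reported evaluation results (loss 0.2671, accuracy 0.9533). The accuracy values are presumably computed from argmax predictions over the classifier logits; a minimal `compute_metrics` sketch along those lines (an assumption, not the author's original function) is:

```python
import numpy as np
from transformers import EvalPrediction

def compute_metrics(eval_pred: EvalPrediction) -> dict:
    # Pick the highest-scoring class per example and compare to the gold labels.
    logits, labels = eval_pred.predictions, eval_pred.label_ids
    predictions = np.argmax(logits, axis=-1)
    return {"accuracy": float((predictions == labels).mean())}
```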

### Framework versions

- Transformers 4.24.0
- PyTorch 1.13.0
- Datasets 2.3.2
- Tokenizers 0.13.1