

# finetuning-bert-sentiment-reviews-2

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) for sentiment classification of reviews. The training dataset was not recorded by the Trainer (it appears as `None` in the auto-generated card). The evaluation results logged during training are listed in the *Training results* table below.
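
The model is a DistilBERT-based text classifier, so it can be loaded with the `transformers` text-classification pipeline. The snippet below is a minimal usage sketch, not an official example: the model id `finetuning-bert-sentiment-reviews-2` assumes the checkpoint is available locally or on the Hub under that name (prepend a username or organization as needed), and the label names depend on how the classification head was configured.

```python
from transformers import pipeline

# Assumed model path: replace with the actual Hub repo id or a local checkpoint directory.
classifier = pipeline("text-classification", model="finetuning-bert-sentiment-reviews-2")

reviews = [
    "Absolutely loved this product, would buy again.",
    "Terrible experience, it stopped working after two days.",
]

# Each prediction is a dict like {"label": ..., "score": ...};
# the label names (e.g. LABEL_0 / LABEL_1) depend on the training configuration.
for review, prediction in zip(reviews, classifier(reviews)):
    print(f"{prediction['label']:>8}  {prediction['score']:.3f}  {review}")
```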

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameters used during training are not listed in this card (more information needed).
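
For context, cards like this are produced by the `transformers` `Trainer`. The sketch below shows a typical fine-tuning setup that would yield the evaluation cadence seen in the results table; it is a hypothetical reconstruction, not the actual configuration. The dataset (`imdb`), learning rate, batch size, and epoch count are assumptions standing in for the missing hyperparameter list; only the `distilbert-base-uncased` base checkpoint is confirmed by the card.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

# Assumed dataset; the card does not record which review corpus was used.
dataset = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased", num_labels=2)

# Tokenize the raw review text; padding is deferred to the data collator.
tokenized = dataset.map(lambda batch: tokenizer(batch["text"], truncation=True), batched=True)

args = TrainingArguments(
    output_dir="finetuning-bert-sentiment-reviews-2",
    learning_rate=2e-5,              # assumed value
    per_device_train_batch_size=16,  # assumed value
    num_train_epochs=1,              # assumed value
    eval_strategy="steps",           # named evaluation_strategy in older transformers releases
    eval_steps=10,                   # matches the 10-step evaluation cadence in the table below
    logging_steps=500,               # matches the first logged training loss at step 500
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
    data_collator=DataCollatorWithPadding(tokenizer),
)

trainer.train()
```

The Accuracy and F1 columns in the results table come from a `compute_metrics` callback passed to `Trainer`; a sketch of one follows the table.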

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log        | 0.01  | 10   | 0.6716          | 0.7463   | 0.2849 |
| No log        | 0.03  | 20   | 0.5789          | 0.7463   | 0.2849 |
| No log        | 0.04  | 30   | 0.4971          | 0.7788   | 0.3849 |
| No log        | 0.06  | 40   | 0.4298          | 0.8672   | 0.5506 |
| No log        | 0.07  | 50   | 0.3837          | 0.8794   | 0.5686 |
| No log        | 0.09  | 60   | 0.3481          | 0.8802   | 0.5672 |
| No log        | 0.1   | 70   | 0.3680          | 0.8757   | 0.5604 |
| No log        | 0.12  | 80   | 0.3259          | 0.8854   | 0.5736 |
| No log        | 0.13  | 90   | 0.3179          | 0.8854   | 0.5727 |
| No log        | 0.15  | 100  | 0.3306          | 0.8891   | 0.6295 |
| No log        | 0.16  | 110  | 0.3253          | 0.8894   | 0.6692 |
| No log        | 0.18  | 120  | 0.3041          | 0.9024   | 0.7285 |
| No log        | 0.19  | 130  | 0.2997          | 0.9068   | 0.7426 |
| No log        | 0.21  | 140  | 0.2881          | 0.9057   | 0.7434 |
| No log        | 0.22  | 150  | 0.2892          | 0.9094   | 0.7587 |
| No log        | 0.24  | 160  | 0.2771          | 0.9149   | 0.7801 |
| No log        | 0.25  | 170  | 0.2779          | 0.9135   | 0.7782 |
| No log        | 0.27  | 180  | 0.2992          | 0.9109   | 0.7720 |
| No log        | 0.28  | 190  | 0.2809          | 0.9083   | 0.7622 |
| No log        | 0.3   | 200  | 0.2636          | 0.9146   | 0.7680 |
| No log        | 0.31  | 210  | 0.3381          | 0.9079   | 0.7694 |
| No log        | 0.33  | 220  | 0.2661          | 0.9197   | 0.7858 |
| No log        | 0.34  | 230  | 0.3377          | 0.8854   | 0.7582 |
| No log        | 0.36  | 240  | 0.2614          | 0.9190   | 0.7881 |
| No log        | 0.37  | 250  | 0.2459          | 0.9264   | 0.7981 |
| No log        | 0.38  | 260  | 0.2490          | 0.9246   | 0.7934 |
| No log        | 0.4   | 270  | 0.2475          | 0.9197   | 0.7876 |
| No log        | 0.41  | 280  | 0.2648          | 0.9161   | 0.7840 |
| No log        | 0.43  | 290  | 0.2533          | 0.9249   | 0.8010 |
| No log        | 0.44  | 300  | 0.2446          | 0.9234   | 0.8067 |
| No log        | 0.46  | 310  | 0.2271          | 0.9260   | 0.8114 |
| No log        | 0.47  | 320  | 0.2219          | 0.9246   | 0.8211 |
| No log        | 0.49  | 330  | 0.2269          | 0.9320   | 0.8306 |
| No log        | 0.5   | 340  | 0.2276          | 0.9264   | 0.8219 |
| No log        | 0.52  | 350  | 0.2835          | 0.9201   | 0.7994 |
| No log        | 0.53  | 360  | 0.2787          | 0.9231   | 0.8029 |
| No log        | 0.55  | 370  | 0.2317          | 0.9301   | 0.8275 |
| No log        | 0.56  | 380  | 0.2502          | 0.9131   | 0.8076 |
| No log        | 0.58  | 390  | 0.2254          | 0.9294   | 0.8321 |
| No log        | 0.59  | 400  | 0.2066          | 0.9312   | 0.8215 |
| No log        | 0.61  | 410  | 0.2013          | 0.9342   | 0.8391 |
| No log        | 0.62  | 420  | 0.2295          | 0.9260   | 0.8279 |
| No log        | 0.64  | 430  | 0.2100          | 0.9338   | 0.8428 |
| No log        | 0.65  | 440  | 0.2129          | 0.9316   | 0.8297 |
| No log        | 0.67  | 450  | 0.2135          | 0.9327   | 0.8203 |
| No log        | 0.68  | 460  | 0.2681          | 0.9212   | 0.8028 |
| No log        | 0.7   | 470  | 0.2178          | 0.9320   | 0.8312 |
| No log        | 0.71  | 480  | 0.1999          | 0.9342   | 0.8321 |
| No log        | 0.72  | 490  | 0.2172          | 0.9305   | 0.8334 |
| 0.2988        | 0.74  | 500  | 0.2086          | 0.9308   | 0.8368 |
| 0.2988        | 0.75  | 510  | 0.2052          | 0.9342   | 0.8430 |
| 0.2988        | 0.77  | 520  | 0.2111          | 0.9331   | 0.8333 |
| 0.2988        | 0.78  | 530  | 0.2279          | 0.9327   | 0.8250 |
| 0.2988        | 0.8   | 540  | 0.2361          | 0.9271   | 0.8164 |
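
"No log" in the Training Loss column means the training loss had not yet been reported at that evaluation step; the first logged value (0.2988) appears at step 500, consistent with a logging interval of 500 steps. The Accuracy and F1 columns are produced by a metrics callback. The sketch below shows a common way to compute them with the `evaluate` library and is an assumption about how these numbers were obtained, not the card author's code. The gap between accuracy and F1 in the earliest rows would be consistent with binary (positive-class) F1 on an imbalanced label distribution, though the averaging scheme used is not stated.

```python
import numpy as np
import evaluate

accuracy_metric = evaluate.load("accuracy")
f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    """Turn raw eval logits into the accuracy/F1 values reported at each evaluation step."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_metric.compute(predictions=predictions, references=labels)["accuracy"],
        # The evaluate "f1" metric defaults to average="binary", i.e. F1 of the positive class;
        # macro or weighted averaging would give different numbers.
        "f1": f1_metric.compute(predictions=predictions, references=labels)["f1"],
    }
```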

### Framework versions