

# bert-base-uncased-sst-2-32-21

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set (final epoch, taken from the training log below):
- Loss: 1.0145
- Accuracy: 0.875

## Model description

More information needed

## Intended uses & limitations

More information needed
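
Pending fuller documentation, the snippet below is a minimal inference sketch. It assumes the checkpoint is published under a hypothetical repository id (`your-namespace/bert-base-uncased-sst-2-32-21`) and that it is a binary sentiment classifier, as the SST-2-style name suggests; neither assumption is confirmed by this card.

```python
# Minimal inference sketch -- the repository id below is a hypothetical
# placeholder, and binary sentiment labels are assumed from the model name.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="your-namespace/bert-base-uncased-sst-2-32-21",  # hypothetical repo id
)

print(classifier("A thoroughly enjoyable film."))
# e.g. [{'label': 'LABEL_1', 'score': 0.97}] -- label names depend on the saved config
```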

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameters used during training were not recorded in this card.
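
As a rough orientation only, the sketch below shows a typical 🤗 Trainer setup for a run of this shape. The epoch count (150) and per-epoch evaluation are taken from the results table; the accuracy values in that table step in increments of 1/64, which hints at a 64-example evaluation split, and the model name hints at 32 training examples and seed 21. Every other value is an illustrative placeholder, not the configuration that was actually used.

```python
# Hypothetical fine-tuning sketch. Only num_train_epochs=150 and per-epoch
# evaluation are grounded in the results table below; the dataset, subset
# sizes, seed, learning rate, and batch size are assumptions or placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

raw = load_dataset("glue", "sst2")  # assumption: SST-2-style sentiment data
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True)

tokenized = raw.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # binary head assumed from the model name
)

training_args = TrainingArguments(
    output_dir="bert-base-uncased-sst-2-32-21",
    num_train_epochs=150,            # matches the results table
    evaluation_strategy="epoch",     # table logs validation metrics every epoch (newer versions use eval_strategy)
    learning_rate=1e-5,              # placeholder
    per_device_train_batch_size=16,  # placeholder
    seed=21,                         # assumption taken from the model-name suffix
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"].shuffle(seed=21).select(range(32)),      # 32 examples, per the name
    eval_dataset=tokenized["validation"].shuffle(seed=21).select(range(64)),  # 64 examples, per the 1/64 accuracy steps
    tokenizer=tokenizer,
)
# trainer.train()
```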

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 2 | 1.0499 | 0.8125 |
| No log | 2.0 | 4 | 1.0524 | 0.8125 |
| No log | 3.0 | 6 | 1.0539 | 0.8125 |
| No log | 4.0 | 8 | 1.0539 | 0.8125 |
| 0.5899 | 5.0 | 10 | 1.0528 | 0.8125 |
| 0.5899 | 6.0 | 12 | 1.0524 | 0.8125 |
| 0.5899 | 7.0 | 14 | 1.0483 | 0.8125 |
| 0.5899 | 8.0 | 16 | 1.0416 | 0.8125 |
| 0.5899 | 9.0 | 18 | 1.0345 | 0.8125 |
| 0.4445 | 10.0 | 20 | 1.0251 | 0.8125 |
| 0.4445 | 11.0 | 22 | 1.0193 | 0.8125 |
| 0.4445 | 12.0 | 24 | 1.0143 | 0.8125 |
| 0.4445 | 13.0 | 26 | 1.0063 | 0.8281 |
| 0.4445 | 14.0 | 28 | 1.0024 | 0.8281 |
| 0.3304 | 15.0 | 30 | 0.9919 | 0.8281 |
| 0.3304 | 16.0 | 32 | 0.9848 | 0.8281 |
| 0.3304 | 17.0 | 34 | 0.9719 | 0.8281 |
| 0.3304 | 18.0 | 36 | 0.9605 | 0.8281 |
| 0.3304 | 19.0 | 38 | 0.9521 | 0.8281 |
| 0.2653 | 20.0 | 40 | 0.9440 | 0.8438 |
| 0.2653 | 21.0 | 42 | 0.9354 | 0.8594 |
| 0.2653 | 22.0 | 44 | 0.9302 | 0.8594 |
| 0.2653 | 23.0 | 46 | 0.9207 | 0.8594 |
| 0.2653 | 24.0 | 48 | 0.9098 | 0.8438 |
| 0.1597 | 25.0 | 50 | 0.9013 | 0.8438 |
| 0.1597 | 26.0 | 52 | 0.8944 | 0.8438 |
| 0.1597 | 27.0 | 54 | 0.8902 | 0.8438 |
| 0.1597 | 28.0 | 56 | 0.8888 | 0.8438 |
| 0.1597 | 29.0 | 58 | 0.8933 | 0.8438 |
| 0.1221 | 30.0 | 60 | 0.9068 | 0.8438 |
| 0.1221 | 31.0 | 62 | 0.9171 | 0.8438 |
| 0.1221 | 32.0 | 64 | 0.9209 | 0.8438 |
| 0.1221 | 33.0 | 66 | 0.9141 | 0.8438 |
| 0.1221 | 34.0 | 68 | 0.9113 | 0.8438 |
| 0.0292 | 35.0 | 70 | 0.9097 | 0.8438 |
| 0.0292 | 36.0 | 72 | 0.9044 | 0.8438 |
| 0.0292 | 37.0 | 74 | 0.8901 | 0.8438 |
| 0.0292 | 38.0 | 76 | 0.8745 | 0.8438 |
| 0.0292 | 39.0 | 78 | 0.8628 | 0.8594 |
| 0.0277 | 40.0 | 80 | 0.8533 | 0.8594 |
| 0.0277 | 41.0 | 82 | 0.8514 | 0.8594 |
| 0.0277 | 42.0 | 84 | 0.8535 | 0.8594 |
| 0.0277 | 43.0 | 86 | 0.8577 | 0.8594 |
| 0.0277 | 44.0 | 88 | 0.8616 | 0.8594 |
| 0.0058 | 45.0 | 90 | 0.8620 | 0.8594 |
| 0.0058 | 46.0 | 92 | 0.8650 | 0.8594 |
| 0.0058 | 47.0 | 94 | 0.8639 | 0.8594 |
| 0.0058 | 48.0 | 96 | 0.8666 | 0.8594 |
| 0.0058 | 49.0 | 98 | 0.8689 | 0.8594 |
| 0.0029 | 50.0 | 100 | 0.8701 | 0.8594 |
| 0.0029 | 51.0 | 102 | 0.8718 | 0.8594 |
| 0.0029 | 52.0 | 104 | 0.8821 | 0.8438 |
| 0.0029 | 53.0 | 106 | 0.9051 | 0.8438 |
| 0.0029 | 54.0 | 108 | 0.9265 | 0.8438 |
| 0.0024 | 55.0 | 110 | 0.9354 | 0.8438 |
| 0.0024 | 56.0 | 112 | 0.9213 | 0.8438 |
| 0.0024 | 57.0 | 114 | 0.8944 | 0.8438 |
| 0.0024 | 58.0 | 116 | 0.8806 | 0.875 |
| 0.0024 | 59.0 | 118 | 0.8787 | 0.875 |
| 0.0017 | 60.0 | 120 | 0.8820 | 0.875 |
| 0.0017 | 61.0 | 122 | 0.8880 | 0.8594 |
| 0.0017 | 62.0 | 124 | 0.8931 | 0.8594 |
| 0.0017 | 63.0 | 126 | 0.8966 | 0.8594 |
| 0.0017 | 64.0 | 128 | 0.8997 | 0.8594 |
| 0.0011 | 65.0 | 130 | 0.9026 | 0.8594 |
| 0.0011 | 66.0 | 132 | 0.9050 | 0.8594 |
| 0.0011 | 67.0 | 134 | 0.9065 | 0.8594 |
| 0.0011 | 68.0 | 136 | 0.9028 | 0.8594 |
| 0.0011 | 69.0 | 138 | 0.8963 | 0.8594 |
| 0.0013 | 70.0 | 140 | 0.8943 | 0.875 |
| 0.0013 | 71.0 | 142 | 0.8951 | 0.875 |
| 0.0013 | 72.0 | 144 | 0.8971 | 0.875 |
| 0.0013 | 73.0 | 146 | 0.9001 | 0.875 |
| 0.0013 | 74.0 | 148 | 0.9047 | 0.875 |
| 0.0073 | 75.0 | 150 | 0.9117 | 0.875 |
| 0.0073 | 76.0 | 152 | 0.9202 | 0.8438 |
| 0.0073 | 77.0 | 154 | 0.9281 | 0.8438 |
| 0.0073 | 78.0 | 156 | 0.9347 | 0.8438 |
| 0.0073 | 79.0 | 158 | 0.9403 | 0.8438 |
| 0.0008 | 80.0 | 160 | 0.9438 | 0.8438 |
| 0.0008 | 81.0 | 162 | 0.9421 | 0.8438 |
| 0.0008 | 82.0 | 164 | 0.9401 | 0.8438 |
| 0.0008 | 83.0 | 166 | 0.9382 | 0.8438 |
| 0.0008 | 84.0 | 168 | 0.9363 | 0.8438 |
| 0.0008 | 85.0 | 170 | 0.9341 | 0.8438 |
| 0.0008 | 86.0 | 172 | 0.9313 | 0.8594 |
| 0.0008 | 87.0 | 174 | 0.9280 | 0.8594 |
| 0.0008 | 88.0 | 176 | 0.9261 | 0.875 |
| 0.0008 | 89.0 | 178 | 0.9251 | 0.875 |
| 0.0007 | 90.0 | 180 | 0.9246 | 0.875 |
| 0.0007 | 91.0 | 182 | 0.9245 | 0.875 |
| 0.0007 | 92.0 | 184 | 0.9251 | 0.875 |
| 0.0007 | 93.0 | 186 | 0.9260 | 0.875 |
| 0.0007 | 94.0 | 188 | 0.9272 | 0.875 |
| 0.0006 | 95.0 | 190 | 0.9285 | 0.875 |
| 0.0006 | 96.0 | 192 | 0.9292 | 0.875 |
| 0.0006 | 97.0 | 194 | 0.9301 | 0.875 |
| 0.0006 | 98.0 | 196 | 0.9310 | 0.875 |
| 0.0006 | 99.0 | 198 | 0.9332 | 0.875 |
| 0.0009 | 100.0 | 200 | 0.9397 | 0.8594 |
| 0.0009 | 101.0 | 202 | 0.9488 | 0.8594 |
| 0.0009 | 102.0 | 204 | 0.9563 | 0.8594 |
| 0.0009 | 103.0 | 206 | 0.9610 | 0.8594 |
| 0.0009 | 104.0 | 208 | 0.9633 | 0.8594 |
| 0.0006 | 105.0 | 210 | 0.9627 | 0.8594 |
| 0.0006 | 106.0 | 212 | 0.9614 | 0.8594 |
| 0.0006 | 107.0 | 214 | 0.9602 | 0.8594 |
| 0.0006 | 108.0 | 216 | 0.9592 | 0.8594 |
| 0.0006 | 109.0 | 218 | 0.9584 | 0.8594 |
| 0.0005 | 110.0 | 220 | 0.9580 | 0.8594 |
| 0.0005 | 111.0 | 222 | 0.9580 | 0.8594 |
| 0.0005 | 112.0 | 224 | 0.9585 | 0.875 |
| 0.0005 | 113.0 | 226 | 0.9596 | 0.875 |
| 0.0005 | 114.0 | 228 | 0.9605 | 0.875 |
| 0.0004 | 115.0 | 230 | 0.9614 | 0.875 |
| 0.0004 | 116.0 | 232 | 0.9621 | 0.875 |
| 0.0004 | 117.0 | 234 | 0.9635 | 0.875 |
| 0.0004 | 118.0 | 236 | 0.9650 | 0.875 |
| 0.0004 | 119.0 | 238 | 0.9665 | 0.875 |
| 0.0004 | 120.0 | 240 | 0.9681 | 0.875 |
| 0.0004 | 121.0 | 242 | 0.9697 | 0.875 |
| 0.0004 | 122.0 | 244 | 0.9713 | 0.875 |
| 0.0004 | 123.0 | 246 | 0.9729 | 0.875 |
| 0.0004 | 124.0 | 248 | 0.9745 | 0.875 |
| 0.0004 | 125.0 | 250 | 0.9761 | 0.875 |
| 0.0004 | 126.0 | 252 | 0.9777 | 0.875 |
| 0.0004 | 127.0 | 254 | 0.9794 | 0.875 |
| 0.0004 | 128.0 | 256 | 0.9810 | 0.875 |
| 0.0004 | 129.0 | 258 | 0.9827 | 0.875 |
| 0.0004 | 130.0 | 260 | 0.9844 | 0.875 |
| 0.0004 | 131.0 | 262 | 0.9861 | 0.875 |
| 0.0004 | 132.0 | 264 | 0.9877 | 0.875 |
| 0.0004 | 133.0 | 266 | 0.9893 | 0.875 |
| 0.0004 | 134.0 | 268 | 0.9909 | 0.875 |
| 0.0003 | 135.0 | 270 | 0.9925 | 0.875 |
| 0.0003 | 136.0 | 272 | 0.9941 | 0.875 |
| 0.0003 | 137.0 | 274 | 0.9957 | 0.875 |
| 0.0003 | 138.0 | 276 | 0.9973 | 0.875 |
| 0.0003 | 139.0 | 278 | 0.9989 | 0.875 |
| 0.0003 | 140.0 | 280 | 1.0004 | 0.875 |
| 0.0003 | 141.0 | 282 | 1.0018 | 0.875 |
| 0.0003 | 142.0 | 284 | 1.0033 | 0.875 |
| 0.0003 | 143.0 | 286 | 1.0047 | 0.875 |
| 0.0003 | 144.0 | 288 | 1.0062 | 0.875 |
| 0.0003 | 145.0 | 290 | 1.0076 | 0.875 |
| 0.0003 | 146.0 | 292 | 1.0090 | 0.875 |
| 0.0003 | 147.0 | 294 | 1.0105 | 0.875 |
| 0.0003 | 148.0 | 296 | 1.0118 | 0.875 |
| 0.0003 | 149.0 | 298 | 1.0132 | 0.875 |
| 0.0003 | 150.0 | 300 | 1.0145 | 0.875 |
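
The accuracy column is presumably produced by a `compute_metrics` callback passed to the Trainer; the actual metric code is not included in this card. A minimal sketch of such a callback, assuming the 🤗 Evaluate library:

```python
# Hypothetical compute_metrics callback for the accuracy column above; the
# actual evaluation code is not recorded in this card.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)  # highest-scoring class per example
    return accuracy.compute(predictions=predictions, references=labels)

# Wired in via: Trainer(..., compute_metrics=compute_metrics)
```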

### Framework versions