---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# bert-base-uncased-sst-2-16-42

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset (the model name suggests a 16-example SST-2 subset with seed 42). It achieves the following results on the evaluation set (final epoch, taken from the training results table below):
- Loss: 0.4392
- Accuracy: 0.7812
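A minimal usage sketch, assuming the checkpoint is available locally or on the Hub (the model identifier below is a placeholder, not a confirmed repo id) and that the task is binary sentiment classification, as the SST-2 naming suggests:

```python
# Minimal inference sketch. Replace the model path/id with wherever this
# checkpoint is actually stored; label names come from the model config.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="bert-base-uncased-sst-2-16-42",  # placeholder path or Hub id
)

print(classifier("a gripping, beautifully shot film"))
# -> e.g. [{'label': 'LABEL_1', 'score': ...}]
```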

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
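The hyperparameter list is empty in this copy of the card, so the exact settings are unknown. The sketch below shows how a few-shot SST-2 fine-tune of this shape is typically set up with the `transformers` `Trainer`; only the 150 epochs (from the results table) and the seed 42 / 16-example subset hinted at by the model name are drawn from this card, and every other value is an illustrative assumption.

```python
# Sketch of a typical Trainer setup for a few-shot SST-2 fine-tune.
# Only num_train_epochs=150 and seed=42 are suggested by this card;
# the remaining values are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

dataset = load_dataset("glue", "sst2")

def tokenize(batch):
    return tokenizer(
        batch["sentence"], truncation=True, padding="max_length", max_length=128
    )

dataset = dataset.map(tokenize, batched=True)

# "16" in the model name likely refers to a 16-example training subset (assumption);
# the 32-example evaluation subset is inferred from the accuracy granularity below.
train_subset = dataset["train"].shuffle(seed=42).select(range(16))
eval_subset = dataset["validation"].shuffle(seed=42).select(range(32))

args = TrainingArguments(
    output_dir="bert-base-uncased-sst-2-16-42",
    num_train_epochs=150,             # matches the 150 epochs in the results table
    per_device_train_batch_size=16,   # assumption: yields the 1 step/epoch seen below
    learning_rate=2e-5,               # assumption
    evaluation_strategy="epoch",
    seed=42,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_subset,
    eval_dataset=eval_subset,
    tokenizer=tokenizer,
)
trainer.train()
```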

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 0.5342 | 0.6875 |
| No log | 2.0 | 2 | 0.5341 | 0.6875 |
| No log | 3.0 | 3 | 0.5340 | 0.6875 |
| No log | 4.0 | 4 | 0.5339 | 0.6875 |
| No log | 5.0 | 5 | 0.5337 | 0.6875 |
| No log | 6.0 | 6 | 0.5335 | 0.6875 |
| No log | 7.0 | 7 | 0.5332 | 0.6875 |
| No log | 8.0 | 8 | 0.5328 | 0.6875 |
| No log | 9.0 | 9 | 0.5325 | 0.6875 |
| 0.5587 | 10.0 | 10 | 0.5320 | 0.6875 |
| 0.5587 | 11.0 | 11 | 0.5314 | 0.6875 |
| 0.5587 | 12.0 | 12 | 0.5305 | 0.6875 |
| 0.5587 | 13.0 | 13 | 0.5296 | 0.6875 |
| 0.5587 | 14.0 | 14 | 0.5285 | 0.6875 |
| 0.5587 | 15.0 | 15 | 0.5274 | 0.6875 |
| 0.5587 | 16.0 | 16 | 0.5264 | 0.6875 |
| 0.5587 | 17.0 | 17 | 0.5254 | 0.6875 |
| 0.5587 | 18.0 | 18 | 0.5244 | 0.6875 |
| 0.5587 | 19.0 | 19 | 0.5235 | 0.6875 |
| 0.5357 | 20.0 | 20 | 0.5225 | 0.6875 |
| 0.5357 | 21.0 | 21 | 0.5216 | 0.6875 |
| 0.5357 | 22.0 | 22 | 0.5204 | 0.6875 |
| 0.5357 | 23.0 | 23 | 0.5192 | 0.6875 |
| 0.5357 | 24.0 | 24 | 0.5179 | 0.6875 |
| 0.5357 | 25.0 | 25 | 0.5165 | 0.6875 |
| 0.5357 | 26.0 | 26 | 0.5151 | 0.7188 |
| 0.5357 | 27.0 | 27 | 0.5135 | 0.7188 |
| 0.5357 | 28.0 | 28 | 0.5119 | 0.7188 |
| 0.5357 | 29.0 | 29 | 0.5102 | 0.7188 |
| 0.4826 | 30.0 | 30 | 0.5085 | 0.7188 |
| 0.4826 | 31.0 | 31 | 0.5064 | 0.7188 |
| 0.4826 | 32.0 | 32 | 0.5050 | 0.7188 |
| 0.4826 | 33.0 | 33 | 0.5036 | 0.7188 |
| 0.4826 | 34.0 | 34 | 0.5019 | 0.7188 |
| 0.4826 | 35.0 | 35 | 0.5002 | 0.7188 |
| 0.4826 | 36.0 | 36 | 0.4980 | 0.7188 |
| 0.4826 | 37.0 | 37 | 0.4958 | 0.7188 |
| 0.4826 | 38.0 | 38 | 0.4931 | 0.7188 |
| 0.4826 | 39.0 | 39 | 0.4900 | 0.7188 |
| 0.438 | 40.0 | 40 | 0.4866 | 0.75 |
| 0.438 | 41.0 | 41 | 0.4831 | 0.75 |
| 0.438 | 42.0 | 42 | 0.4802 | 0.75 |
| 0.438 | 43.0 | 43 | 0.4773 | 0.75 |
| 0.438 | 44.0 | 44 | 0.4746 | 0.75 |
| 0.438 | 45.0 | 45 | 0.4713 | 0.7812 |
| 0.438 | 46.0 | 46 | 0.4685 | 0.7812 |
| 0.438 | 47.0 | 47 | 0.4651 | 0.7812 |
| 0.438 | 48.0 | 48 | 0.4620 | 0.7812 |
| 0.438 | 49.0 | 49 | 0.4583 | 0.7812 |
| 0.367 | 50.0 | 50 | 0.4552 | 0.7812 |
| 0.367 | 51.0 | 51 | 0.4533 | 0.7812 |
| 0.367 | 52.0 | 52 | 0.4519 | 0.7812 |
| 0.367 | 53.0 | 53 | 0.4500 | 0.7812 |
| 0.367 | 54.0 | 54 | 0.4482 | 0.7812 |
| 0.367 | 55.0 | 55 | 0.4470 | 0.7812 |
| 0.367 | 56.0 | 56 | 0.4460 | 0.7812 |
| 0.367 | 57.0 | 57 | 0.4452 | 0.7812 |
| 0.367 | 58.0 | 58 | 0.4440 | 0.7812 |
| 0.367 | 59.0 | 59 | 0.4422 | 0.7812 |
| 0.2811 | 60.0 | 60 | 0.4401 | 0.7812 |
| 0.2811 | 61.0 | 61 | 0.4391 | 0.7812 |
| 0.2811 | 62.0 | 62 | 0.4370 | 0.7812 |
| 0.2811 | 63.0 | 63 | 0.4358 | 0.7812 |
| 0.2811 | 64.0 | 64 | 0.4342 | 0.7812 |
| 0.2811 | 65.0 | 65 | 0.4338 | 0.7812 |
| 0.2811 | 66.0 | 66 | 0.4339 | 0.7812 |
| 0.2811 | 67.0 | 67 | 0.4345 | 0.7812 |
| 0.2811 | 68.0 | 68 | 0.4339 | 0.7812 |
| 0.2811 | 69.0 | 69 | 0.4339 | 0.75 |
| 0.22 | 70.0 | 70 | 0.4347 | 0.7812 |
| 0.22 | 71.0 | 71 | 0.4342 | 0.7812 |
| 0.22 | 72.0 | 72 | 0.4338 | 0.7812 |
| 0.22 | 73.0 | 73 | 0.4335 | 0.7812 |
| 0.22 | 74.0 | 74 | 0.4322 | 0.7812 |
| 0.22 | 75.0 | 75 | 0.4296 | 0.7812 |
| 0.22 | 76.0 | 76 | 0.4266 | 0.8125 |
| 0.22 | 77.0 | 77 | 0.4230 | 0.8438 |
| 0.22 | 78.0 | 78 | 0.4199 | 0.8438 |
| 0.22 | 79.0 | 79 | 0.4170 | 0.8438 |
| 0.1839 | 80.0 | 80 | 0.4147 | 0.8438 |
| 0.1839 | 81.0 | 81 | 0.4131 | 0.8438 |
| 0.1839 | 82.0 | 82 | 0.4120 | 0.8438 |
| 0.1839 | 83.0 | 83 | 0.4102 | 0.8438 |
| 0.1839 | 84.0 | 84 | 0.4090 | 0.8438 |
| 0.1839 | 85.0 | 85 | 0.4073 | 0.8438 |
| 0.1839 | 86.0 | 86 | 0.4059 | 0.8438 |
| 0.1839 | 87.0 | 87 | 0.4049 | 0.8438 |
| 0.1839 | 88.0 | 88 | 0.4043 | 0.8438 |
| 0.1839 | 89.0 | 89 | 0.4044 | 0.8438 |
| 0.1385 | 90.0 | 90 | 0.4045 | 0.8438 |
| 0.1385 | 91.0 | 91 | 0.4049 | 0.8438 |
| 0.1385 | 92.0 | 92 | 0.4054 | 0.8438 |
| 0.1385 | 93.0 | 93 | 0.4059 | 0.8438 |
| 0.1385 | 94.0 | 94 | 0.4057 | 0.8125 |
| 0.1385 | 95.0 | 95 | 0.4066 | 0.8125 |
| 0.1385 | 96.0 | 96 | 0.4070 | 0.8125 |
| 0.1385 | 97.0 | 97 | 0.4072 | 0.8125 |
| 0.1385 | 98.0 | 98 | 0.4078 | 0.8125 |
| 0.1385 | 99.0 | 99 | 0.4081 | 0.8125 |
| 0.1178 | 100.0 | 100 | 0.4079 | 0.8125 |
| 0.1178 | 101.0 | 101 | 0.4083 | 0.8125 |
| 0.1178 | 102.0 | 102 | 0.4087 | 0.8125 |
| 0.1178 | 103.0 | 103 | 0.4101 | 0.8125 |
| 0.1178 | 104.0 | 104 | 0.4120 | 0.8125 |
| 0.1178 | 105.0 | 105 | 0.4132 | 0.8125 |
| 0.1178 | 106.0 | 106 | 0.4148 | 0.8125 |
| 0.1178 | 107.0 | 107 | 0.4163 | 0.8125 |
| 0.1178 | 108.0 | 108 | 0.4176 | 0.8125 |
| 0.1178 | 109.0 | 109 | 0.4207 | 0.8125 |
| 0.0987 | 110.0 | 110 | 0.4235 | 0.8125 |
| 0.0987 | 111.0 | 111 | 0.4253 | 0.8125 |
| 0.0987 | 112.0 | 112 | 0.4268 | 0.8125 |
| 0.0987 | 113.0 | 113 | 0.4273 | 0.8125 |
| 0.0987 | 114.0 | 114 | 0.4271 | 0.8125 |
| 0.0987 | 115.0 | 115 | 0.4270 | 0.8125 |
| 0.0987 | 116.0 | 116 | 0.4277 | 0.8125 |
| 0.0987 | 117.0 | 117 | 0.4279 | 0.8125 |
| 0.0987 | 118.0 | 118 | 0.4287 | 0.8125 |
| 0.0987 | 119.0 | 119 | 0.4264 | 0.8125 |
| 0.0788 | 120.0 | 120 | 0.4252 | 0.8125 |
| 0.0788 | 121.0 | 121 | 0.4234 | 0.8125 |
| 0.0788 | 122.0 | 122 | 0.4213 | 0.8125 |
| 0.0788 | 123.0 | 123 | 0.4184 | 0.8125 |
| 0.0788 | 124.0 | 124 | 0.4164 | 0.8125 |
| 0.0788 | 125.0 | 125 | 0.4148 | 0.8125 |
| 0.0788 | 126.0 | 126 | 0.4140 | 0.8125 |
| 0.0788 | 127.0 | 127 | 0.4142 | 0.8125 |
| 0.0788 | 128.0 | 128 | 0.4135 | 0.8125 |
| 0.0788 | 129.0 | 129 | 0.4132 | 0.8125 |
| 0.0612 | 130.0 | 130 | 0.4131 | 0.8125 |
| 0.0612 | 131.0 | 131 | 0.4136 | 0.8125 |
| 0.0612 | 132.0 | 132 | 0.4144 | 0.8125 |
| 0.0612 | 133.0 | 133 | 0.4148 | 0.8125 |
| 0.0612 | 134.0 | 134 | 0.4154 | 0.8125 |
| 0.0612 | 135.0 | 135 | 0.4167 | 0.8125 |
| 0.0612 | 136.0 | 136 | 0.4181 | 0.8125 |
| 0.0612 | 137.0 | 137 | 0.4199 | 0.8125 |
| 0.0612 | 138.0 | 138 | 0.4211 | 0.8125 |
| 0.0612 | 139.0 | 139 | 0.4222 | 0.8125 |
| 0.0466 | 140.0 | 140 | 0.4243 | 0.8125 |
| 0.0466 | 141.0 | 141 | 0.4256 | 0.8125 |
| 0.0466 | 142.0 | 142 | 0.4278 | 0.8125 |
| 0.0466 | 143.0 | 143 | 0.4280 | 0.8125 |
| 0.0466 | 144.0 | 144 | 0.4286 | 0.8125 |
| 0.0466 | 145.0 | 145 | 0.4294 | 0.8125 |
| 0.0466 | 146.0 | 146 | 0.4311 | 0.8125 |
| 0.0466 | 147.0 | 147 | 0.4332 | 0.8125 |
| 0.0466 | 148.0 | 148 | 0.4351 | 0.7812 |
| 0.0466 | 149.0 | 149 | 0.4371 | 0.7812 |
| 0.036 | 150.0 | 150 | 0.4392 | 0.7812 |
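All accuracy values in the table are multiples of 1/32, which is consistent with a 32-example evaluation split; that split size is an inference from the numbers, not something stated in this card. A hedged sketch of recomputing such an accuracy with the `evaluate` library (the model path and the exact evaluation subset are assumptions):

```python
# Recompute accuracy for a fine-tuned checkpoint on a small SST-2 validation
# subset. The model path and the 32-example subset are assumptions.
import evaluate
import torch
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_dir = "bert-base-uncased-sst-2-16-42"  # placeholder: actual checkpoint path
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForSequenceClassification.from_pretrained(model_dir)
model.eval()

eval_set = load_dataset("glue", "sst2")["validation"].shuffle(seed=42).select(range(32))
accuracy = evaluate.load("accuracy")

preds, refs = [], []
with torch.no_grad():
    for example in eval_set:
        inputs = tokenizer(example["sentence"], return_tensors="pt", truncation=True)
        preds.append(model(**inputs).logits.argmax(dim=-1).item())
        refs.append(example["label"])

print(accuracy.compute(predictions=preds, references=refs))
```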

### Framework versions