---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# best_model-yelp_polarity-64-21

This model is a fine-tuned version of [albert-base-v2](https://huggingface.co/albert-base-v2) on an unknown dataset. It achieves the following results on the evaluation set (final epoch of the training results below):
- Loss: 0.6300
- Accuracy: 0.9219
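
Since the checkpoint is a sequence-classification fine-tune of `albert-base-v2` (the name suggests Yelp polarity), it can presumably be loaded with the standard `transformers` auto classes. A minimal inference sketch follows; the model id `best_model-yelp_polarity-64-21` is a placeholder for wherever this checkpoint is actually stored, and the negative/positive label mapping is an assumption that should be checked against the model config.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder id/path: replace with the actual location of this checkpoint.
model_id = "best_model-yelp_polarity-64-21"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "The food was great and the staff were friendly."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# For a polarity classifier, indices 0/1 would typically map to negative/positive,
# but the actual mapping should be verified via model.config.id2label.
predicted_class = logits.argmax(dim=-1).item()
print(predicted_class, model.config.id2label.get(predicted_class, predicted_class))
```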

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter list was not captured in this card. From the training results below it can only be inferred that training ran for 150 epochs, with 4 optimization steps per epoch; a hedged reconstruction of a comparable run is sketched after this paragraph.
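
The sketch below shows how a comparable fine-tune could be launched with the `transformers` Trainer API. Every setting in it is an assumption, not the recorded configuration of this checkpoint: the dataset (`yelp_polarity`), batch size (64) and seed (21) are guesses taken from the model name, 150 epochs matches the results table, and the tiny train/eval subsets mirror the 4 steps per epoch and 128-example-like accuracy values seen in the table.

```python
import numpy as np
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Placeholder setup: none of these values are confirmed for this checkpoint.
tokenizer = AutoTokenizer.from_pretrained("albert-base-v2")
model = AutoModelForSequenceClassification.from_pretrained("albert-base-v2", num_labels=2)

raw = load_dataset("yelp_polarity")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

encoded = raw.map(tokenize, batched=True)

def compute_metrics(eval_pred):
    # Report accuracy on the evaluation set, matching the card's metric.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"accuracy": float((preds == labels).mean())}

args = TrainingArguments(
    output_dir="best_model-yelp_polarity-64-21",
    per_device_train_batch_size=64,   # guess from the "-64-" in the model name
    per_device_eval_batch_size=64,
    num_train_epochs=150,             # matches the results table
    evaluation_strategy="epoch",
    seed=21,                          # guess from the "-21" in the model name
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"].select(range(256)),  # small subset, mirroring 4 steps/epoch
    eval_dataset=encoded["test"].select(range(128)),     # small eval split, also an assumption
    compute_metrics=compute_metrics,
)
trainer.train()
```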

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 4    | 0.6653          | 0.9297   |
| No log        | 2.0   | 8    | 0.6599          | 0.9375   |
| 0.3506        | 3.0   | 12   | 0.6517          | 0.9375   |
| 0.3506        | 4.0   | 16   | 0.6448          | 0.9375   |
| 0.4992        | 5.0   | 20   | 0.6507          | 0.9375   |
| 0.4992        | 6.0   | 24   | 0.6967          | 0.9219   |
| 0.4992        | 7.0   | 28   | 0.7602          | 0.9141   |
| 0.3039        | 8.0   | 32   | 0.9351          | 0.8984   |
| 0.3039        | 9.0   | 36   | 0.9244          | 0.8984   |
| 0.2241        | 10.0  | 40   | 0.7974          | 0.9062   |
| 0.2241        | 11.0  | 44   | 0.7229          | 0.9219   |
| 0.2241        | 12.0  | 48   | 0.6981          | 0.9219   |
| 0.1025        | 13.0  | 52   | 0.6961          | 0.9219   |
| 0.1025        | 14.0  | 56   | 0.6819          | 0.9219   |
| 0.1057        | 15.0  | 60   | 0.6655          | 0.9219   |
| 0.1057        | 16.0  | 64   | 0.6463          | 0.9219   |
| 0.1057        | 17.0  | 68   | 0.6240          | 0.9219   |
| 0.0733        | 18.0  | 72   | 0.6086          | 0.9141   |
| 0.0733        | 19.0  | 76   | 0.6109          | 0.9141   |
| 0.0366        | 20.0  | 80   | 0.6219          | 0.9141   |
| 0.0366        | 21.0  | 84   | 0.6291          | 0.9141   |
| 0.0366        | 22.0  | 88   | 0.6227          | 0.9219   |
| 0.0449        | 23.0  | 92   | 0.6182          | 0.9219   |
| 0.0449        | 24.0  | 96   | 0.6148          | 0.9219   |
| 0.0188        | 25.0  | 100  | 0.5999          | 0.9219   |
| 0.0188        | 26.0  | 104  | 0.5537          | 0.9297   |
| 0.0188        | 27.0  | 108  | 0.5538          | 0.9297   |
| 0.0146        | 28.0  | 112  | 0.5492          | 0.9297   |
| 0.0146        | 29.0  | 116  | 0.5275          | 0.9297   |
| 0.0131        | 30.0  | 120  | 0.5212          | 0.9219   |
| 0.0131        | 31.0  | 124  | 0.5486          | 0.9219   |
| 0.0131        | 32.0  | 128  | 0.5641          | 0.9141   |
| 0.0074        | 33.0  | 132  | 0.5489          | 0.9219   |
| 0.0074        | 34.0  | 136  | 0.5426          | 0.9219   |
| 0.0042        | 35.0  | 140  | 0.5468          | 0.9141   |
| 0.0042        | 36.0  | 144  | 0.5411          | 0.9141   |
| 0.0042        | 37.0  | 148  | 0.5366          | 0.9219   |
| 0.0027        | 38.0  | 152  | 0.5306          | 0.9219   |
| 0.0027        | 39.0  | 156  | 0.5182          | 0.9219   |
| 0.0011        | 40.0  | 160  | 0.5096          | 0.9219   |
| 0.0011        | 41.0  | 164  | 0.5059          | 0.9219   |
| 0.0011        | 42.0  | 168  | 0.5130          | 0.9219   |
| 0.0007        | 43.0  | 172  | 0.5198          | 0.9219   |
| 0.0007        | 44.0  | 176  | 0.5172          | 0.9219   |
| 0.0007        | 45.0  | 180  | 0.5129          | 0.9219   |
| 0.0007        | 46.0  | 184  | 0.5337          | 0.9062   |
| 0.0007        | 47.0  | 188  | 0.5600          | 0.9141   |
| 0.0003        | 48.0  | 192  | 0.5687          | 0.9141   |
| 0.0003        | 49.0  | 196  | 0.5413          | 0.9141   |
| 0.0003        | 50.0  | 200  | 0.5270          | 0.9062   |
| 0.0003        | 51.0  | 204  | 0.5249          | 0.9141   |
| 0.0003        | 52.0  | 208  | 0.5315          | 0.9141   |
| 0.0002        | 53.0  | 212  | 0.5528          | 0.9141   |
| 0.0002        | 54.0  | 216  | 0.5732          | 0.9141   |
| 0.0001        | 55.0  | 220  | 0.5812          | 0.9141   |
| 0.0001        | 56.0  | 224  | 0.5871          | 0.9141   |
| 0.0001        | 57.0  | 228  | 0.5854          | 0.9141   |
| 0.0001        | 58.0  | 232  | 0.5846          | 0.9141   |
| 0.0001        | 59.0  | 236  | 0.5842          | 0.9141   |
| 0.0           | 60.0  | 240  | 0.5865          | 0.9141   |
| 0.0           | 61.0  | 244  | 0.5895          | 0.9141   |
| 0.0           | 62.0  | 248  | 0.5908          | 0.9141   |
| 0.0001        | 63.0  | 252  | 0.5911          | 0.9141   |
| 0.0001        | 64.0  | 256  | 0.5905          | 0.9141   |
| 0.0           | 65.0  | 260  | 0.5870          | 0.9141   |
| 0.0           | 66.0  | 264  | 0.5859          | 0.9141   |
| 0.0           | 67.0  | 268  | 0.5863          | 0.9141   |
| 0.0           | 68.0  | 272  | 0.5881          | 0.9141   |
| 0.0           | 69.0  | 276  | 0.5888          | 0.9141   |
| 0.0           | 70.0  | 280  | 0.5902          | 0.9141   |
| 0.0           | 71.0  | 284  | 0.5926          | 0.9141   |
| 0.0           | 72.0  | 288  | 0.5945          | 0.9141   |
| 0.0           | 73.0  | 292  | 0.5949          | 0.9141   |
| 0.0           | 74.0  | 296  | 0.5962          | 0.9141   |
| 0.0           | 75.0  | 300  | 0.5982          | 0.9141   |
| 0.0           | 76.0  | 304  | 0.6003          | 0.9141   |
| 0.0           | 77.0  | 308  | 0.6014          | 0.9141   |
| 0.0           | 78.0  | 312  | 0.6018          | 0.9219   |
| 0.0           | 79.0  | 316  | 0.6024          | 0.9219   |
| 0.0           | 80.0  | 320  | 0.6037          | 0.9219   |
| 0.0           | 81.0  | 324  | 0.6041          | 0.9219   |
| 0.0           | 82.0  | 328  | 0.6052          | 0.9219   |
| 0.0           | 83.0  | 332  | 0.6064          | 0.9219   |
| 0.0           | 84.0  | 336  | 0.6069          | 0.9219   |
| 0.0           | 85.0  | 340  | 0.6069          | 0.9219   |
| 0.0           | 86.0  | 344  | 0.6074          | 0.9219   |
| 0.0           | 87.0  | 348  | 0.6089          | 0.9219   |
| 0.0           | 88.0  | 352  | 0.6098          | 0.9219   |
| 0.0           | 89.0  | 356  | 0.6098          | 0.9219   |
| 0.0           | 90.0  | 360  | 0.6100          | 0.9219   |
| 0.0           | 91.0  | 364  | 0.6098          | 0.9219   |
| 0.0           | 92.0  | 368  | 0.6098          | 0.9219   |
| 0.0           | 93.0  | 372  | 0.6101          | 0.9219   |
| 0.0           | 94.0  | 376  | 0.6111          | 0.9219   |
| 0.0           | 95.0  | 380  | 0.6122          | 0.9219   |
| 0.0           | 96.0  | 384  | 0.6131          | 0.9219   |
| 0.0           | 97.0  | 388  | 0.6122          | 0.9219   |
| 0.0           | 98.0  | 392  | 0.6127          | 0.9219   |
| 0.0           | 99.0  | 396  | 0.6124          | 0.9219   |
| 0.0           | 100.0 | 400  | 0.6120          | 0.9219   |
| 0.0           | 101.0 | 404  | 0.6127          | 0.9219   |
| 0.0           | 102.0 | 408  | 0.6132          | 0.9219   |
| 0.0           | 103.0 | 412  | 0.6140          | 0.9219   |
| 0.0           | 104.0 | 416  | 0.6150          | 0.9219   |
| 0.0           | 105.0 | 420  | 0.6158          | 0.9219   |
| 0.0           | 106.0 | 424  | 0.6160          | 0.9219   |
| 0.0           | 107.0 | 428  | 0.6161          | 0.9219   |
| 0.0           | 108.0 | 432  | 0.6166          | 0.9219   |
| 0.0           | 109.0 | 436  | 0.6168          | 0.9219   |
| 0.0           | 110.0 | 440  | 0.6170          | 0.9219   |
| 0.0           | 111.0 | 444  | 0.6178          | 0.9219   |
| 0.0           | 112.0 | 448  | 0.6184          | 0.9219   |
| 0.0           | 113.0 | 452  | 0.6189          | 0.9219   |
| 0.0           | 114.0 | 456  | 0.6197          | 0.9219   |
| 0.0           | 115.0 | 460  | 0.6213          | 0.9219   |
| 0.0           | 116.0 | 464  | 0.6220          | 0.9219   |
| 0.0           | 117.0 | 468  | 0.6226          | 0.9219   |
| 0.0           | 118.0 | 472  | 0.6229          | 0.9219   |
| 0.0           | 119.0 | 476  | 0.6235          | 0.9219   |
| 0.0           | 120.0 | 480  | 0.6219          | 0.9219   |
| 0.0           | 121.0 | 484  | 0.6219          | 0.9219   |
| 0.0           | 122.0 | 488  | 0.6223          | 0.9219   |
| 0.0           | 123.0 | 492  | 0.6236          | 0.9219   |
| 0.0           | 124.0 | 496  | 0.6246          | 0.9219   |
| 0.0           | 125.0 | 500  | 0.6259          | 0.9219   |
| 0.0           | 126.0 | 504  | 0.6265          | 0.9219   |
| 0.0           | 127.0 | 508  | 0.6270          | 0.9219   |
| 0.0           | 128.0 | 512  | 0.6272          | 0.9219   |
| 0.0           | 129.0 | 516  | 0.6271          | 0.9219   |
| 0.0           | 130.0 | 520  | 0.6262          | 0.9219   |
| 0.0           | 131.0 | 524  | 0.6257          | 0.9219   |
| 0.0           | 132.0 | 528  | 0.6255          | 0.9219   |
| 0.0           | 133.0 | 532  | 0.6258          | 0.9219   |
| 0.0           | 134.0 | 536  | 0.6262          | 0.9219   |
| 0.0           | 135.0 | 540  | 0.6272          | 0.9219   |
| 0.0           | 136.0 | 544  | 0.6277          | 0.9219   |
| 0.0           | 137.0 | 548  | 0.6286          | 0.9219   |
| 0.0           | 138.0 | 552  | 0.6288          | 0.9219   |
| 0.0           | 139.0 | 556  | 0.6292          | 0.9219   |
| 0.0           | 140.0 | 560  | 0.6295          | 0.9219   |
| 0.0           | 141.0 | 564  | 0.6293          | 0.9219   |
| 0.0           | 142.0 | 568  | 0.6294          | 0.9219   |
| 0.0           | 143.0 | 572  | 0.6296          | 0.9219   |
| 0.0           | 144.0 | 576  | 0.6299          | 0.9219   |
| 0.0           | 145.0 | 580  | 0.6297          | 0.9219   |
| 0.0           | 146.0 | 584  | 0.6299          | 0.9219   |
| 0.0           | 147.0 | 588  | 0.6300          | 0.9219   |
| 0.0           | 148.0 | 592  | 0.6300          | 0.9219   |
| 0.0           | 149.0 | 596  | 0.6300          | 0.9219   |
| 0.0           | 150.0 | 600  | 0.6300          | 0.9219   |

### Framework versions