# Bert_class_1e-06
This model is a fine-tuned version of guoluo/Bert_1.5e_07 on an unknown dataset. At the final epoch (111) it achieves the following results:
- Train Loss: 0.2359
- Train Accuracy: 0.9271
- Validation Loss: 0.9369
- Validation Accuracy: 0.7394
- Train Lr: 9.938033e-07
- Epoch: 111
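The gap between the train and validation figures above points to overfitting: in the results table further down, validation loss bottoms out around epoch 47 (0.8152) and climbs afterwards, while train loss keeps falling. A quick check of the final-epoch gaps:

```python
# Final-epoch (111) metrics reported above.
train_loss, train_acc = 0.2359, 0.9271
val_loss, val_acc = 0.9369, 0.7394

# Generalization gaps: large values suggest the model fits the
# training set much better than it generalizes.
acc_gap = train_acc - val_acc
loss_gap = val_loss - train_loss
print(f"accuracy gap: {acc_gap:.4f}")  # 0.1877
print(f"loss gap: {loss_gap:.4f}")     # 0.7010
```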
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 9.938033e-07, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
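The hyperparameters above can be mapped back to a Keras optimizer. This is a sketch only: the original training script is not available, and the slowly decreasing "Train Lr" column in the results table suggests the learning rate was adjusted outside the optimizer (e.g., by a callback or scheduler), since the optimizer's own `decay` is reported as 0.0.

```python
import tensorflow as tf

# Reconstruction of the reported optimizer configuration (a sketch,
# not the original training script). The learning rate shown here is
# the final-epoch value; earlier epochs used slightly higher values
# per the "Train Lr" column.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=9.938033e-07,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
```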
### Training results
Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Train Lr | Epoch |
---|---|---|---|---|---|
1.2823 | 0.4776 | 1.0993 | 0.6761 | 1e-06 | 0 |
1.0339 | 0.6776 | 0.9839 | 0.6761 | 9.99999e-07 | 1 |
0.9705 | 0.6776 | 0.9658 | 0.6761 | 9.999969e-07 | 2 |
0.9486 | 0.6776 | 0.9590 | 0.6761 | 9.99994e-07 | 3 |
0.9369 | 0.6776 | 0.9544 | 0.6761 | 9.9999e-07 | 4 |
0.9332 | 0.6776 | 0.9470 | 0.6761 | 9.99985e-07 | 5 |
0.9205 | 0.6776 | 0.9421 | 0.6761 | 9.99979e-07 | 6 |
0.9135 | 0.6776 | 0.9374 | 0.6761 | 9.999719e-07 | 7 |
0.9113 | 0.6776 | 0.9340 | 0.6761 | 9.99964e-07 | 8 |
0.9005 | 0.6776 | 0.9294 | 0.6761 | 9.99955e-07 | 9 |
0.8896 | 0.6776 | 0.9242 | 0.6761 | 9.99945e-07 | 10 |
0.8746 | 0.6800 | 0.9191 | 0.6761 | 9.99934e-07 | 11 |
0.8649 | 0.6824 | 0.9143 | 0.6761 | 9.999219e-07 | 12 |
0.8621 | 0.6847 | 0.9095 | 0.6761 | 9.999089e-07 | 13 |
0.8506 | 0.6847 | 0.9019 | 0.6761 | 9.99895e-07 | 14 |
0.8434 | 0.6800 | 0.8943 | 0.6761 | 9.9988e-07 | 15 |
0.8286 | 0.6871 | 0.8885 | 0.6761 | 9.998639e-07 | 16 |
0.8239 | 0.6824 | 0.8814 | 0.6761 | 9.998469e-07 | 17 |
0.8181 | 0.6894 | 0.8785 | 0.6761 | 9.998289e-07 | 18 |
0.7962 | 0.6894 | 0.8731 | 0.6690 | 9.998099e-07 | 19 |
0.7908 | 0.7012 | 0.8671 | 0.6690 | 9.997899e-07 | 20 |
0.7640 | 0.6988 | 0.8641 | 0.6761 | 9.997689e-07 | 21 |
0.7644 | 0.7035 | 0.8590 | 0.6831 | 9.997469e-07 | 22 |
0.7512 | 0.7200 | 0.8558 | 0.6831 | 9.99724e-07 | 23 |
0.7394 | 0.7200 | 0.8527 | 0.6972 | 9.997e-07 | 24 |
0.7366 | 0.7271 | 0.8501 | 0.7113 | 9.99675e-07 | 25 |
0.7293 | 0.7247 | 0.8471 | 0.7042 | 9.996489e-07 | 26 |
0.7189 | 0.7529 | 0.8479 | 0.7113 | 9.99622e-07 | 27 |
0.7077 | 0.7341 | 0.8411 | 0.7183 | 9.99594e-07 | 28 |
0.6965 | 0.7671 | 0.8409 | 0.7183 | 9.99565e-07 | 29 |
0.6838 | 0.7482 | 0.8372 | 0.7113 | 9.99535e-07 | 30 |
0.6835 | 0.7506 | 0.8362 | 0.7113 | 9.99504e-07 | 31 |
0.6702 | 0.7812 | 0.8365 | 0.6901 | 9.99472e-07 | 32 |
0.6623 | 0.7812 | 0.8323 | 0.7113 | 9.994391e-07 | 33 |
0.6565 | 0.7553 | 0.8298 | 0.6972 | 9.994051e-07 | 34 |
0.6452 | 0.7718 | 0.8291 | 0.6901 | 9.993701e-07 | 35 |
0.6396 | 0.7718 | 0.8284 | 0.7113 | 9.993341e-07 | 36 |
0.6299 | 0.7765 | 0.8262 | 0.6831 | 9.992972e-07 | 37 |
0.6230 | 0.7953 | 0.8364 | 0.7113 | 9.992592e-07 | 38 |
0.6095 | 0.7741 | 0.8233 | 0.7113 | 9.992202e-07 | 39 |
0.6193 | 0.7718 | 0.8206 | 0.7113 | 9.991802e-07 | 40 |
0.6008 | 0.7859 | 0.8260 | 0.7254 | 9.991393e-07 | 41 |
0.5967 | 0.7859 | 0.8199 | 0.7254 | 9.990973e-07 | 42 |
0.5883 | 0.7835 | 0.8189 | 0.7183 | 9.990544e-07 | 43 |
0.5751 | 0.8071 | 0.8279 | 0.7324 | 9.990104e-07 | 44 |
0.5709 | 0.8000 | 0.8204 | 0.7324 | 9.989654e-07 | 45 |
0.5697 | 0.8047 | 0.8229 | 0.7254 | 9.989195e-07 | 46 |
0.5580 | 0.8094 | 0.8152 | 0.7254 | 9.988726e-07 | 47 |
0.5595 | 0.8071 | 0.8275 | 0.7324 | 9.988246e-07 | 48 |
0.5486 | 0.7929 | 0.8168 | 0.7324 | 9.987757e-07 | 49 |
0.5400 | 0.8094 | 0.8239 | 0.7254 | 9.987258e-07 | 50 |
0.5352 | 0.8071 | 0.8190 | 0.7183 | 9.986749e-07 | 51 |
0.5141 | 0.8235 | 0.8171 | 0.7183 | 9.986229e-07 | 52 |
0.5324 | 0.8024 | 0.8191 | 0.7183 | 9.985699e-07 | 53 |
0.5123 | 0.8024 | 0.8279 | 0.7254 | 9.98516e-07 | 54 |
0.5151 | 0.8165 | 0.8213 | 0.7113 | 9.984611e-07 | 55 |
0.4986 | 0.8118 | 0.8176 | 0.7183 | 9.984052e-07 | 56 |
0.4925 | 0.8259 | 0.8208 | 0.7113 | 9.983482e-07 | 57 |
0.4848 | 0.8188 | 0.8182 | 0.7042 | 9.982904e-07 | 58 |
0.4952 | 0.8282 | 0.8214 | 0.7113 | 9.982315e-07 | 59 |
0.4837 | 0.8329 | 0.8192 | 0.7113 | 9.981716e-07 | 60 |
0.4513 | 0.8518 | 0.8224 | 0.7183 | 9.981106e-07 | 61 |
0.4628 | 0.8376 | 0.8227 | 0.7183 | 9.980488e-07 | 62 |
0.4633 | 0.8447 | 0.8246 | 0.7183 | 9.979859e-07 | 63 |
0.4472 | 0.8447 | 0.8256 | 0.7113 | 9.97922e-07 | 64 |
0.4529 | 0.8306 | 0.8285 | 0.7183 | 9.978571e-07 | 65 |
0.4579 | 0.8329 | 0.8331 | 0.7042 | 9.977913e-07 | 66 |
0.4326 | 0.8376 | 0.8278 | 0.7113 | 9.977244e-07 | 67 |
0.4255 | 0.8447 | 0.8265 | 0.7113 | 9.976566e-07 | 68 |
0.4322 | 0.8494 | 0.8293 | 0.7042 | 9.975878e-07 | 69 |
0.4189 | 0.8424 | 0.8382 | 0.7042 | 9.97518e-07 | 70 |
0.4236 | 0.8494 | 0.8302 | 0.7113 | 9.974472e-07 | 71 |
0.4025 | 0.8494 | 0.8364 | 0.7042 | 9.973753e-07 | 72 |
0.4225 | 0.8659 | 0.8370 | 0.7113 | 9.973025e-07 | 73 |
0.4027 | 0.8541 | 0.8377 | 0.7042 | 9.972288e-07 | 74 |
0.4090 | 0.8588 | 0.8381 | 0.7113 | 9.97154e-07 | 75 |
0.3887 | 0.8682 | 0.8378 | 0.7042 | 9.970781e-07 | 76 |
0.4022 | 0.8706 | 0.8406 | 0.7042 | 9.970014e-07 | 77 |
0.3867 | 0.8682 | 0.8457 | 0.7113 | 9.969236e-07 | 78 |
0.3689 | 0.8706 | 0.8460 | 0.7113 | 9.968448e-07 | 79 |
0.3728 | 0.8729 | 0.8527 | 0.7042 | 9.967652e-07 | 80 |
0.3754 | 0.8706 | 0.8525 | 0.7042 | 9.966844e-07 | 81 |
0.3580 | 0.8871 | 0.8531 | 0.7113 | 9.966027e-07 | 82 |
0.3718 | 0.8659 | 0.8593 | 0.7042 | 9.965199e-07 | 83 |
0.3535 | 0.8800 | 0.8593 | 0.7324 | 9.964363e-07 | 84 |
0.3342 | 0.8824 | 0.8704 | 0.6972 | 9.963516e-07 | 85 |
0.3341 | 0.8918 | 0.8630 | 0.7324 | 9.962658e-07 | 86 |
0.3371 | 0.8776 | 0.8698 | 0.7042 | 9.961792e-07 | 87 |
0.3338 | 0.8847 | 0.8689 | 0.7042 | 9.960916e-07 | 88 |
0.3295 | 0.8776 | 0.8753 | 0.6972 | 9.960029e-07 | 89 |
0.3259 | 0.8847 | 0.8696 | 0.7183 | 9.959133e-07 | 90 |
0.3290 | 0.8776 | 0.8726 | 0.7183 | 9.958227e-07 | 91 |
0.3117 | 0.8988 | 0.8798 | 0.7324 | 9.95731e-07 | 92 |
0.3075 | 0.8965 | 0.8836 | 0.7254 | 9.956385e-07 | 93 |
0.2905 | 0.9129 | 0.8868 | 0.7183 | 9.95545e-07 | 94 |
0.2979 | 0.9153 | 0.8888 | 0.7183 | 9.954504e-07 | 95 |
0.3031 | 0.8800 | 0.8956 | 0.7324 | 9.953548e-07 | 96 |
0.2883 | 0.9035 | 0.8984 | 0.7042 | 9.952582e-07 | 97 |
0.2835 | 0.9106 | 0.8969 | 0.7254 | 9.951607e-07 | 98 |
0.2803 | 0.9059 | 0.8998 | 0.7254 | 9.950621e-07 | 99 |
0.2812 | 0.9176 | 0.9034 | 0.7254 | 9.949626e-07 | 100 |
0.2714 | 0.9153 | 0.9028 | 0.7183 | 9.948621e-07 | 101 |
0.2905 | 0.9059 | 0.9144 | 0.7254 | 9.947606e-07 | 102 |
0.2631 | 0.9224 | 0.9143 | 0.6972 | 9.946582e-07 | 103 |
0.2679 | 0.9176 | 0.9180 | 0.7254 | 9.945547e-07 | 104 |
0.2583 | 0.9224 | 0.9206 | 0.7042 | 9.944504e-07 | 105 |
0.2613 | 0.9200 | 0.9286 | 0.7254 | 9.94345e-07 | 106 |
0.2669 | 0.9012 | 0.9237 | 0.7254 | 9.942386e-07 | 107 |
0.2571 | 0.9153 | 0.9351 | 0.7254 | 9.941313e-07 | 108 |
0.2570 | 0.9106 | 0.9306 | 0.7324 | 9.940229e-07 | 109 |
0.2344 | 0.9200 | 0.9396 | 0.7183 | 9.939135e-07 | 110 |
0.2359 | 0.9271 | 0.9369 | 0.7394 | 9.938033e-07 | 111 |
### Framework versions
- Transformers 4.30.0.dev0
- TensorFlow 2.9.1
- Datasets 2.8.0
- Tokenizers 0.13.2