# Regression_bert_1500
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset. It achieves the following results at the final training epoch (training metrics on the training set, validation metrics on the evaluation set):
- Train Loss: 0.2224
- Train Mae: 0.4296
- Train Mse: 0.2717
- Train R2-score: 0.8508
- Validation Loss: 0.1846
- Validation Mae: 0.4542
- Validation Mse: 0.2649
- Validation R2-score: 0.7458
- Epoch: 39
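This card does not yet document usage. As a hedged sketch (the repository id of this fine-tuned checkpoint is not given here, so the example builds an untrained regression head from the base config), a DistilBERT regression model of this shape can be instantiated with the `transformers` library:

```python
from transformers import DistilBertConfig, TFDistilBertForSequenceClassification

# Build a DistilBERT encoder with a single-output regression head.
# num_labels=1 plus problem_type="regression" makes the model emit one
# continuous value per input and use a regression loss during fine-tuning.
config = DistilBertConfig(num_labels=1, problem_type="regression")
model = TFDistilBertForSequenceClassification(config)

# To load the actual fine-tuned weights, one would instead call
# TFDistilBertForSequenceClassification.from_pretrained(...) with the
# checkpoint id, which is not stated in this card.
```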
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 1e-04, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Mae | Train Mse | Train R2-score | Validation Loss | Validation Mae | Validation Mse | Validation R2-score | Epoch |
|:----------:|:---------:|:---------:|:--------------:|:---------------:|:--------------:|:--------------:|:-------------------:|:-----:|
0.2245 | 0.4557 | 0.2970 | 0.6980 | 0.1019 | 0.3869 | 0.1805 | 0.6789 | 0 |
0.1102 | 0.3741 | 0.1848 | 0.8480 | 0.0839 | 0.2912 | 0.1209 | 0.8311 | 1 |
0.0845 | 0.3575 | 0.1689 | 0.2888 | 0.0620 | 0.3294 | 0.1463 | 0.8421 | 2 |
0.0554 | 0.3231 | 0.1366 | 0.7118 | 0.0613 | 0.2950 | 0.1305 | 0.7960 | 3 |
0.0481 | 0.3071 | 0.1276 | 0.8265 | 0.0516 | 0.3112 | 0.1357 | 0.8145 | 4 |
0.0461 | 0.2876 | 0.1168 | 0.8131 | 0.0524 | 0.2770 | 0.1097 | 0.8434 | 5 |
0.0454 | 0.2809 | 0.1140 | 0.7029 | 0.0502 | 0.3041 | 0.1340 | 0.8063 | 6 |
0.0448 | 0.2930 | 0.1212 | 0.6336 | 0.0514 | 0.2761 | 0.1143 | 0.8154 | 7 |
0.0446 | 0.2728 | 0.1079 | 0.8086 | 0.0532 | 0.2696 | 0.1145 | 0.8033 | 8 |
0.0450 | 0.2733 | 0.1086 | 0.6564 | 0.0504 | 0.2590 | 0.1033 | 0.8335 | 9 |
0.0792 | 0.3267 | 0.1471 | 0.5533 | 0.0848 | 0.3585 | 0.1709 | 0.7364 | 10 |
0.0673 | 0.3235 | 0.1407 | 0.6299 | 0.0621 | 0.3478 | 0.1549 | 0.7915 | 11 |
0.0571 | 0.3151 | 0.1347 | 0.6762 | 0.0729 | 0.2784 | 0.1203 | 0.8483 | 12 |
0.0528 | 0.2797 | 0.1104 | 0.7393 | 0.0642 | 0.2901 | 0.1291 | 0.7939 | 13 |
0.0490 | 0.2930 | 0.1187 | 0.6853 | 0.0821 | 0.2995 | 0.1446 | 0.7879 | 14 |
0.0566 | 0.2789 | 0.1120 | 0.7938 | 0.0878 | 0.3242 | 0.1589 | 0.8498 | 15 |
0.1926 | 0.4228 | 0.2638 | 0.6635 | 0.2201 | 0.4528 | 0.2705 | 0.6836 | 16 |
0.1921 | 0.4231 | 0.2458 | 0.7452 | 0.1018 | 0.3940 | 0.1900 | 0.7454 | 17 |
0.1288 | 0.4126 | 0.2154 | 0.6279 | 0.1035 | 0.3943 | 0.1991 | 0.7491 | 18 |
0.1897 | 0.4375 | 0.2617 | 0.8354 | 0.2664 | 0.4703 | 0.3447 | 0.7632 | 19 |
0.2095 | 0.4337 | 0.2676 | 0.8187 | 0.2730 | 0.4652 | 0.3425 | 0.8093 | 20 |
0.2682 | 0.4555 | 0.3358 | 0.7789 | 0.2851 | 0.4590 | 0.3518 | 0.8013 | 21 |
0.2416 | 0.4616 | 0.3213 | 0.7117 | 0.2466 | 0.4999 | 0.3667 | 0.6966 | 22 |
0.1798 | 0.4569 | 0.2737 | 0.5086 | 0.0945 | 0.4097 | 0.2023 | 0.7475 | 23 |
0.1361 | 0.4186 | 0.2140 | 0.5820 | 0.1181 | 0.4135 | 0.2075 | 0.7951 | 24 |
0.1660 | 0.4334 | 0.2322 | 0.4041 | 0.2522 | 0.4329 | 0.3019 | 0.7785 | 25 |
0.3046 | 0.4603 | 0.3555 | 0.7796 | 0.2886 | 0.4894 | 0.3587 | 0.7229 | 26 |
0.2843 | 0.4723 | 0.3491 | 0.7267 | 0.2937 | 0.4712 | 0.3616 | 0.7308 | 27 |
0.2793 | 0.4531 | 0.3379 | 0.7893 | 0.2952 | 0.4762 | 0.3738 | 0.7540 | 28 |
0.2410 | 0.4286 | 0.2899 | 0.8189 | 0.2358 | 0.4379 | 0.2932 | 0.7727 | 29 |
0.2270 | 0.4203 | 0.2691 | 0.8224 | 0.2361 | 0.4466 | 0.2992 | 0.7742 | 30 |
0.2260 | 0.4341 | 0.2776 | 0.8716 | 0.2290 | 0.4345 | 0.2885 | 0.7809 | 31 |
0.2198 | 0.4225 | 0.2629 | 0.7666 | 0.2303 | 0.4392 | 0.2895 | 0.7775 | 32 |
0.2247 | 0.4245 | 0.2709 | 0.8193 | 0.2291 | 0.4424 | 0.2921 | 0.7813 | 33 |
0.2241 | 0.4276 | 0.2704 | 0.6859 | 0.2280 | 0.4430 | 0.2960 | 0.7879 | 34 |
0.2239 | 0.4292 | 0.2709 | 0.8308 | 0.2314 | 0.4445 | 0.2893 | 0.7641 | 35 |
0.2174 | 0.4251 | 0.2637 | 0.7179 | 0.2354 | 0.4565 | 0.3017 | 0.7566 | 36 |
0.2211 | 0.4257 | 0.2686 | 0.7896 | 0.2409 | 0.4467 | 0.3014 | 0.7683 | 37 |
0.2321 | 0.4218 | 0.2745 | 0.8021 | 0.2413 | 0.4525 | 0.3062 | 0.7663 | 38 |
0.2224 | 0.4296 | 0.2717 | 0.8508 | 0.1846 | 0.4542 | 0.2649 | 0.7458 | 39 |
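For reference, the Mae, Mse, and R2-score columns above follow the standard definitions. A minimal pure-Python sketch on toy values (not the model's actual predictions):

```python
# Standard definitions of the metrics reported in the table, applied to toy data.
def mae(y_true, y_pred):
    # Mean absolute error: average magnitude of the residuals.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mse(y_true, y_pred):
    # Mean squared error: average squared residual.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    # Coefficient of determination: 1 - residual sum of squares / total sum of squares.
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

y_true = [1.0, 2.0, 3.0, 4.0]
y_pred = [1.1, 1.9, 3.2, 3.8]

print(mae(y_true, y_pred))  # approximately 0.15
print(mse(y_true, y_pred))  # approximately 0.025
print(r2(y_true, y_pred))   # approximately 0.98
```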
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.12.0
- Datasets 2.11.0
- Tokenizers 0.13.3