# distilbert-finetuning-unhealthyConv-dropout005-epochs-20

This model is a fine-tuned version of a DistilBERT checkpoint on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.3097
- MSE: 0.3097
- RMSE: 0.5565
- MAE: 0.1938
- R²: 0.9443
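These metrics are not independent: RMSE is the square root of MSE, and here the loss equals the MSE, which suggests the model was trained with a mean-squared-error objective (a regression head). A quick sanity check of the reported values:

```python
import math

mse = 0.3097
rmse = math.sqrt(mse)

# RMSE is simply the square root of MSE
print(round(rmse, 4))  # 0.5565, matching the reported value
```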
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
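With a linear scheduler and no recorded warmup, the learning rate decays from 2e-05 toward zero over the full run. The step counts in the results table below (6778 steps per epoch, 135560 total) pin down the schedule; a minimal sketch of that decay, assuming zero warmup steps:

```python
def linear_lr(step, total_steps=135560, base_lr=2e-05):
    """Linear decay from base_lr to 0 over the run.

    Sketch only: assumes no warmup, which this card does not record;
    the actual Trainer scheduler may differ.
    """
    return base_lr * max(0.0, 1.0 - step / total_steps)


print(linear_lr(0))       # 2e-05 at the start of training
print(linear_lr(67780))   # 1e-05 at the halfway point (end of epoch 10)
print(linear_lr(135560))  # 0.0 at the final step
```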
### Training results
| Training Loss | Epoch | Step | Validation Loss | MSE | RMSE | MAE | R² |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|:------:|:------:|
| 1.0552 | 1.0 | 6778 | 1.0304 | 1.0304 | 1.0151 | 0.6908 | 0.8146 |
| 0.7775 | 2.0 | 13556 | 0.8393 | 0.8393 | 0.9162 | 0.6069 | 0.8490 |
| 0.5806 | 3.0 | 20334 | 0.6831 | 0.6831 | 0.8265 | 0.5382 | 0.8771 |
| 0.4068 | 4.0 | 27112 | 0.5803 | 0.5803 | 0.7618 | 0.4616 | 0.8956 |
| 0.3122 | 5.0 | 33890 | 0.5134 | 0.5134 | 0.7165 | 0.4256 | 0.9077 |
| 0.2188 | 6.0 | 40668 | 0.4231 | 0.4231 | 0.6505 | 0.3486 | 0.9239 |
| 0.1637 | 7.0 | 47446 | 0.3956 | 0.3956 | 0.6289 | 0.3185 | 0.9288 |
| 0.1379 | 8.0 | 54224 | 0.3792 | 0.3792 | 0.6158 | 0.3185 | 0.9318 |
| 0.1052 | 9.0 | 61002 | 0.3598 | 0.3598 | 0.5999 | 0.2821 | 0.9353 |
| 0.088  | 10.0 | 67780 | 0.3550 | 0.3550 | 0.5958 | 0.2827 | 0.9361 |
| 0.0747 | 11.0 | 74558 | 0.3549 | 0.3549 | 0.5957 | 0.2897 | 0.9362 |
| 0.0603 | 12.0 | 81336 | 0.3376 | 0.3376 | 0.5811 | 0.2572 | 0.9393 |
| 0.056  | 13.0 | 88114 | 0.3351 | 0.3351 | 0.5789 | 0.2477 | 0.9397 |
| 0.043  | 14.0 | 94892 | 0.3277 | 0.3277 | 0.5725 | 0.2304 | 0.9411 |
| 0.0386 | 15.0 | 101670 | 0.3201 | 0.3201 | 0.5657 | 0.2277 | 0.9424 |
| 0.0354 | 16.0 | 108448 | 0.3167 | 0.3167 | 0.5628 | 0.2084 | 0.9430 |
| 0.0346 | 17.0 | 115226 | 0.3158 | 0.3158 | 0.5620 | 0.2069 | 0.9432 |
| 0.0284 | 18.0 | 122004 | 0.3109 | 0.3109 | 0.5576 | 0.1976 | 0.9441 |
| 0.0271 | 19.0 | 128782 | 0.3098 | 0.3098 | 0.5566 | 0.1937 | 0.9443 |
| 0.0245 | 20.0 | 135560 | 0.3097 | 0.3097 | 0.5565 | 0.1938 | 0.9443 |
### Framework versions
- Transformers 4.32.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3