# distilbert-base-uncased-finetuned-customer-reviews
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified customer-reviews dataset. It achieves the following results on the evaluation set:
- Loss: 0.2025
- Accuracy: 0.9282
- F1: 0.8858
## Model description
More information needed
## Intended uses & limitations
More information needed
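
Pending fuller documentation, here is a minimal inference sketch, assuming the checkpoint is published as a text-classification model under a repo id matching the title (the repo id, example input, and label names are assumptions, not part of the original card):

```python
from transformers import pipeline

# Hypothetical repo id; replace with the actual model location.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-customer-reviews",
)

print(classifier("The product arrived quickly and works perfectly."))
# e.g. [{'label': 'LABEL_1', 'score': 0.98}]  -- label names depend on the fine-tuning config
```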
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
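
For reproducibility, a sketch of how these settings map onto `TrainingArguments` (the `output_dir` is a placeholder, and the dataset/metric wiring is not documented in this card):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-customer-reviews",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",  # assumed: the results table reports metrics once per epoch
)
# The Adam betas (0.9, 0.999) and epsilon (1e-08) listed above are the library defaults,
# so they need not be set explicitly.
```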
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.6594        | 1.0   | 17   | 0.6247          | 0.6667   | 0.0    |
| 0.6259        | 2.0   | 34   | 0.6059          | 0.6667   | 0.0    |
| 0.5948        | 3.0   | 51   | 0.5691          | 0.6925   | 0.2074 |
| 0.5516        | 4.0   | 68   | 0.5241          | 0.7701   | 0.5238 |
| 0.5022        | 5.0   | 85   | 0.4813          | 0.7989   | 0.6111 |
| 0.4698        | 6.0   | 102  | 0.4496          | 0.8276   | 0.6629 |
| 0.4179        | 7.0   | 119  | 0.4137          | 0.8563   | 0.7423 |
| 0.3801        | 8.0   | 136  | 0.3833          | 0.8764   | 0.7817 |
| 0.3396        | 9.0   | 153  | 0.3537          | 0.9023   | 0.8317 |
| 0.3042        | 10.0  | 170  | 0.3256          | 0.9109   | 0.8517 |
| 0.2743        | 11.0  | 187  | 0.3058          | 0.9195   | 0.8692 |
| 0.2472        | 12.0  | 204  | 0.2905          | 0.9195   | 0.8692 |
| 0.2251        | 13.0  | 221  | 0.2780          | 0.9167   | 0.8638 |
| 0.1979        | 14.0  | 238  | 0.2659          | 0.9224   | 0.8744 |
| 0.1864        | 15.0  | 255  | 0.2573          | 0.9224   | 0.8756 |
| 0.158         | 16.0  | 272  | 0.2470          | 0.9253   | 0.8807 |
| 0.1493        | 17.0  | 289  | 0.2357          | 0.9282   | 0.8869 |
| 0.1404        | 18.0  | 306  | 0.2352          | 0.9253   | 0.8807 |
| 0.1364        | 19.0  | 323  | 0.2294          | 0.9282   | 0.8858 |
| 0.1256        | 20.0  | 340  | 0.2225          | 0.9253   | 0.8807 |
| 0.12          | 21.0  | 357  | 0.2160          | 0.9253   | 0.8807 |
| 0.1177        | 22.0  | 374  | 0.2136          | 0.9253   | 0.8807 |
| 0.1115        | 23.0  | 391  | 0.2119          | 0.9253   | 0.8807 |
| 0.1084        | 24.0  | 408  | 0.2090          | 0.9310   | 0.8899 |
| 0.1062        | 25.0  | 425  | 0.2097          | 0.9253   | 0.8807 |
| 0.1078        | 26.0  | 442  | 0.2037          | 0.9253   | 0.8807 |
| 0.102         | 27.0  | 459  | 0.2047          | 0.9253   | 0.8807 |
| 0.1016        | 28.0  | 476  | 0.2025          | 0.9224   | 0.8756 |
| 0.0971        | 29.0  | 493  | 0.2033          | 0.9253   | 0.8807 |
| 0.0985        | 30.0  | 510  | 0.2025          | 0.9282   | 0.8858 |
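
The F1 of 0.0 in the first two epochs alongside a flat 0.6667 accuracy suggests the classifier initially predicts only the majority class of a binary task. A sketch of a `compute_metrics` function consistent with these columns, assuming binary F1 via the `evaluate` library (the card does not specify the metric configuration):

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")  # defaults to binary F1

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    # Take the highest-scoring class per example.
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=predictions, references=labels)["accuracy"],
        "f1": f1.compute(predictions=predictions, references=labels)["f1"],
    }
```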
### Framework versions
- Transformers 4.29.2
- Pytorch 2.0.1
- Datasets 2.13.1
- Tokenizers 0.13.3