# roberta-base-finetuned-cv

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1731
- Precision: 0.7668
- Recall: 0.8235
- F1: 0.7941
- Accuracy: 0.9699
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
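
The hyperparameters above can be expressed as a `TrainingArguments` configuration (a minimal sketch; the actual `Trainer` setup, model head, and dataset preprocessing are not documented in this card, and `output_dir` is a placeholder):

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the training configuration from the
# hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="roberta-base-finetuned-cv",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
)
```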
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 17 | 0.4505 | 0.0 | 0.0 | 0.0 | 0.8958 |
| No log | 2.0 | 34 | 0.4233 | 0.1936 | 0.3653 | 0.2530 | 0.8852 |
| No log | 3.0 | 51 | 0.2362 | 0.4170 | 0.5294 | 0.4666 | 0.9394 |
| No log | 4.0 | 68 | 0.1773 | 0.5681 | 0.6850 | 0.6211 | 0.9552 |
| No log | 5.0 | 85 | 0.1570 | 0.7360 | 0.7486 | 0.7422 | 0.9635 |
| No log | 6.0 | 102 | 0.1592 | 0.7308 | 0.7495 | 0.7400 | 0.9620 |
| No log | 7.0 | 119 | 0.1560 | 0.6909 | 0.8378 | 0.7573 | 0.9637 |
| No log | 8.0 | 136 | 0.1603 | 0.7520 | 0.7913 | 0.7712 | 0.9665 |
| No log | 9.0 | 153 | 0.1453 | 0.7306 | 0.8207 | 0.7730 | 0.9654 |
| No log | 10.0 | 170 | 0.1509 | 0.7469 | 0.8539 | 0.7968 | 0.9667 |
| No log | 11.0 | 187 | 0.1541 | 0.7594 | 0.8235 | 0.7902 | 0.9686 |
| No log | 12.0 | 204 | 0.1514 | 0.7585 | 0.8283 | 0.7918 | 0.9682 |
| No log | 13.0 | 221 | 0.1583 | 0.7823 | 0.8216 | 0.8015 | 0.9690 |
| No log | 14.0 | 238 | 0.1571 | 0.7519 | 0.8397 | 0.7934 | 0.9686 |
| No log | 15.0 | 255 | 0.1590 | 0.7851 | 0.8283 | 0.8061 | 0.9702 |
| No log | 16.0 | 272 | 0.1594 | 0.7598 | 0.8491 | 0.8020 | 0.9697 |
| No log | 17.0 | 289 | 0.1757 | 0.7799 | 0.8169 | 0.7980 | 0.9698 |
| No log | 18.0 | 306 | 0.1687 | 0.7768 | 0.8188 | 0.7972 | 0.9696 |
| No log | 19.0 | 323 | 0.1706 | 0.7653 | 0.8321 | 0.7973 | 0.9700 |
| No log | 20.0 | 340 | 0.1731 | 0.7668 | 0.8235 | 0.7941 | 0.9699 |
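
As a quick sanity check, the final-epoch F1 in the table is consistent with the standard harmonic mean of the reported precision and recall (a minimal sketch; the card does not state which metric library produced these numbers):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Final-epoch values from the table above.
print(round(f1_score(0.7668, 0.8235), 4))  # → 0.7941, matching the reported F1
```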
### Framework versions
- Transformers 4.27.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
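
To reproduce this environment, the versions above can be pinned directly (a sketch; it assumes the CUDA 11.6 PyTorch wheel index, which matches the `+cu116` build tag listed above):

```shell
# Pin the framework versions from this card; the --extra-index-url is an
# assumption needed to resolve the +cu116 PyTorch build.
pip install transformers==4.27.1 datasets==2.10.1 tokenizers==0.13.2 \
    torch==1.13.1+cu116 --extra-index-url https://download.pytorch.org/whl/cu116
```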