# fine-tuned-IndoNLI-Basic-with-xlm-roberta-large-LR-1e-05
This model is a fine-tuned version of [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) on the IndoNLI dataset. It achieves the following results on the evaluation set:

- Loss: 0.5019
- Accuracy: 0.8243
- F1: 0.8245
## Model description

This model is XLM-RoBERTa (large) fine-tuned for Indonesian natural language inference: given a premise-hypothesis pair, it predicts one of three labels (entailment, neutral, or contradiction).
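
A minimal inference sketch follows. The checkpoint id is hypothetical (taken from this card's title); substitute the repo id the model is actually published under. Label names come from the saved config.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical checkpoint id, taken from this card's title.
model_id = "fine-tuned-IndoNLI-Basic-with-xlm-roberta-large-LR-1e-05"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "Ibu sedang memasak di dapur."    # "Mother is cooking in the kitchen."
hypothesis = "Ibu berada di dalam rumah."   # "Mother is inside the house."

# The premise and hypothesis are encoded together as a single sentence pair.
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])  # label names depend on the saved config
```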
## Intended uses & limitations

The model is intended for sentence-pair NLI classification in Indonesian. It has only been evaluated on the IndoNLI evaluation set reported below; behavior on other domains, text genres, or languages has not been measured.
## Training and evaluation data

The model was trained and evaluated on IndoNLI, an Indonesian natural language inference dataset of premise-hypothesis pairs labeled as entailment, neutral, or contradiction.
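
The auto-generated card does not record a dataset id; the sketch below assumes the `indonli` dataset on the Hugging Face Hub, inferred from the model name.

```python
from datasets import load_dataset

dataset = load_dataset("indonli")  # hypothetical id, inferred from the model name
print(dataset)                     # split names follow the dataset's Hub card
print(dataset["train"][0])         # examples carry `premise`, `hypothesis`, `label` fields
```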
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
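
For reference, a sketch of how these values map onto `transformers.TrainingArguments`. The hyperparameter values are copied from the list above; `output_dir` and everything omitted are illustrative. The listed Adam betas and epsilon are the library defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="fine-tuned-IndoNLI-Basic-with-xlm-roberta-large-LR-1e-05",  # illustrative
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=16,  # 8 x 16 = 128 effective train batch size
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```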
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 1.1396        | 0.5   | 40   | 1.0955          | 0.3696   | 0.2355 |
| 1.108         | 0.99  | 80   | 1.0433          | 0.4406   | 0.3665 |
| 1.0644        | 1.49  | 120  | 0.9406          | 0.5321   | 0.5293 |
| 0.963         | 1.98  | 160  | 0.9097          | 0.6154   | 0.6192 |
| 0.8825        | 2.48  | 200  | 0.7810          | 0.6891   | 0.6898 |
| 0.8825        | 2.97  | 240  | 0.7141          | 0.7196   | 0.7216 |
| 0.8145        | 3.47  | 280  | 0.7784          | 0.7219   | 0.7238 |
| 0.7253        | 3.96  | 320  | 0.6165          | 0.7711   | 0.7716 |
| 0.6706        | 4.46  | 360  | 0.6133          | 0.7597   | 0.7582 |
| 0.6356        | 4.95  | 400  | 0.5849          | 0.7833   | 0.7826 |
| 0.6356        | 5.45  | 440  | 0.5443          | 0.7979   | 0.7980 |
| 0.5919        | 5.94  | 480  | 0.5335          | 0.8093   | 0.8101 |
| 0.5509        | 6.44  | 520  | 0.5256          | 0.8157   | 0.8165 |
| 0.5286        | 6.93  | 560  | 0.5127          | 0.8107   | 0.8101 |
| 0.5081        | 7.43  | 600  | 0.5160          | 0.8170   | 0.8173 |
| 0.5081        | 7.93  | 640  | 0.5037          | 0.8220   | 0.8222 |
| 0.5077        | 8.42  | 680  | 0.4961          | 0.8207   | 0.8210 |
| 0.4829        | 8.92  | 720  | 0.5016          | 0.8266   | 0.8268 |
| 0.4585        | 9.41  | 760  | 0.5043          | 0.8229   | 0.8227 |
| 0.4712        | 9.91  | 800  | 0.5019          | 0.8243   | 0.8245 |
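
The Accuracy and F1 columns above were presumably produced by a `compute_metrics` callback passed to the `Trainer`. Below is a sketch of one such function, assuming macro-averaged F1 over the three labels; the card does not record the averaging mode.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    # `eval_pred` is a (logits, labels) pair supplied by the Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels)["accuracy"],
        "f1": f1.compute(predictions=preds, references=labels, average="macro")["f1"],
    }
```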
### Framework versions

- Transformers 4.29.0.dev0
- Pytorch 1.13.1+cu117
- Datasets 2.2.0
- Tokenizers 0.13.2