# fine-tuned-IndoNLI-Basic-with-indobert-large-p2-LR-1e-05
This model is a fine-tuned version of indobenchmark/indobert-large-p2 on the IndoNLI dataset (basic split). It achieves the following results on the evaluation set:
- Loss: 0.6963
- Accuracy: 0.7724
- F1: 0.7724
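The reported Accuracy and F1 coincide at 0.7724. A minimal pure-Python sketch of how these two metrics are computed (assuming weighted-average F1, which is common in auto-generated cards; the card itself does not state the averaging mode):

```python
# Toy 3-way NLI example (0=entailment, 1=neutral, 2=contradiction).
# These labels and values are illustrative, not the real IndoNLI eval set.
from collections import Counter

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_f1(y_true, y_pred):
    # Per-class F1, averaged with weights proportional to class support.
    support = Counter(y_true)
    total = 0.0
    for lab in support:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == lab and p == lab)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != lab and p == lab)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == lab and p != lab)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if (precision + recall) else 0.0)
        total += f1 * support[lab] / len(y_true)
    return total

y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
print(accuracy(y_true, y_pred))     # 4 of 6 correct
print(weighted_f1(y_true, y_pred))
```

With balanced classes, weighted F1 can land very close to accuracy, which is consistent with the two headline numbers matching to four decimals.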
## Model description
More information needed
## Intended uses & limitations
More information needed
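Pending a fuller write-up, a hedged sketch of how a checkpoint like this is typically used for Indonesian NLI. The repo id below is a hypothetical placeholder, and the label set is the usual 3-way NLI convention, which should be verified against `model.config.id2label`:

```python
# Assumption: the fine-tuned checkpoint is published under some Hub repo id;
# "your-username/fine-tuned-IndoNLI-Basic-with-indobert-large-p2-LR-1e-05"
# below is a placeholder, not a confirmed location.
NLI_LABELS = ("entailment", "neutral", "contradiction")  # assumed order

def predict_nli(premise: str, hypothesis: str,
                model_id: str = "your-username/fine-tuned-IndoNLI-Basic-with-indobert-large-p2-LR-1e-05"):
    # Imports kept local so the sketch is readable without transformers installed.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    # Sentence-pair input: premise and hypothesis are encoded together.
    inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return model.config.id2label[int(logits.argmax(dim=-1))]
```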
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
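Two quantities in the list above can be derived: the total train batch size is the per-device batch size times the gradient accumulation steps, and `lr_scheduler_type: linear` (with no warmup) decays the learning rate linearly from 1e-05 to 0. A small sketch, where `total_steps=800` is an illustrative assumption since the card does not state the total step count:

```python
# Effective batch size: 8 samples per step, gradients accumulated over 16 steps.
train_batch_size = 8
gradient_accumulation_steps = 16
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 128

def linear_lr(step, base_lr=1e-05, total_steps=800):
    # Linear decay from base_lr to 0 with zero warmup steps;
    # total_steps=800 is an assumed value for illustration.
    return base_lr * max(0.0, 1.0 - step / total_steps)
```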
## Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 1.3098        | 0.5   | 40   | 0.8899          | 0.6231   | 0.6268 |
| 1.0199        | 0.99  | 80   | 0.7268          | 0.6996   | 0.6999 |
| 0.767         | 1.49  | 120  | 0.6616          | 0.7406   | 0.7418 |
| 0.6649        | 1.98  | 160  | 0.6224          | 0.7547   | 0.7557 |
| 0.5796        | 2.48  | 200  | 0.6114          | 0.7656   | 0.7645 |
| 0.5796        | 2.97  | 240  | 0.6236          | 0.7524   | 0.7540 |
| 0.54          | 3.47  | 280  | 0.6223          | 0.7615   | 0.7624 |
| 0.4757        | 3.96  | 320  | 0.5965          | 0.7706   | 0.7721 |
| 0.4492        | 4.46  | 360  | 0.6216          | 0.7679   | 0.7681 |
| 0.3981        | 4.95  | 400  | 0.6347          | 0.7651   | 0.7669 |
| 0.3981        | 5.45  | 440  | 0.6373          | 0.7715   | 0.7727 |
| 0.352         | 5.94  | 480  | 0.6505          | 0.7674   | 0.7690 |
| 0.3294        | 6.44  | 520  | 0.6627          | 0.7720   | 0.7731 |
| 0.3058        | 6.93  | 560  | 0.6743          | 0.7660   | 0.7674 |
| 0.2692        | 7.43  | 600  | 0.6846          | 0.7665   | 0.7678 |
| 0.2692        | 7.93  | 640  | 0.6963          | 0.7724   | 0.7724 |
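Reading the table programmatically: the headline numbers (loss 0.6963, accuracy/F1 0.7724) are those of the final logged step, while the best validation F1 in the table, 0.7731, was logged earlier at step 520. Rows below are copied from the table as `(step, val_loss, accuracy, f1)`:

```python
# Evaluation rows from the training-results table above.
rows = [
    (40, 0.8899, 0.6231, 0.6268),
    (80, 0.7268, 0.6996, 0.6999),
    (120, 0.6616, 0.7406, 0.7418),
    (160, 0.6224, 0.7547, 0.7557),
    (200, 0.6114, 0.7656, 0.7645),
    (240, 0.6236, 0.7524, 0.7540),
    (280, 0.6223, 0.7615, 0.7624),
    (320, 0.5965, 0.7706, 0.7721),
    (360, 0.6216, 0.7679, 0.7681),
    (400, 0.6347, 0.7651, 0.7669),
    (440, 0.6373, 0.7715, 0.7727),
    (480, 0.6505, 0.7674, 0.7690),
    (520, 0.6627, 0.7720, 0.7731),
    (560, 0.6743, 0.7660, 0.7674),
    (600, 0.6846, 0.7665, 0.7678),
    (640, 0.6963, 0.7724, 0.7724),
]

# Pick the checkpoint with the highest validation F1 (index 3 in each row).
best = max(rows, key=lambda r: r[3])
print(best)  # -> (520, 0.6627, 0.772, 0.7731)
```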
### Framework versions
- Transformers 4.29.0.dev0
- Pytorch 1.13.1+cu117
- Datasets 2.2.0
- Tokenizers 0.13.2