# BERT-QA-b8
This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 6.2383
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.01
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
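As a reproduction aid, the derived quantities in the list above can be checked in code. The following is a minimal sketch in plain Python, not the actual Trainer configuration; the total step count of 21600 is taken from the results table below, and the schedule assumes no warmup, which this card does not state either way.

```python
# Hedged sketch: recompute derived training quantities from the
# hyperparameters listed in this model card.
learning_rate = 0.01
train_batch_size = 32
gradient_accumulation_steps = 2
num_epochs = 50
total_steps = 21600  # final optimizer step reported in the results table

# Effective (total) train batch size = per-device batch * accumulation steps.
total_train_batch_size = train_batch_size * gradient_accumulation_steps

def linear_lr(step: int) -> float:
    """Linear decay from the base rate to 0 over training (assumes no warmup)."""
    return learning_rate * max(0.0, 1.0 - step / total_steps)

print(total_train_batch_size)  # 64, matching the card
print(linear_lr(0))            # 0.01 at the start of training
print(linear_lr(total_steps))  # 0.0 at the final step
```

This reproduces the listed `total_train_batch_size` of 64 and the linear decay implied by `lr_scheduler_type: linear`.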
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
No log | 1.0 | 432 | 6.2383 |
6.2405 | 2.0 | 865 | 6.2383 |
6.2389 | 3.0 | 1297 | 6.2383 |
6.2385 | 4.0 | 1730 | 6.2383 |
6.2384 | 5.0 | 2162 | 6.2383 |
6.2385 | 6.0 | 2595 | 6.2383 |
6.2384 | 7.0 | 3027 | 6.2383 |
6.2384 | 8.0 | 3460 | 6.2383 |
6.2384 | 9.0 | 3892 | 6.2383 |
6.2384 | 10.0 | 4325 | 6.2383 |
6.2384 | 11.0 | 4757 | 6.2383 |
6.2384 | 12.0 | 5190 | 6.2383 |
6.2384 | 13.0 | 5622 | 6.2383 |
6.2384 | 14.0 | 6055 | 6.2383 |
6.2384 | 15.0 | 6487 | 6.2383 |
6.2384 | 16.0 | 6920 | 6.2383 |
6.2384 | 17.0 | 7352 | 6.2383 |
6.2384 | 18.0 | 7785 | 6.2383 |
6.2384 | 19.0 | 8217 | 6.2383 |
6.2385 | 20.0 | 8650 | 6.2383 |
6.2385 | 21.0 | 9082 | 6.2383 |
6.2384 | 22.0 | 9515 | 6.2383 |
6.2384 | 23.0 | 9947 | 6.2383 |
6.2384 | 24.0 | 10380 | 6.2383 |
6.2385 | 25.0 | 10812 | 6.2383 |
6.2384 | 26.0 | 11245 | 6.2383 |
6.2384 | 27.0 | 11677 | 6.2383 |
6.2384 | 28.0 | 12110 | 6.2383 |
6.2384 | 29.0 | 12542 | 6.2383 |
6.2384 | 30.0 | 12975 | 6.2383 |
6.2384 | 31.0 | 13407 | 6.2383 |
6.2384 | 32.0 | 13840 | 6.2383 |
6.2384 | 33.0 | 14272 | 6.2383 |
6.2384 | 34.0 | 14705 | 6.2383 |
6.2384 | 35.0 | 15137 | 6.2383 |
6.2383 | 36.0 | 15570 | 6.2383 |
6.2384 | 37.0 | 16002 | 6.2383 |
6.2384 | 38.0 | 16435 | 6.2383 |
6.2384 | 39.0 | 16867 | 6.2383 |
6.2384 | 40.0 | 17300 | 6.2383 |
6.2384 | 41.0 | 17732 | 6.2383 |
6.2383 | 42.0 | 18165 | 6.2383 |
6.2383 | 43.0 | 18597 | 6.2383 |
6.2384 | 44.0 | 19030 | 6.2383 |
6.2384 | 45.0 | 19462 | 6.2383 |
6.2383 | 46.0 | 19895 | 6.2383 |
6.2383 | 47.0 | 20327 | 6.2383 |
6.2384 | 48.0 | 20760 | 6.2383 |
6.2383 | 49.0 | 21192 | 6.2383 |
6.2383 | 49.94 | 21600 | 6.2383 |
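Although the dataset is listed as unknown, the step counts in the table imply its approximate size. A back-of-the-envelope check, assuming the effective batch size of 64 from the hyperparameters section (the exact dataset size is not stated in this card):

```python
# Hedged estimate: infer the approximate training-set size from the
# step counts in the results table. The dataset itself is unknown.
total_steps = 21600          # final step in the results table
num_epochs = 50
total_train_batch_size = 64  # from the hyperparameters section

steps_per_epoch = total_steps // num_epochs           # 432
approx_examples = steps_per_epoch * total_train_batch_size

print(steps_per_epoch)   # 432, matching the first epoch's step count
print(approx_examples)   # 27648 -> roughly 27-28k training examples
```

Note also that the validation loss never moves from 6.2383 across all 50 epochs, which suggests the run did not learn; the 0.01 learning rate may be too high for fine-tuning a BERT-style model.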
### Framework versions
- Transformers 4.33.0
- PyTorch 2.0.0
- Datasets 2.5.2
- Tokenizers 0.13.3