# Longformer_v5
This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):
- Loss: 0.7919
- Precision: 0.8516
- Recall: 0.8678
- F1: 0.6520
- Accuracy: 0.8259
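
The precision, recall, F1, and accuracy reported above are characteristic of a token-classification (sequence-labeling) evaluation, although the card does not state the task. Below is a minimal inference sketch under that assumption; the model id `your-username/Longformer_v5` is a placeholder for wherever this checkpoint is actually hosted.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Placeholder id: substitute the actual repository path of this fine-tuned checkpoint.
model_id = "your-username/Longformer_v5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)  # assumes a token-classification head

# longformer-base-4096 accepts inputs up to 4096 tokens, so long documents can be tagged in one pass.
tagger = pipeline("token-classification", model=model, tokenizer=tokenizer, aggregation_strategy="simple")
print(tagger("Replace this string with a long document to label."))
```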
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 7
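
The hyperparameters above correspond roughly to the `transformers.TrainingArguments` shown below. This is a reconstruction for reference, not the original training script; the output directory and the per-epoch evaluation cadence are assumptions.

```python
from transformers import TrainingArguments

# Sketch mirroring the listed hyperparameters (output_dir is a placeholder;
# evaluation cadence is inferred from the per-epoch validation results).
training_args = TrainingArguments(
    output_dir="Longformer_v5",        # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=8,     # effective train batch size of 8
    num_train_epochs=7,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",       # assumption, matching the epoch-level results table
)
```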
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.7744        | 1.0   | 1012 | 0.5785          | 0.8375    | 0.8501 | 0.5798 | 0.8098   |
| 0.5211        | 2.0   | 2024 | 0.5415          | 0.8434    | 0.8801 | 0.6251 | 0.8282   |
| 0.3996        | 3.0   | 3036 | 0.5565          | 0.8500    | 0.8766 | 0.6303 | 0.8274   |
| 0.2964        | 4.0   | 4048 | 0.6017          | 0.8617    | 0.8546 | 0.6415 | 0.8240   |
| 0.2187        | 5.0   | 5060 | 0.6660          | 0.8485    | 0.8718 | 0.6431 | 0.8271   |
| 0.1603        | 6.0   | 6072 | 0.7235          | 0.8493    | 0.8759 | 0.6544 | 0.8290   |
| 0.1208        | 7.0   | 7084 | 0.7919          | 0.8516    | 0.8678 | 0.6520 | 0.8259   |
### Framework versions
- Transformers 4.18.0
- Pytorch 1.10.0+cu111
- Datasets 2.1.0
- Tokenizers 0.12.1