# llama2_xs_233m_GQA-llama-1028-interleaved-deduped-v1-tb-interleaved-deduped-1028-0919

This model is a fine-tuned version of [amazingvince/llama2_xs_233m_GQA-llama-1028-interleaved-deduped-v1-tb-interleaved-deduped-1028-0919](https://huggingface.co/amazingvince/llama2_xs_233m_GQA-llama-1028-interleaved-deduped-v1-tb-interleaved-deduped-1028-0919). The training dataset is not documented in this card. The model achieves the following results on the evaluation set:
- Loss: 3.1626
- Accuracy: 0.4030
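
A minimal usage sketch, assuming the fine-tuned weights are published under the base model's Hugging Face Hub id (substitute the checkpoint's actual repository id or a local path) and that the standard causal-LM interfaces apply:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the checkpoint lives under this Hub id; adjust to the
# fine-tuned model's own repository or a local directory as needed.
model_id = "amazingvince/llama2_xs_233m_GQA-llama-1028-interleaved-deduped-v1-tb-interleaved-deduped-1028-0919"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```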
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 16
- seed: 17404
- distributed_type: multi-GPU
- num_devices: 2
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.95) and epsilon=1e-06
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 2.0
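
A minimal sketch of how these settings map onto the Hugging Face `TrainingArguments` API, assuming the standard `Trainer` was used; `output_dir` is hypothetical, and the 2-GPU distributed launch (e.g. via `torchrun`) happens outside this object:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="llama2_xs_233m_GQA-finetune",  # hypothetical output path
    learning_rate=2e-4,
    per_device_train_batch_size=16,  # x 2 GPUs x 4 accumulation steps = 128 total
    per_device_eval_batch_size=16,   # x 2 GPUs = 32 total
    gradient_accumulation_steps=4,
    seed=17404,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=2.0,
    adam_beta1=0.9,
    adam_beta2=0.95,
    adam_epsilon=1e-6,
)
```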
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 3.2561        | 0.09  | 250  | 3.4413          | 0.3719   |
| 3.2492        | 0.18  | 500  | 3.4013          | 0.3763   |
| 3.1923        | 0.27  | 750  | 3.3739          | 0.3789   |
| 3.1947        | 0.36  | 1000 | 3.3508          | 0.3817   |
| 3.2014        | 0.45  | 1250 | 3.3310          | 0.3837   |
| 3.187         | 0.54  | 1500 | 3.3098          | 0.3859   |
| 3.1083        | 0.63  | 1750 | 3.2901          | 0.3879   |
| 3.0937        | 0.72  | 2000 | 3.2718          | 0.3900   |
| 3.0772        | 0.81  | 2250 | 3.2543          | 0.3920   |
| 3.0102        | 0.9   | 2500 | 3.2394          | 0.3935   |
| 3.0455        | 0.98  | 2750 | 3.2249          | 0.3955   |
| 3.0091        | 1.07  | 3000 | 3.2157          | 0.3965   |
| 2.9399        | 1.16  | 3250 | 3.2067          | 0.3975   |
| 2.9885        | 1.25  | 3500 | 3.1967          | 0.3987   |
| 2.9979        | 1.34  | 3750 | 3.1885          | 0.4002   |
| 2.9405        | 1.43  | 4000 | 3.1823          | 0.4002   |
| 2.9643        | 1.52  | 4250 | 3.1754          | 0.4012   |
| 2.9571        | 1.61  | 4500 | 3.1702          | 0.4020   |
| 2.9677        | 1.7   | 4750 | 3.1668          | 0.4024   |
| 2.9564        | 1.79  | 5000 | 3.1640          | 0.4028   |
| 2.932         | 1.88  | 5250 | 3.1629          | 0.4030   |
| 2.9532        | 1.97  | 5500 | 3.1626          | 0.4030   |
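
For context, assuming the reported loss is the standard mean per-token cross-entropy (in nats) for causal language modeling, the final validation loss corresponds to a perplexity of roughly 23.6:

```python
import math

# Perplexity = exp(cross-entropy); 3.1626 is the final validation loss above.
print(math.exp(3.1626))  # ≈ 23.63
```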
### Framework versions
- Transformers 4.34.0.dev0
- Pytorch 2.2.0.dev20230906+cu121
- Datasets 2.14.5
- Tokenizers 0.13.3