# t5-base-finetuned-jamendo-1-epochs
This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) (per the model name) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 3.6106
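Assuming the reported loss is the mean token-level cross-entropy in nats (the usual convention for T5's language-modeling head), the corresponding perplexity can be derived directly. This is a back-of-the-envelope sketch, not a figure reported by the training run:

```python
import math

# Final validation loss (assumed to be mean cross-entropy in nats)
eval_loss = 3.6106

# Perplexity is the exponential of the cross-entropy loss
perplexity = math.exp(eval_loss)
print(f"eval perplexity ≈ {perplexity:.2f}")  # ≈ 36.99
```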
## Model description
More information needed
## Intended uses & limitations
More information needed
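If the checkpoint is published on the Hugging Face Hub under this name, it should load like any other T5 seq2seq checkpoint. A minimal inference sketch, assuming that repo id and an illustrative prompt (the intended input format is not documented here):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration


def generate_text(prompt: str, model_name: str = "t5-base-finetuned-jamendo-1-epochs") -> str:
    """Load the checkpoint (repo id assumed) and run greedy generation."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = T5ForConditionalGeneration.from_pretrained(model_name)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


# Example (requires network access to the Hub; the prompt below is illustrative):
# print(generate_text("example input text"))
```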
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step   | Validation Loss |
|:-------------:|:-----:|:------:|:---------------:|
| 4.7386        | 0.03  | 5000   | 5.0387          |
| 4.1861        | 0.05  | 10000  | 4.4039          |
| 4.2526        | 0.08  | 15000  | 4.0396          |
| 4.5082        | 0.11  | 20000  | 3.8230          |
| 4.5717        | 0.13  | 25000  | 3.6899          |
| 3.3498        | 0.16  | 30000  | 3.6018          |
| 3.7051        | 0.18  | 35000  | 3.5430          |
| 3.994         | 0.21  | 40000  | 3.4818          |
| 3.1388        | 0.24  | 45000  | 3.4297          |
| 3.1811        | 0.26  | 50000  | 3.3662          |
| 3.5069        | 0.29  | 55000  | 3.3302          |
| 3.6618        | 0.32  | 60000  | 3.3029          |
| 3.1444        | 0.34  | 65000  | 3.2941          |
| 3.2546        | 0.37  | 70000  | 3.2828          |
| 3.2194        | 0.39  | 75000  | 3.2815          |
| 3.1319        | 0.42  | 80000  | 3.2951          |
| 3.0055        | 0.45  | 85000  | 3.3464          |
| 3.8339        | 0.47  | 90000  | 3.3839          |
| 3.7603        | 0.5   | 95000  | 3.3961          |
| 3.0006        | 0.53  | 100000 | 3.3877          |
| 2.9475        | 0.55  | 105000 | 3.4514          |
| 3.1159        | 0.58  | 110000 | 3.4359          |
| 2.8221        | 0.61  | 115000 | 3.5010          |
| 3.1381        | 0.63  | 120000 | 3.4727          |
| 2.8698        | 0.66  | 125000 | 3.5077          |
| 2.8787        | 0.68  | 130000 | 3.4998          |
| 3.1428        | 0.71  | 135000 | 3.5375          |
| 3.338         | 0.74  | 140000 | 3.5433          |
| 3.1677        | 0.76  | 145000 | 3.5527          |
| 2.9904        | 0.79  | 150000 | 3.5842          |
| 3.2251        | 0.82  | 155000 | 3.5973          |
| 3.2135        | 0.84  | 160000 | 3.5730          |
| 3.3155        | 0.87  | 165000 | 3.5862          |
| 3.4767        | 0.89  | 170000 | 3.5912          |
| 2.9137        | 0.92  | 175000 | 3.5920          |
| 3.1084        | 0.95  | 180000 | 3.6024          |
| 3.3176        | 0.97  | 185000 | 3.6066          |
| 3.2806        | 1.0   | 190000 | 3.6106          |
### Framework versions
- Transformers 4.26.0
- Pytorch 1.13.1
- Datasets 2.9.0
- Tokenizers 0.13.2