# pegasus-x-base-arxiv

This model is a fine-tuned version of [google/pegasus-x-base](https://huggingface.co/google/pegasus-x-base) on the arxiv-summarization dataset. It achieves the following results on the evaluation set:
- Loss: nan
## Model description

PEGASUS-X extends PEGASUS for long-document summarization: the encoder combines block-local attention with a small set of global tokens so that inputs of up to 16,384 tokens can be processed, which makes the base model a natural starting point for summarizing full arXiv papers.
## Intended uses & limitations

The final evaluation loss of this run is NaN and the training loss collapsed to 0.0 partway through the second epoch (see the training results below), so the run diverged. This checkpoint should not be expected to produce usable summaries without retraining, most likely with a substantially lower learning rate than the 2e-3 used here.
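For completeness, loading a PEGASUS-X checkpoint for summarization follows the standard `transformers` pattern. Below is a minimal sketch; the repo id is a hypothetical placeholder (this card does not record the published path), and given the NaN loss above, degenerate output should be expected from this particular checkpoint:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# "your-username/pegasus-x-base-arxiv" is a placeholder repo id, not a
# confirmed Hub path.
repo_id = "your-username/pegasus-x-base-arxiv"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

article = "..."  # full paper text goes here

# PEGASUS-X accepts long inputs (up to 16K tokens).
inputs = tokenizer(article, truncation=True, max_length=16384, return_tensors="pt")
summary_ids = model.generate(**inputs, num_beams=4, max_new_tokens=256)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```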
## Training and evaluation data

The model was fine-tuned on the arxiv-summarization dataset, which pairs full arXiv articles with their author-written abstracts as summarization targets.
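A minimal loading sketch, assuming the run used the `ccdv/arxiv-summarization` Hub dataset (the card itself does not record the exact dataset id):

```python
from datasets import load_dataset

# Assumption: the ccdv/arxiv-summarization Hub dataset, whose examples
# expose "article" (input) and "abstract" (target) fields.
dataset = load_dataset("ccdv/arxiv-summarization")
sample = dataset["train"][0]
print(sample["article"][:300])   # full paper body
print(sample["abstract"][:300])  # reference summary
```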
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.002
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
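Expressed as `Seq2SeqTrainingArguments`, the configuration would look roughly like the sketch below. The `output_dir` and the evaluation/logging cadence are assumptions (the results table suggests evaluation every 1000 steps); Adam's betas and epsilon match the library defaults, so they are not set explicitly:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="pegasus-x-base-arxiv",  # assumed output path
    learning_rate=2e-3,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=4,      # effective train batch size of 4
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    evaluation_strategy="steps",        # assumed: table shows eval every 1000 steps
    eval_steps=1000,
    logging_steps=1000,                 # assumed to match the table
)
```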
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
33079752.704 | 0.02 | 1000 | 20.8725 |
137412599.808 | 0.04 | 2000 | 20.8725 |
4579383.808 | 0.06 | 3000 | 20.8725 |
7668121.088 | 0.08 | 4000 | 20.8725 |
2206938.88 | 0.1 | 5000 | 20.8725 |
733122.432 | 0.12 | 6000 | 20.8725 |
91004.528 | 0.14 | 7000 | 20.8725 |
180892.96 | 0.16 | 8000 | 20.8725 |
55383486.464 | 0.18 | 9000 | 20.8725 |
3599870.464 | 0.2 | 10000 | 20.8725 |
2522732.288 | 0.22 | 11000 | 20.8725 |
27543791.616 | 0.24 | 12000 | 20.8725 |
55559.692 | 0.26 | 13000 | 20.8725 |
30380011.52 | 0.28 | 14000 | 20.8725 |
6128657.92 | 0.3 | 15000 | 20.8725 |
1285824.128 | 0.32 | 16000 | 20.8725 |
50138685.44 | 0.33 | 17000 | 20.8725 |
1199289.088 | 0.35 | 18000 | 20.8725 |
439883.712 | 0.37 | 19000 | 20.8725 |
833669562.368 | 0.39 | 20000 | 20.8725 |
3537328.896 | 0.41 | 21000 | 20.8725 |
602138869.76 | 0.43 | 22000 | 20.8725 |
32460343.296 | 0.45 | 23000 | 20.8725 |
438640345.088 | 0.47 | 24000 | 20.8725 |
600373.568 | 0.49 | 25000 | 20.8725 |
504883.84 | 0.51 | 26000 | 20.8725 |
2854624.0 | 0.53 | 27000 | 20.8725 |
3245419.264 | 0.55 | 28000 | 20.8725 |
5536696.32 | 0.57 | 29000 | 20.8725 |
157420208.128 | 0.59 | 30000 | 20.8725 |
6593876.992 | 0.61 | 31000 | 20.8725 |
4113536.512 | 0.63 | 32000 | 20.8725 |
425032777.728 | 0.65 | 33000 | 20.8725 |
1103064.704 | 0.67 | 34000 | 20.8725 |
3796103.424 | 0.69 | 35000 | 20.8725 |
709982.272 | 0.71 | 36000 | 20.8725 |
635942.592 | 0.73 | 37000 | 20.8725 |
12475876.352 | 0.75 | 38000 | 20.8725 |
977161.344 | 0.77 | 39000 | 20.8725 |
215117201.408 | 0.79 | 40000 | 20.8725 |
2228061.184 | 0.81 | 41000 | 20.8725 |
1511801.472 | 0.83 | 42000 | 20.8725 |
374335.392 | 0.85 | 43000 | 20.8725 |
551274414.08 | 0.87 | 44000 | 20.8725 |
14776.275 | 0.89 | 45000 | 20.8725 |
1016692.48 | 0.91 | 46000 | 20.8725 |
1531294.592 | 0.93 | 47000 | 20.8725 |
14540307.456 | 0.95 | 48000 | 20.8725 |
187237.904 | 0.97 | 49000 | 20.8725 |
1191666.432 | 0.99 | 50000 | 20.8725 |
37780086.784 | 1.0 | 51000 | 20.8725 |
278421.152 | 1.02 | 52000 | 20.8725 |
32245749.76 | 1.04 | 53000 | 20.8725 |
8463674.368 | 1.06 | 54000 | 20.8725 |
12945154.048 | 1.08 | 55000 | 20.8725 |
43897995.264 | 1.1 | 56000 | 20.8725 |
1705616.384 | 1.12 | 57000 | 20.8725 |
14537140.224 | 1.14 | 58000 | 20.8725 |
2372431.104 | 1.16 | 59000 | 20.8725 |
13542864.896 | 1.18 | 60000 | 20.8725 |
622085079.04 | 1.2 | 61000 | 20.8725 |
41270423.552 | 1.22 | 62000 | 20.8725 |
676841.664 | 1.24 | 63000 | 20.8725 |
186303217.664 | 1.26 | 64000 | 20.8725 |
89201172.48 | 1.28 | 65000 | 20.8725 |
197829165.056 | 1.3 | 66000 | 20.8725 |
7490446.336 | 1.32 | 67000 | 20.8725 |
44490223.616 | 1.34 | 68000 | nan |
0.0 | 1.36 | 69000 | nan |
0.0 | 1.38 | 70000 | nan |
0.0 | 1.4 | 71000 | nan |
0.0 | 1.42 | 72000 | nan |
0.0 | 1.44 | 73000 | nan |
0.0 | 1.46 | 74000 | nan |
0.0 | 1.48 | 75000 | nan |
0.0 | 1.5 | 76000 | nan |
0.0 | 1.52 | 77000 | nan |
0.0 | 1.54 | 78000 | nan |
0.0 | 1.56 | 79000 | nan |
0.0 | 1.58 | 80000 | nan |
0.0 | 1.6 | 81000 | nan |
0.0 | 1.62 | 82000 | nan |
0.0 | 1.64 | 83000 | nan |
0.0 | 1.65 | 84000 | nan |
0.0 | 1.67 | 85000 | nan |
0.0 | 1.69 | 86000 | nan |
0.0 | 1.71 | 87000 | nan |
0.0 | 1.73 | 88000 | nan |
0.0 | 1.75 | 89000 | nan |
0.0 | 1.77 | 90000 | nan |
0.0 | 1.79 | 91000 | nan |
0.0 | 1.81 | 92000 | nan |
0.0 | 1.83 | 93000 | nan |
0.0 | 1.85 | 94000 | nan |
0.0 | 1.87 | 95000 | nan |
0.0 | 1.89 | 96000 | nan |
0.0 | 1.91 | 97000 | nan |
0.0 | 1.93 | 98000 | nan |
0.0 | 1.95 | 99000 | nan |
0.0 | 1.97 | 100000 | nan |
0.0 | 1.99 | 101000 | nan |
0.0 | 2.01 | 102000 | nan |
0.0 | 2.03 | 103000 | nan |
0.0 | 2.05 | 104000 | nan |
0.0 | 2.07 | 105000 | nan |
0.0 | 2.09 | 106000 | nan |
0.0 | 2.11 | 107000 | nan |
0.0 | 2.13 | 108000 | nan |
0.0 | 2.15 | 109000 | nan |
0.0 | 2.17 | 110000 | nan |
0.0 | 2.19 | 111000 | nan |
0.0 | 2.21 | 112000 | nan |
0.0 | 2.23 | 113000 | nan |
0.0 | 2.25 | 114000 | nan |
0.0 | 2.27 | 115000 | nan |
0.0 | 2.29 | 116000 | nan |
0.0 | 2.3 | 117000 | nan |
0.0 | 2.32 | 118000 | nan |
0.0 | 2.34 | 119000 | nan |
0.0 | 2.36 | 120000 | nan |
0.0 | 2.38 | 121000 | nan |
0.0 | 2.4 | 122000 | nan |
0.0 | 2.42 | 123000 | nan |
0.0 | 2.44 | 124000 | nan |
0.0 | 2.46 | 125000 | nan |
0.0 | 2.48 | 126000 | nan |
0.0 | 2.5 | 127000 | nan |
0.0 | 2.52 | 128000 | nan |
0.0 | 2.54 | 129000 | nan |
0.0 | 2.56 | 130000 | nan |
0.0 | 2.58 | 131000 | nan |
0.0 | 2.6 | 132000 | nan |
0.0 | 2.62 | 133000 | nan |
0.0 | 2.64 | 134000 | nan |
0.0 | 2.66 | 135000 | nan |
0.0 | 2.68 | 136000 | nan |
0.0 | 2.7 | 137000 | nan |
0.0 | 2.72 | 138000 | nan |
0.0 | 2.74 | 139000 | nan |
0.0 | 2.76 | 140000 | nan |
0.0 | 2.78 | 141000 | nan |
0.0 | 2.8 | 142000 | nan |
0.0 | 2.82 | 143000 | nan |
0.0 | 2.84 | 144000 | nan |
0.0 | 2.86 | 145000 | nan |
0.0 | 2.88 | 146000 | nan |
0.0 | 2.9 | 147000 | nan |
0.0 | 2.92 | 148000 | nan |
0.0 | 2.94 | 149000 | nan |
0.0 | 2.96 | 150000 | nan |
0.0 | 2.97 | 151000 | nan |
0.0 | 2.99 | 152000 | nan |
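The table shows a classic divergence pattern: the training loss oscillates across eight orders of magnitude while the validation loss stays frozen at 20.8725, then both collapse (training loss 0.0, validation loss NaN) from around step 68000 onward, plausibly driven by the aggressive 2e-3 learning rate. A hedged sketch of a `TrainerCallback` that would have stopped the run at the first non-finite loss (the class name is mine, not from the original training script):

```python
import math

from transformers import TrainerCallback


class StopOnNonFiniteLoss(TrainerCallback):
    """Stop training as soon as a logged loss is NaN or infinite."""

    def on_log(self, args, state, control, logs=None, **kwargs):
        loss = (logs or {}).get("loss")
        if loss is not None and not math.isfinite(loss):
            control.should_training_stop = True
        return control
```

Registering it via `Seq2SeqTrainer(..., callbacks=[StopOnNonFiniteLoss()])` would halt the run at the first non-finite loss instead of spending the rest of the schedule on a dead model.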
### Framework versions
- Transformers 4.32.1
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.2