t5-base fine-tuned on the XSum summarization dataset

training args

- max_input_length: 512
- max_tgt_length: 128
- epochs: 3
- optimizer: AdamW
- lr: 2e-5
- weight_decay: 1e-3
- fp16: False
- prefix: "summarize: "
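The arguments above map onto the Hugging Face `Seq2SeqTrainingArguments` API; a minimal sketch, assuming the standard `Trainer`/`Seq2SeqTrainer` workflow (the `output_dir` name is a placeholder, and AdamW is the default optimizer):

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the training configuration listed above.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-xsum",  # placeholder output path
    num_train_epochs=3,
    learning_rate=2e-5,
    weight_decay=1e-3,
    fp16=False,                 # mixed precision disabled, as trained
)
```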

performance

- train_loss: 0.5976
- eval_loss: 0.5340
- eval_rouge1: 34.6791
- eval_rouge2: 12.8236
- eval_rougeL: 28.1201
- eval_rougeLsum: 28.1241

usage<br>

`from transformers import AutoTokenizer, AutoModelForSeq2SeqLM`
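A fuller inference sketch using the classes imported above, with the training prefix and length limits from this card. The checkpoint ID is a placeholder; substitute the actual model path:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "your-username/t5-base-xsum"  # hypothetical checkpoint ID
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

prefix = "summarize: "  # prefix used during fine-tuning
article = "..."         # the document you want to summarize

# Truncate inputs to the 512-token limit used at training time.
inputs = tokenizer(prefix + article, max_length=512,
                   truncation=True, return_tensors="pt")
summary_ids = model.generate(**inputs, max_length=128)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```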

dependency

- trained with transformers==4.24
- also compatible with transformers==3.0.2