language: English

tags:

Dataset: ToTTo

Base Model - T5-Base

T5 was built by a team at Google to create a general-purpose model that can understand and generate text. The basic idea behind T5 is to treat every text processing problem as a "text-to-text" problem, i.e. taking text as input and producing new text as output.
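To make the text-to-text framing concrete, here is a minimal sketch of how different tasks are all cast into the same input/output string format by prepending a task prefix. The prefixes and example pairs below are illustrative, modeled on the conventions used in the T5 paper:

```python
# Every task becomes "input text -> output text"; only the task
# prefix tells the model which task to perform.
tasks = [
    ("translate English to German: That is good.", "Das ist gut."),
    ("summarize: state authorities dispatched emergency crews ...", "authorities dispatched crews ..."),
]

def make_model_input(prefix: str, text: str) -> str:
    """Cast a raw task instance into the single text-to-text format."""
    return f"{prefix}: {text}"

print(make_model_input("summarize", "a long article about T5 ..."))
```

The same trick covers table-to-text generation: the linearized table becomes the input string, and the target description becomes the output string.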

Baseline Preprocessing

This code repository serves as a supplement to the main repository and can be used to perform basic preprocessing of the ToTTo dataset.
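As a rough illustration of what this preprocessing does, the sketch below linearizes a ToTTo example's page title, section title, and highlighted cells into a single flat input string. The field names follow the public ToTTo JSON schema, and the tag-based format is modeled on the official baseline's subtable linearization; the exact tags and helper are assumptions, not this repository's code:

```python
def linearize_example(example: dict) -> str:
    """Flatten a ToTTo example into one input string for the model."""
    parts = [
        f"<page_title> {example['table_page_title']} </page_title>",
        f"<section_title> {example['table_section_title']} </section_title>",
        "<table>",
    ]
    # Keep only the highlighted cells, in (row, col) order.
    for r, c in example["highlighted_cells"]:
        cell = example["table"][r][c]
        parts.append(f"<cell> {cell['value']} </cell>")
    parts.append("</table>")
    return " ".join(parts)

example = {
    "table_page_title": "List of Governors of South Carolina",
    "table_section_title": "Governors under the Constitution of 1868",
    "table": [[{"value": "74"}, {"value": "Daniel Henry Chamberlain"}]],
    "highlighted_cells": [[0, 1]],
}
print(linearize_example(example))
```

The resulting string is what gets fed to T5 as the "input text" side of the text-to-text pair, with the reference sentence as the target.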

Fine-tuning

We fine-tuned the T5 conditional-generation model for 10,000 steps on the ToTTo dataset, using BLEU as the evaluation metric.
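The BLEU metric used for evaluation can be illustrated with a minimal pure-Python sentence-level implementation. This is a simplified sketch (uniform n-gram weights, single reference, no smoothing); real evaluations typically use corpus-level BLEU via a library such as sacrebleu:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate: str, reference: str, max_n: int = 4) -> float:
    """Sentence-level BLEU with brevity penalty (unsmoothed sketch)."""
    cand, ref = candidate.split(), reference.split()
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_ngrams = ngrams(cand, n)
        ref_ngrams = ngrams(ref, n)
        # Clipped n-gram overlap between candidate and reference.
        overlap = sum(min(c, ref_ngrams[g]) for g, c in cand_ngrams.items())
        total = max(sum(cand_ngrams.values()), 1)
        if overlap == 0:
            return 0.0  # unsmoothed: any empty precision zeroes the score
        log_precisions.append(math.log(overlap / total))
    # Brevity penalty discourages overly short candidates.
    bp = min(1.0, math.exp(1 - len(ref) / len(cand)))
    return bp * math.exp(sum(log_precisions) / max_n)

print(bleu("the cat is on the mat", "the cat is on the mat"))
```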