DistilBART summarization


# MLQ-distilbart-bbc

This model is a fine-tuned version of [sshleifer/distilbart-cnn-12-6](https://huggingface.co/sshleifer/distilbart-cnn-12-6) on the [BBC News Summary dataset](https://www.kaggle.com/pariza/bbc-news-summary).
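As a sketch of how a fine-tuned summarization checkpoint like this one can be used for inference with the `transformers` pipeline API (the model id below uses the base checkpoint as a stand-in, since this card does not state the model's Hub repository name):

```python
from transformers import pipeline

# NOTE: replace the model id with the actual Hub repository of the
# fine-tuned model; "sshleifer/distilbart-cnn-12-6" is the base
# checkpoint, used here only as a stand-in.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

article = (
    "The UK economy grew faster than expected in the last quarter, "
    "driven by a rebound in the services sector, official figures show. "
    "Analysts said the data made an interest rate rise more likely."
)

summary = summarizer(
    article, max_length=60, min_length=10, do_sample=False
)[0]["summary_text"]
print(summary)
```

Setting `do_sample=False` makes the output deterministic (beam search / greedy decoding), which is usually preferable for news summarization.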

This model was produced as part of the in-lab practice of the Deep NLP course held at Politecnico di Torino.

## Training parameters