This model is from the preprint *Unlimiformer: Long-Range Transformers with Unlimited Length Input*.

This model was finetuned from BART-base on the BookSum dataset (full-book setting), using the alternating-training strategy described in Section 3.2 of the paper.
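For inputs within BART's usual 1,024-token limit, the checkpoint can be loaded like any other `transformers` seq2seq model. Below is a minimal sketch; the model ID `path/to/this-model` is a placeholder for this repository's actual Hub ID:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder: substitute this repository's actual Hugging Face Hub ID.
model_id = "path/to/this-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "An excerpt from a very long book..."

# Without the Unlimiformer wrapper, the encoder input is truncated to
# BART's standard 1024-token window.
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```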

The inference demo is disabled because the Unlimiformer code must be added to your repository before this model can handle unlimited-length input. See the Unlimiformer GitHub for setup instructions.
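To go beyond the 1,024-token cap, the model has to be wrapped with the code from the Unlimiformer repository. The sketch below is only an outline: it assumes the repo's `Unlimiformer.convert_model` entry point, and the exact arguments may differ from what the current code exposes, so treat the repo's own examples as authoritative:

```python
# Sketch only: assumes the Unlimiformer repo is cloned and its src/
# directory is on PYTHONPATH. `Unlimiformer.convert_model` follows the
# repo's inference example; verify the exact arguments against the
# current code.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from unlimiformer import Unlimiformer

model_id = "path/to/this-model"  # placeholder Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Wrap the finetuned BART so encoder attention is retrieved from a
# kNN index over the full input, removing the 1024-token cap.
model = Unlimiformer.convert_model(model)

with open("book.txt") as f:  # the full-length input document
    full_book = f.read()

inputs = tokenizer(full_book, return_tensors="pt", truncation=False)
summary_ids = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```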