xlm-roberta-longformer-base-16384

⚠️ This repository contains the PyTorch version of hyperonym/xlm-roberta-longformer-base-16384, with no modifications to the original weights.

xlm-roberta-longformer is a multilingual Longformer initialized with XLM-RoBERTa's weights, without any further pretraining. It supports sequences of up to 16,384 tokens and is intended to be fine-tuned on a downstream task.
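
A minimal fine-tuning-style usage sketch, assuming the standard Hugging Face `transformers` Auto classes. The model ID below is the upstream repository named in this card; substitute this repository's ID to load the PyTorch weights, and adjust `num_labels`, the task head, and the example text for your own downstream task.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Upstream ID from this card; replace with this repository's ID if needed.
model_id = "hyperonym/xlm-roberta-longformer-base-16384"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Sequence classification is just an example head; the base model is task-agnostic.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

# Long-document input: the Longformer attention pattern allows sequences of up to
# 16,384 tokens, far beyond XLM-RoBERTa's 512-token limit.
text = "Some very long multilingual document ..."
inputs = tokenizer(text, truncation=True, max_length=16384, return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits)
```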

The notebook for replicating the model is available on GitHub: https://github.com/hyperonym/dirge/blob/master/models/xlm-roberta-longformer/convert.ipynb