Model Card for arman-longformer-8k

<!-- Provide a quick summary of what the model is/does. -->

This model applies Longformer's attention mechanism to alireza7/ARMAN-MSR-persian-base so that it can perform abstractive summarization of long documents. The converted model accepts inputs of up to 8K tokens (rather than 512). Note that it still needs to be fine-tuned on a summarization task before use.
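
As a rough sketch of how the converted checkpoint could be loaded and fine-tuned with the standard seq2seq classes (the Hub repository id, the 8,192-token limit argument, and the placeholder texts below are assumptions, not taken from this card):

```python
# A minimal loading/fine-tuning sketch. Assumptions: the converted checkpoint is
# published on the Hugging Face Hub under a placeholder id, it loads with the
# standard seq2seq AutoClasses, and the texts below stand in for real data.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "arman-longformer-8k"  # placeholder: replace with the actual Hub repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

document = "..."  # a long Persian document (up to 8K tokens)
summary = "..."   # a reference summary, used only for fine-tuning

# Encode the long input; the converted model accepts up to 8,192 tokens.
inputs = tokenizer(document, return_tensors="pt", truncation=True, max_length=8192)
labels = tokenizer(text_target=summary, return_tensors="pt").input_ids

# A forward pass with labels returns the seq2seq loss used during fine-tuning.
outputs = model(**inputs, labels=labels)
print(outputs.loss)
```

After fine-tuning, summaries can be generated in the usual way with `model.generate(**inputs)` and decoded with `tokenizer.decode`.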

The conversion code is available in the GitHub repository.