<img src="https://huggingface.co/dlicari/lsg16k-Italian-Legal-BERT/resolve/main/ITALIAN_LEGAL_BERT-LSG.jpg" width="600"/>

# LSG16K-Italian-LEGAL-BERT

A Local-Sparse-Global (LSG) version of ITALIAN-LEGAL-BERT, obtained by replacing the full attention in the encoder with LSG attention using the LSG converter script (https://github.com/ccdv-ai/convert_checkpoint_to_lsg). The conversion used a maximum sequence length of 16,384 tokens, 7 global tokens, a local block size of 128, a sparse block size of 128, a sparsity factor of 2, and the 'norm' sparse selection pattern (which selects the tokens with the highest norm).
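
As a minimal sketch, the converted checkpoint can be loaded through the Hugging Face `transformers` library. LSG models ship their attention implementation as custom modeling code alongside the checkpoint, so `trust_remote_code=True` is required; the input text below is purely illustrative.

```python
from transformers import AutoModel, AutoTokenizer

model_id = "dlicari/lsg16k-Italian-Legal-BERT"

# LSG attention lives in custom modeling code bundled with the checkpoint,
# so trust_remote_code=True is needed to instantiate the model.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)

# Encode a (hypothetical) long legal passage; the LSG encoder accepts
# sequences of up to 16,384 tokens.
text = "Il contratto si intende risolto di diritto in caso di inadempimento."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=16384)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```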