Introducing BEREL 2.0 - New and Improved BEREL: BERT Embeddings for Rabbinic-Encoded Language

When using BEREL 2.0, please reference:

Avi Shmidman, Joshua Guedalia, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Eli Handel, Moshe Koppel, "Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language", Aug 2022 [arXiv:2208.01875]

  1. Usage:
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL_2.0')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL_2.0')

# for evaluation, disable dropout
model.eval()

NOTE: This code will not work properly, and will yield poor results, if you use BertTokenizer. Please use AutoTokenizer or BertTokenizerFast.
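As a minimal sketch of how the masked-LM head can be queried, the snippet below masks one token in a well-known Mishnaic sentence and decodes the model's top prediction. The example sentence is our own choice, and the predicted token will depend on the model weights:

```python
import torch
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL_2.0')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL_2.0')
model.eval()

# Example rabbinic Hebrew sentence (Mishnah Sanhedrin 10:1) with one masked token
text = 'כל ישראל יש להם חלק לעולם [MASK]'

inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and decode the highest-scoring token there
mask_idx = (inputs['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted = tokenizer.decode(logits[0, mask_idx].argmax(dim=-1))
print(predicted)
```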

  2. Demo site: You can experiment with the model in a GUI interface here: https://dicta-bert-demo.netlify.app/?genre=rabbinic