
Update 2023-05-23: This model is BEREL version 1.0. We now provide a much-improved BEREL_2.0.

Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language

When using BEREL, please reference:

Avi Shmidman, Joshua Guedalia, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Eli Handel, Moshe Koppel, "Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language", Aug 2022 [arXiv:2208.01875]

  1. Usage:
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL')

# for evaluation, disable dropout
model.eval()

NOTE: This code will not work correctly and will produce bad results if you use BertTokenizer. Please use AutoTokenizer or BertTokenizerFast. A short fill-mask example follows below.
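
The following is a minimal sketch of how the loaded model might be used for masked-token prediction; the Hebrew example sentence and the top-5 cutoff are illustrative choices and not part of the original card.

import torch
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL')
model.eval()

# illustrative sentence with one position masked out
text = f'בראשית ברא אלהים את {tokenizer.mask_token}'

inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# locate the [MASK] position and print the top-5 predicted tokens for it
mask_index = (inputs['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_index].topk(5, dim=-1).indices[0]
print([tokenizer.decode([i]) for i in top_ids])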

  2. Demo site: You can experiment with the model in a GUI interface here: https://dicta-bert-demo.netlify.app/?genre=rabbinic