ZeoBERT

This is a pretrained BERT model trained on zeolite-related text.

Based on roberta-v3-large, the model was retrained on the titles and abstracts of 170,000 molecular-sieve-related papers.
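
The model can be loaded like any masked language model. Below is a minimal sketch using the Hugging Face transformers library; the repository id `your-org/ZeoBERT` is a placeholder, not the published checkpoint name, so substitute the actual model path or a local checkpoint directory.

```python
# Minimal sketch: loading ZeoBERT for masked-token prediction with transformers.
# "your-org/ZeoBERT" is a placeholder id, not the published checkpoint name.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "your-org/ZeoBERT"  # placeholder; replace with the real path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill-mask pipeline: predict the masked token in a zeolite-related sentence.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill(f"ZSM-5 is a {fill.tokenizer.mask_token} zeolite."):
    print(prediction["token_str"], round(prediction["score"], 3))
```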

ZeoBERT uses its own WordPiece vocabulary, built to best match the training corpus.
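
A quick way to see the effect of the domain vocabulary is to compare tokenizations against a general-purpose tokenizer. The sketch below assumes the same placeholder repository id as above and uses `bert-base-uncased` as the general-domain baseline; the exact token splits depend on the actual vocabulary.

```python
# Sketch: compare how the domain-specific vocabulary tokenizes zeolite terms
# versus a general-purpose one. "your-org/ZeoBERT" is a placeholder id.
from transformers import AutoTokenizer

zeo_tok = AutoTokenizer.from_pretrained("your-org/ZeoBERT")    # placeholder
base_tok = AutoTokenizer.from_pretrained("bert-base-uncased")  # general-domain baseline

text = "Hydrothermal synthesis of aluminosilicate zeolites"
print(zeo_tok.tokenize(text))   # domain terms should stay largely intact
print(base_tok.tokenize(text))  # a general vocabulary tends to fragment them
```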