hmByT5 - Preliminary Language Models

Preliminary Historic Multilingual and Monolingual ByT5 Models. The following languages are currently covered: English, German, French, Finnish, Swedish and Dutch.

More details can be found in our GitHub repository.

Pretraining

We use the official JAX/FLAX example in Hugging Face Transformers to pretrain a ByT5 model on a single v3-8 TPU. Details about the training can be found here.
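As a minimal sketch only (the exact invocation and hyperparameters are in the linked training details, not here), the official run_t5_mlm_flax.py example initialises a fresh byte-level model from a ByT5 configuration before pretraining, roughly like this:

```python
import jax.numpy as jnp
from transformers import AutoTokenizer, T5Config, FlaxT5ForConditionalGeneration

# ByT5 works on raw UTF-8 bytes, so the tokenizer has no learned vocabulary.
tokenizer = AutoTokenizer.from_pretrained("google/byt5-small")
config = T5Config.from_pretrained("google/byt5-small")

# Initialise a model from the configuration only (no pretrained weights),
# mirroring what run_t5_mlm_flax.py does before the training loop starts.
model = FlaxT5ForConditionalGeneration(config, seed=42, dtype=jnp.bfloat16)
```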

Evaluation on Downstream Tasks (NER)

We evaluated the hmByT5 model on the following downstream NER tasks:

| Model | English AjMC | German AjMC | French AjMC | Finnish NewsEye | Swedish NewsEye | Dutch ICDAR | French ICDAR | Avg. |
|-------|--------------|-------------|-------------|-----------------|-----------------|-------------|--------------|------|
| hmbyt5-preliminary/byt5-small-english-german-french-finnish-swedish-dutch | 85.51 ± 0.68 | 87.58 ± 0.39 | 84.39 ± 0.83 | 55.46 ± 1.99 | 73.38 ± 2.45 | 84.80 ± 0.44 | 75.97 ± 0.55 | |
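The NER fine-tuning setup itself is documented in the GitHub repository. Purely as an assumed usage sketch (not the actual evaluation code), the released checkpoint's encoder can be loaded with Transformers to produce byte-level representations, using the model id from the table above:

```python
import torch
from transformers import AutoTokenizer, T5EncoderModel

model_id = "hmbyt5-preliminary/byt5-small-english-german-french-finnish-swedish-dutch"
tokenizer = AutoTokenizer.from_pretrained(model_id)
encoder = T5EncoderModel.from_pretrained(model_id)

# Byte-level tokenization means historic spellings and OCR noise never
# produce out-of-vocabulary tokens.
inputs = tokenizer("Paris, den 1. Dezember 1871.", return_tensors="pt")
with torch.no_grad():
    hidden_states = encoder(**inputs).last_hidden_state  # (1, num_bytes, hidden_size)
```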

Acknowledgements

Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC). Many thanks for providing access to the TPUs ❤️