
GPT2-Medium pre-trained on cleaned Dutch mC4 🇳🇱

Datasets:

Tokenizer:

Training details:

This model was further fine-tuned on a Dutch book corpus.

Work in progress (Dec 2021 - Jan 2022).
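
Below is a minimal text-generation sketch using the 🤗 Transformers library. The Hub repository id used here is an assumption for illustration and may not match the published checkpoint name.

```python
# Minimal generation sketch; "yhavinga/gpt2-medium-dutch" is a hypothetical Hub id.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "yhavinga/gpt2-medium-dutch"  # assumption, replace with the actual checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Het was een koude winterdag toen"  # "It was a cold winter day when"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation in Dutch.
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_p=0.95,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```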