GPT-2 (355M-parameter model) fine-tuned on 0.5M PubMed abstracts. Used in writemeanabstract.com and in the following preprint:

Papanikolaou, Yannis, and Andrea Pierleoni. "DARE: Data Augmented Relation Extraction with GPT-2." arXiv preprint arXiv:2004.13845 (2020).