ArXiv GPT-2 checkpoint

This is a GPT-2 small checkpoint for PyTorch: the official gpt2 (small) model fine-tuned on ArXiv papers in physics fields.

Training data

This model was trained on a subset of ArXiv papers that were parsed from PDF to plain text. The resulting dataset consists of 130 MB of text, mostly from quantum physics (quant-ph) and other physics sub-fields.