Model Description

The german-gpt2-romantik model is a fine-tuned version of dbmdz's German GPT-2, specialized for poetry generation.

Training Data

The training data consists of hand-picked poems from the German Romantic era (German: Romantik). In total, the corpus comprises 2,641 poems and 879,427 tokens.

Poem Generation

Enter a starting sentence or phrase (for example, via the Inference API widget on the right) and the model will generate poem-like text. For instance, entering "Der Garten der Freude" may output:

"Der Garten der Freude,
in dem mein Auge ruht,
wo Gott und die Sonne,
hier im Himmel,
zu allen Zeiten uns umgeben."

(English: "The garden of joy, in which my eye rests, where God and the sun, here in heaven, surround us at all times.")
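The same generation can be reproduced locally with the `transformers` library. Below is a minimal sketch; the bare repository name `german-gpt2-romantik` is an assumption (the full Hub id may include a user namespace), and the sampling parameters are illustrative defaults, not the settings used for the example above:

```python
from transformers import pipeline

# Assumed Hub id; replace with the full "<user>/german-gpt2-romantik" path if needed.
generator = pipeline("text-generation", model="german-gpt2-romantik")

result = generator(
    "Der Garten der Freude",  # the starting phrase used in the example above
    max_length=60,            # cap the length of the generated poem in tokens
    do_sample=True,           # sample instead of greedy decoding for variety
    top_k=50,                 # illustrative sampling parameters
    top_p=0.95,
)

# The pipeline returns a list of dicts; the text includes the prompt itself.
print(result[0]["generated_text"])
```

Because sampling is enabled, each run will produce a different continuation of the prompt.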