
# fake-gpt-2-17m

This model is a GPT-J architecture with 17,637,632 parameters, trained from scratch for 1 epoch on a synthetic dataset: 1 GB of documents generated in 4 fake languages, each with a formal and an informal writing style.
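As a quick sanity check on the parameter count, the checkpoint can be loaded with `transformers` and its parameter tensors summed. A minimal sketch; the repo id `user/fake-gpt-2-17m` is a hypothetical placeholder for wherever the model is published:

```python
from transformers import AutoModelForCausalLM

# Hypothetical repo id; substitute the actual Hugging Face path once published.
model = AutoModelForCausalLM.from_pretrained("user/fake-gpt-2-17m")

# Sum all parameter tensors; this should report 17,637,632 for this checkpoint.
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} parameters")
```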

It achieves the following results on the evaluation set:
- Loss: 3.5592

## Intended uses & limitations

This model is intended as a base model for fine-tuning on any language or task, in order to probe the effectiveness both of pre-training on an algorithmically generated corpus and of extremely small language models (SLMs). It can only generate text in the style of its training data (which will be uploaded as a Hugging Face dataset soon).
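For the fine-tuning use case, here is a minimal sketch using the `transformers` Trainer, assuming the hypothetical repo id above and a toy in-memory dataset standing in for a real downstream corpus:

```python
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

checkpoint = "user/fake-gpt-2-17m"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# GPT-style tokenizers often ship without a pad token; reuse EOS for padding.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# Toy corpus standing in for a real downstream dataset.
raw = Dataset.from_dict(
    {"text": ["A tiny fine-tuning example.", "Another short document."]}
)
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"]), batched=True, remove_columns=["text"]
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="fake-gpt-2-17m-finetuned", num_train_epochs=1),
    train_dataset=tokenized,
    # mlm=False yields standard causal-LM labels (inputs shifted by one).
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```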

## Training and evaluation data

More information needed

## Training hyperparameters

The following hyperparameters were used during training:

## Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 3.5175        | 1.0   | 46857 | 3.5592          |
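For intuition, a cross-entropy loss maps to perplexity via exp(loss); a quick check against the validation loss in the table above:

```python
import math

# Perplexity = exp(cross-entropy loss); 3.5592 is the validation loss above.
print(math.exp(3.5592))  # ~35.1
```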

## Framework versions