
## How to use

This model can be loaded with the `AutoModelForCausalLM` class:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("keonju/gpt-j-fin")
model = AutoModelForCausalLM.from_pretrained("keonju/gpt-j-fin")
```

Note that the base model, GPT-J-6B, requires roughly 24 GB of memory in full precision; loading in half precision (e.g. `torch_dtype=torch.float16`) is a common way to reduce that footprint.
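A minimal generation sketch follows. The prompt and the sampling settings (`max_new_tokens`, `do_sample`, `top_p`) are illustrative assumptions, not values documented for this model:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("keonju/gpt-j-fin")
model = AutoModelForCausalLM.from_pretrained("keonju/gpt-j-fin")

# Hypothetical prompt; replace with text from your own domain.
prompt = "The stock market today"
inputs = tokenizer(prompt, return_tensors="pt")

# Sampling settings are assumptions, not tuned values from this card.
output_ids = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_p=0.9,
)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```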


# gpt-j-fin

This model is a fine-tuned version of [EleutherAI/gpt-j-6B](https://huggingface.co/EleutherAI/gpt-j-6B). The fine-tuning dataset is not specified in this card.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

More information needed

### Training results

More information needed

### Framework versions

More information needed