
mGPT: fine-tuned on message data (MWE)

This model is a fine-tuned version of sberbank-ai/mGPT on 80k messages. It was trained for one epoch; an updated version will be released in a separate model repo later.

Model description

Usage in python

Install the transformers library if you don't already have it:

pip install -U transformers

Then load the model into a pipeline object:

import torch
from transformers import pipeline

# use the GPU if one is available, otherwise fall back to CPU
device = 0 if torch.cuda.is_available() else -1
my_chatbot = pipeline(
    "text-generation",
    "pszemraj/mGPT-Peter-mwe",
    device=device,
)
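Once loaded, the pipeline can be called directly on a prompt string. A minimal sketch of that call is below; the prompt and generation settings are illustrative, not from this model card, and the lightweight `sshleifer/tiny-gpt2` checkpoint stands in so the snippet runs quickly (substitute `pszemraj/mGPT-Peter-mwe` for actual use):

```python
from transformers import pipeline

# Demonstrated with a tiny public checkpoint for speed; swap in
# 'pszemraj/mGPT-Peter-mwe' for real chat-style generation.
chatbot = pipeline("text-generation", "sshleifer/tiny-gpt2", device=-1)

# generation settings here are illustrative defaults, not the author's
result = chatbot("Hello, how are you?", max_new_tokens=20, do_sample=False)
print(result[0]["generated_text"])
```

The pipeline returns a list of dicts, one per generated sequence, each with a `generated_text` key containing the prompt plus the continuation.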

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

Framework versions