# KoBART-base-v2

Chat data was added to the training corpus so that, compared to the original KoBART, the model better handles the semantics of longer sequences.

```python
from transformers import PreTrainedTokenizerFast, BartModel

# Load the tokenizer and the pretrained weights from the Hugging Face Hub.
tokenizer = PreTrainedTokenizerFast.from_pretrained('hyunwoongko/kobart')
model = BartModel.from_pretrained('hyunwoongko/kobart')
```
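The loaded model can be sanity-checked with a plain forward pass; the Korean example sentence below is arbitrary and not from the original card:

```python
import torch

# Tokenize an arbitrary Korean sentence ("Hello. This is Korean BART.").
inputs = tokenizer("안녕하세요. 한국어 BART 입니다.", return_tensors="pt")

with torch.no_grad():
    # BartModel derives decoder inputs from input_ids when none are given.
    outputs = model(**inputs)

# Last decoder layer's hidden states, one vector per token.
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```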

## Performance

### NSMC

- hyunwoongko/kobart
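Not part of the original card: a minimal sketch of how this checkpoint could be fine-tuned and scored on NSMC (Korean movie-review sentiment) with the `datasets` library. The dataset id `'nsmc'`, the hyperparameters, the output path, and the explicit eos handling are illustrative assumptions, not the recipe behind any reported score.

```python
import numpy as np
from datasets import load_dataset
from transformers import (BartForSequenceClassification, DataCollatorWithPadding,
                          PreTrainedTokenizerFast, Trainer, TrainingArguments)

tokenizer = PreTrainedTokenizerFast.from_pretrained('hyunwoongko/kobart')
# A fresh 2-way classification head is initialized on top of the pretrained weights.
model = BartForSequenceClassification.from_pretrained('hyunwoongko/kobart', num_labels=2)

# NSMC: 'document' is the review text, 'label' is 0 (negative) or 1 (positive).
nsmc = load_dataset('nsmc')

def encode(batch):
    enc = tokenizer(batch['document'], truncation=True, max_length=127)
    # BartForSequenceClassification pools the final </s> token, so append eos
    # explicitly (assumes the tokenizer config defines eos/pad special tokens).
    enc['input_ids'] = [ids + [tokenizer.eos_token_id] for ids in enc['input_ids']]
    enc['attention_mask'] = [mask + [1] for mask in enc['attention_mask']]
    return enc

nsmc = nsmc.map(encode, batched=True, remove_columns=['id', 'document'])

def accuracy(eval_pred):
    preds = np.argmax(eval_pred.predictions, axis=-1)
    return {'accuracy': float((preds == eval_pred.label_ids).mean())}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir='kobart-nsmc',
                           per_device_train_batch_size=32, num_train_epochs=3),
    train_dataset=nsmc['train'],
    eval_dataset=nsmc['test'],
    data_collator=DataCollatorWithPadding(tokenizer),  # pads batches dynamically
    compute_metrics=accuracy,
)
trainer.train()
print(trainer.evaluate())  # reports accuracy on the NSMC test split
```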