

# KoT5_Translate_ko_jp

This model is a fine-tuned version of [KETI-AIR/ke-t5-base](https://huggingface.co/KETI-AIR/ke-t5-base) on the noahkim/Kor_Jpn_Translation_Dataset dataset. Validation loss and BLEU scores over the course of training are reported in the training results table below.

## Model description

This model was built as a Korean-to-Japanese translator. It was fine-tuned from ke-t5-base, shared by KETI-AIR, for Korean-to-Japanese translation as a Text2Text task.
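
Below is a minimal inference sketch using the 🤗 Transformers library. The repository id `noahkim/KoT5_Translate_ko_jp` and the lack of a task prefix are assumptions, not confirmed by this card; adjust them to match how the model is actually hosted and how inputs were formatted during fine-tuning.

```python
# Minimal sketch, assuming the model is hosted as noahkim/KoT5_Translate_ko_jp
# and that inputs need no task prefix (both are assumptions).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "noahkim/KoT5_Translate_ko_jp"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "안녕하세요, 만나서 반갑습니다."  # "Hello, nice to meet you."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=128, num_beams=5)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```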

## Training and evaluation data

noahkim/Kor_Jpn_Translation_Dataset, a Korean-Japanese parallel corpus in the culture domain that I downloaded from AIHub and shared on Hugging Face, was used as the fine-tuning dataset.
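
The corpus can be pulled straight from the Hub with the 🤗 Datasets library; the split and column names are not documented here, so treat them as assumptions and check the dataset card.

```python
# Sketch of loading the fine-tuning data; split and column names are assumptions.
from datasets import load_dataset

dataset = load_dataset("noahkim/Kor_Jpn_Translation_Dataset")
print(dataset)              # inspect available splits and columns
print(dataset["train"][0])  # peek at one Korean-Japanese sentence pair
```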

## Supported Tasks and Leaderboards

Translation

## Languages

Korean (ko), Japanese (ja)

## Training hyperparameters

The following hyperparameters were used during training:

## Training results

| Training Loss | Epoch | Step | Validation Loss | BLEU    |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 3.8739        | 0.08  | 500  | 1.7216          | 3.3261  |
| 1.2621        | 0.15  | 1000 | 0.6792          | 28.6184 |
| 0.7413        | 0.23  | 1500 | 0.5153          | 35.9355 |
| 0.635         | 0.3   | 2000 | 0.4807          | 38.4874 |
| 0.5643        | 0.38  | 2500 | 0.4322          | 40.7997 |
| 0.5137        | 0.46  | 3000 | 0.4027          | 41.9025 |
| 0.4806        | 0.53  | 3500 | 0.3862          | 42.5947 |
| 0.4552        | 0.61  | 4000 | 0.3721          | 42.9976 |
| 0.4395        | 0.69  | 4500 | 0.3585          | 43.5369 |
| 0.4213        | 0.76  | 5000 | 0.3487          | 44.0028 |
| 0.411         | 0.84  | 5500 | 0.3418          | 44.1845 |
| 0.3992        | 0.91  | 6000 | 0.3348          | 44.3701 |
| 0.3966        | 0.99  | 6500 | 0.3331          | 44.5463 |
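
The BLEU values above are corpus-level validation scores. A hedged sketch of how comparable scores can be computed with the 🤗 Evaluate library and sacreBLEU follows; the exact metric configuration used during training is not recorded in this card, so this is an assumption.

```python
# Sketch of computing corpus-level BLEU with sacreBLEU via 🤗 Evaluate.
# Assumption: the card's BLEU column was produced with a comparable setup.
import evaluate

bleu = evaluate.load("sacrebleu")
predictions = ["こんにちは、はじめまして。"]              # model outputs (Japanese)
references = [["こんにちは、お会いできて嬉しいです。"]]  # one reference list per prediction
result = bleu.compute(predictions=predictions, references=references)
print(result["score"])  # corpus BLEU on a 0-100 scale
```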

## Framework versions