bat-eng translation (Baltic languages to English)
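
A minimal inference sketch with the Hugging Face transformers library is shown below; it assumes this checkpoint is an OPUS-MT Marian model published on the Hub. The repository identifier and the Latvian example sentence are assumptions, not confirmed by this card.

```python
# Minimal inference sketch for an OPUS-MT Marian checkpoint.
# The repository name below is an assumption, not confirmed by this card.
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-bat-en"  # assumed model ID
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Translate a Latvian (lav) sentence into English.
src_text = ["Labdien, kā jums klājas?"]
batch = tokenizer(src_text, return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```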

Benchmarks

testset                            BLEU   chr-F
newsdev2017-enlv-laveng.lav.eng    27.5   0.566
newsdev2019-enlt-liteng.lit.eng    27.8   0.557
newstest2017-enlv-laveng.lav.eng   21.1   0.512
newstest2019-lten-liteng.lit.eng   30.2   0.592
Tatoeba-test.lav-eng.lav.eng       51.5   0.687
Tatoeba-test.lit-eng.lit.eng       55.1   0.703
Tatoeba-test.multi.eng             50.6   0.662
Tatoeba-test.prg-eng.prg.eng        1.0   0.159
Tatoeba-test.sgs-eng.sgs.eng       16.5   0.265
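
The BLEU and chr-F columns are standard corpus-level translation metrics. The sketch below shows one way such scores can be computed with the sacrebleu library; the toy sentences are placeholders, and it is not the exact evaluation setup used for the table above.

```python
# Sketch of scoring a system output with sacrebleu; sentences are placeholders.
# Note: recent sacrebleu versions report chr-F on a 0-100 scale, while the table
# above uses a 0-1 scale, so the value may need to be divided by 100 to compare.
import sacrebleu

hypotheses = ["the cat sits on the mat"]           # system translations, one per line
references = [["the cat is sitting on the mat"]]   # one or more reference sets

bleu = sacrebleu.corpus_bleu(hypotheses, references)
chrf = sacrebleu.corpus_chrf(hypotheses, references)
print(f"BLEU = {bleu.score:.1f}, chr-F = {chrf.score:.3f}")
```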

System Info: