run-deu

- source group: Rundi
- target group: German
- OPUS readme: run-deu
- model: transformer-align
- source language(s): run
- target language(s): deu
- pre-processing: normalization + SentencePiece (spm4k,spm4k)
- download original weights: opus-2020-06-16.zip
- test set translations: opus-2020-06-16.test.txt
- test set scores: opus-2020-06-16.eval.txt
Benchmarks

| testset | BLEU | chr-F |
|---------|------|-------|
| Tatoeba-test.run.deu | 17.1 | 0.344 |
System Info:

- hf_name: run-deu
- source_languages: run
- target_languages: deu
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/run-deu/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['rn', 'de']
- src_constituents: {'run'}
- tgt_constituents: {'deu'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm4k,spm4k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/run-deu/opus-2020-06-16.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/run-deu/opus-2020-06-16.test.txt
- src_alpha3: run
- tgt_alpha3: deu
- short_pair: rn-de
- chrF2_score: 0.344
- bleu: 17.1
- brevity_penalty: 0.961
- ref_len: 10562.0
- src_name: Rundi
- tgt_name: German
- train_date: 2020-06-16
- src_alpha2: rn
- tgt_alpha2: de
- prefer_old: False
- long_pair: run-deu
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41
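The brevity_penalty and ref_len fields above follow the standard BLEU definition, BP = min(1, exp(1 - r/c)), where r is the reference length and c the hypothesis length. A quick sanity check, assuming a hypothesis length of about 10158 tokens (inferred here from the reported BP and ref_len; the card does not state it):

```python
import math

def brevity_penalty(hyp_len: int, ref_len: float) -> float:
    # Standard BLEU brevity penalty: 1 when the hypothesis is at least as
    # long as the reference, exp(1 - r/c) when it is shorter.
    if hyp_len >= ref_len:
        return 1.0
    return math.exp(1.0 - ref_len / hyp_len)

# ref_len taken from the card; hyp_len is an inferred value chosen so that
# the result matches the reported brevity_penalty of 0.961.
bp = brevity_penalty(10158, 10562.0)
print(round(bp, 3))  # 0.961
```

This confirms that the reported BLEU of 17.1 is already discounted for translations that run slightly shorter than the references.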