Model Details
Model Description
<!-- Provide a longer summary of what this model is. -->
This is a machine translation model finetuned from NLLB-200's distilled 1.3B model. It is intended for machine translation of tourism-related data.
- Finetuning code repository: the code used to finetune this model can be found here
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
How to Get Started with the Model
Use the code below to get started with the model.
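The sketch below shows one way to load and run the model with the Hugging Face `transformers` library. The checkpoint name is a placeholder (the card does not state this model's repository ID), and the NLLB-200 language codes `eng_Latn` and `kin_Latn` (English and Kinyarwanda, matching the evaluation directions below) are assumptions.

```python
# Minimal inference sketch for an NLLB-200-based translation model.
# MODEL_ID is a placeholder: it points at the base model, not this finetune.
MODEL_ID = "facebook/nllb-200-distilled-1.3B"
SRC_LANG = "eng_Latn"  # NLLB-200 code for English (assumed)
TGT_LANG = "kin_Latn"  # NLLB-200 code for Kinyarwanda (assumed)


def translate(text: str) -> str:
    # Imported lazily so this file can be loaded without transformers installed.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, src_lang=SRC_LANG)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

    inputs = tokenizer(text, return_tensors="pt")
    # Force the decoder to start with the target-language token, as NLLB expects.
    generated = model.generate(
        **inputs,
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(TGT_LANG),
        max_length=128,
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
```

For example, `translate("Where is the nearest hotel?")` would return the Kinyarwanda translation of that sentence.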
Training Procedure
The model was finetuned on three datasets: a general-purpose dataset, a tourism dataset, and an education dataset.
The model was finetuned in two phases.
Phase one:
- General purpose dataset
- Education dataset
- Tourism dataset
Phase two:
- Tourism dataset
Other than the change of datasets between phase one and phase two, no hyperparameters were modified between the two finetuning runs. In both phases, the model was trained on an A100 40GB GPU for two epochs.
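The two-phase schedule above can be sketched as follows. The dataset names and the `load_dataset`/`finetune` helpers are hypothetical placeholders, since the card does not publish dataset identifiers or hyperparameters; only the phase structure and epoch count come from the description above.

```python
# Sketch of the two-phase finetuning schedule described above.
# Dataset names and the `load_dataset`/`finetune` helpers are hypothetical.
PHASE_ONE = ["general_purpose", "education", "tourism"]  # all three datasets
PHASE_TWO = ["tourism"]  # tourism data only


def run_schedule(load_dataset, finetune, model):
    # Imported lazily so this file can be loaded without `datasets` installed.
    from datasets import concatenate_datasets

    # Phase one: finetune on the concatenation of all three datasets.
    phase_one_data = concatenate_datasets([load_dataset(n) for n in PHASE_ONE])
    model = finetune(model, phase_one_data, num_train_epochs=2)

    # Phase two: same hyperparameters, tourism dataset only.
    phase_two_data = load_dataset(PHASE_TWO[0])
    model = finetune(model, phase_two_data, num_train_epochs=2)
    return model
```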
Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
<!-- This should link to a Data Card if possible. -->
Metrics
Model performance was measured using BLEU, spBLEU, TER, and chrF++ metrics.
Results
| Lang. Direction | BLEU | spBLEU | chrF++ | TER |
|---|---|---|---|---|
| Eng -> Kin | 28.37 | 40.62 | 56.48 | 59.71 |
| Kin -> Eng | 42.54 | 44.84 | 61.54 | 43.87 |
<!-- [More Information Needed] -->