## Model Details

### Model Description
<!-- Provide a longer summary of what this model is. -->
This is a machine translation model finetuned from NLLB-200's distilled 1.3B checkpoint. It is intended for machine translation of education-related data.
- Finetuning code repository: the code used to finetune this model can be found here.
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
## How to Get Started with the Model
Use the code below to get started with the model.
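The card does not include the snippet itself, so the following is a minimal sketch using the Hugging Face `transformers` API. The checkpoint ID below is the base NLLB-200 distilled 1.3B model as a placeholder (substitute the actual finetuned model ID), and English→French is only an example direction; use the FLORES-200 language codes for your pair.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder: replace with the finetuned model's repository ID.
model_name = "facebook/nllb-200-distilled-1.3B"

# src_lang sets the source-language tag prepended by the NLLB tokenizer.
tokenizer = AutoTokenizer.from_pretrained(model_name, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

inputs = tokenizer("Students attend classes every weekday.", return_tensors="pt")

# The target language is selected by forcing its code as the first generated token.
translated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("fra_Latn"),
    max_length=64,
)
print(tokenizer.batch_decode(translated, skip_special_tokens=True)[0])
```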
## Training Procedure
The model was finetuned on three datasets: a general-purpose dataset, a tourism dataset, and an education dataset. Finetuning ran for two epochs on a single A100 40GB GPU.
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data
<!-- This should link to a Data Card if possible. -->
### Metrics
Model performance was measured using BLEU, spBLEU, TER, and chrF++ metrics.