Model Details
Model Description
This is a machine translation model finetuned from NLLB-200's distilled 1.3B checkpoint. It is intended for machine translation of education-related data.
- Finetuning code repository: the code used to finetune this model can be found here
How to Get Started with the Model
Use the code below to get started with the model.
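A minimal sketch for loading and running the model with Hugging Face `transformers`. The checkpoint ID, the language pair, and the example sentence below are placeholders, not the actual finetuned model ID; substitute your own:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder checkpoint ID -- replace with the actual finetuned model.
CHECKPOINT = "nllb-200-distilled-1.3B-finetuned"


def flores_code(lang: str, script: str) -> str:
    """Build a FLORES-200 language token such as 'eng_Latn'."""
    return f"{lang}_{script}"


def translate(text: str, src: str, tgt: str, max_length: int = 128) -> str:
    # NLLB tokenizers take the source language at load time.
    tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT, src_lang=src)
    model = AutoModelForSeq2SeqLM.from_pretrained(CHECKPOINT)
    inputs = tokenizer(text, return_tensors="pt")
    # Force generation to start with the target-language token.
    generated = model.generate(
        **inputs,
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(tgt),
        max_length=max_length,
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
```

For example, `translate("Education is a right.", flores_code("eng", "Latn"), flores_code("fra", "Latn"))` would translate the sentence from English to French.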
Training Procedure
The model was finetuned on three datasets: a general-purpose dataset, a tourism dataset, and an education dataset.
The model was finetuned in two phases.
Phase one:
- General purpose dataset
- Education dataset
- Tourism dataset
Phase two:
- Education dataset
Aside from the change in datasets between phase one and phase two, no hyperparameters were modified. In both phases, the model was trained on a single A100 40GB GPU for two epochs.
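The two-phase schedule above amounts to running the same finetuning routine twice, changing only the training data. A schematic sketch (the dataset names and the `finetune` stand-in are hypothetical, not the actual training code):

```python
# Datasets used in each finetuning phase, as described above.
# Names are placeholders for the actual corpora.
PHASE_DATASETS = {
    1: ["general_purpose", "education", "tourism"],
    2: ["education"],
}


def finetune(checkpoint: str, datasets: list[str]) -> str:
    # Stand-in for a real training run (e.g. with transformers' Trainer);
    # hyperparameters are held constant across phases. Returns the name
    # of the checkpoint produced, which seeds the next phase.
    return checkpoint + "+" + "-".join(datasets)


def run_phases(base: str = "nllb-200-distilled-1.3B") -> str:
    ckpt = base
    for phase in sorted(PHASE_DATASETS):
        ckpt = finetune(ckpt, PHASE_DATASETS[phase])
    return ckpt
```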
Evaluation
Testing Data
Metrics
Model performance was measured using BLEU, spBLEU, and chrF++ metrics.
Results
<!-- [More Information Needed] -->