---
language:
- mr
- en
tags:
- codemix
---

MeBERT-Mixed-v2

MeBERT-Mixed-v2 is a Marathi-English code-mixed BERT model trained on Roman + Devanagari text. It is a MuRIL model fine-tuned on L3Cube-MeCorpus. <br> [dataset link](https://github.com/l3cube-pune/MarathiNLP)
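Below is a minimal usage sketch with the Hugging Face transformers library. The checkpoint id comes from the MeBERT-Mixed-v2 link further down this card; the code-mixed example sentence is illustrative only, and the sketch assumes the standard BERT-style encoder interface.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Checkpoint id as linked in this card; everything else is standard transformers usage.
model_name = "l3cube-pune/me-bert-mixed-v2"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# A Roman-script Marathi-English code-mixed sentence (illustrative example).
text = "he pustak khup chan aahe"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings: (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```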

More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2306.14030).

Other models from the MeBERT family: <br> <a href="https://huggingface.co/l3cube-pune/me-bert"> MeBERT </a> <br> <a href="https://huggingface.co/l3cube-pune/me-roberta"> MeRoBERTa </a> <br>

<a href="https://huggingface.co/l3cube-pune/me-bert-mixed"> MeBERT-Mixed </a> <br> <a href="https://huggingface.co/l3cube-pune/me-bert-mixed-v2"> MeBERT-Mixed-v2 </a> <br> <a href="https://huggingface.co/l3cube-pune/me-roberta-mixed"> MeRoBERTa-Mixed </a> <br>

<a href="https://huggingface.co/l3cube-pune/me-lid-roberta"> MeLID-RoBERTa </a> <br> <a href="https://huggingface.co/l3cube-pune/me-hate-roberta"> MeHate-RoBERTa </a> <br> <a href="https://huggingface.co/l3cube-pune/me-sent-roberta"> MeSent-RoBERTa </a> <br> <a href="https://huggingface.co/l3cube-pune/me-hate-bert"> MeHate-BERT </a> <br> <a href="https://huggingface.co/l3cube-pune/me-lid-bert"> MeLID-BERT </a> <br>

Citation:

@article{chavan2023my,
  title={My Boli: Code-mixed Marathi-English Corpora, Pretrained Language Models and Evaluation Benchmarks},
  author={Chavan, Tanmay and Gokhale, Omkar and Kane, Aditya and Patankar, Shantanu and Joshi, Raviraj},
  journal={arXiv preprint arXiv:2306.14030},
  year={2023}
}