
MahaNews-LPC-BERT

MahaNews-LPC-BERT is a MahaBERT (<a href="https://huggingface.co/l3cube-pune/marathi-bert-v2">l3cube-pune/marathi-bert-v2</a>) model fine-tuned on the full L3Cube-MahaNews-LPC corpus, a Marathi medium-length document / paragraph classification dataset. <br> It is a topic identification and paragraph classification model with 12 output categories. <br> [Dataset link](https://github.com/l3cube-pune/MarathiNLP)
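A minimal usage sketch with the Hugging Face `transformers` library is shown below. The model id is assumed from the links later in this card; the exact label names come from the model's own config and are not listed here.

```python
# Sketch: loading MahaNews-LPC-BERT for Marathi paragraph topic classification.
# Assumes the `transformers` library is installed and the model id below is correct.
from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

model_name = "l3cube-pune/marathi-topic-medium-doc"  # MahaNews-LPC-BERT

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Build a text-classification pipeline; it returns the top predicted
# topic label and its score for each input paragraph.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)

result = classifier("ही एक चाचणी आहे")  # "This is a test" in Marathi
print(result)  # e.g. [{"label": "...", "score": ...}]
```

The same pattern works for the other MahaNews models linked below; only `model_name` changes.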

More details on the dataset, models, and baseline results can be found in our paper (coming soon). <br> Citing:

```
@article{joshi2022l3cube,
  title={L3Cube-MahaNLP: Marathi natural language processing datasets, models, and library},
  author={Joshi, Raviraj},
  journal={arXiv preprint arXiv:2205.14728},
  year={2022}
}
```

Other Marathi topic classification models from the MahaNews family are shared here:<br>

<a href="https://huggingface.co/l3cube-pune/marathi-topic-long-doc"> MahaNews-LDC-BERT (long documents) </a> <br> <a href="https://huggingface.co/l3cube-pune/marathi-topic-short-doc"> MahaNews-SHC-BERT (short text) </a> <br> <a href="https://huggingface.co/l3cube-pune/marathi-topic-medium-doc"> MahaNews-LPC-BERT (medium paragraphs) </a> <br> <a href="https://huggingface.co/l3cube-pune/marathi-topic-all-doc"> MahaNews-All-BERT (all document lengths) </a> <br>