Bert Online and Parliament Discussions (bert-discussions-online-parliament-en)
This model is a fine-tuned version of the BERT base model, which was introduced in this paper.
Model description
The BERT base language model was fine-tuned on a sample of three corpora:
- the Webis-CMV-20 corpus
- the args.me corpus
- the Europarl corpus
The model language is English (en). In total, the model was fine-tuned on a sample of 3,545,498 sentences.