Bert Online and Parliament Discussions (bert-discussions-online-parliament-en)

This model is a fine-tuned version of the BERT base model. It was introduced in this paper.

Model description

The BERT base language model was fine-tuned on a sample drawn from three corpora:

The language is English (en). The model was trained on a sample of 3,545,498 sentences in total.
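
How to use

A minimal usage sketch with the Hugging Face transformers library, assuming the model is published under the repository ID bert-discussions-online-parliament-en and retains a masked-language-modelling head from fine-tuning; adjust the ID and model class to match the actual checkpoint.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Repository ID assumed from the card title; replace with the actual path if it differs.
model_id = "bert-discussions-online-parliament-en"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill a masked token in an English sentence close to the training domains
# (online discussions and parliamentary debate).
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("The debate in parliament was [MASK]."))
```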