Bert Online Discussions (bert-web-discussions-en)

This model is a fine-tuned version of the BERT base model, which was introduced in this paper.

Model description

The BERT base language model was fine-tuned on the Webis-CMV-20 and args.me corpora, using a sample of 2,469,026 sentences in total.
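Since this is a fine-tuned BERT masked language model, it can be loaded with the `transformers` `fill-mask` pipeline. The snippet below is a minimal sketch; the model identifier `bert-web-discussions-en` is taken from the title of this card and is an assumption about the actual repository path.

```python
from transformers import pipeline

# Assumed repo id, taken from this card's title; adjust to the
# actual Hugging Face path of the model if it differs.
fill_mask = pipeline("fill-mask", model="bert-web-discussions-en")

# Predict the masked token in a discussion-style sentence.
predictions = fill_mask("I think the [MASK] argument is more convincing.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

By default the pipeline returns the five highest-scoring completions, each with its predicted token and probability.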