The RSE-RoBERTa-large-10-rel model is trained on the following 10 relations:

  1. entailment
  2. contradiction
  3. neutral
  4. duplicate_question
  5. non_duplicate_question
  6. paraphrase
  7. same_caption
  8. qa_entailment
  9. qa_not_entailment
  10. same_sent

The model is initialized from RoBERTa-large.

The resulting model can be used to infer any of the ten relations between a pair of sentences.
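
Below is a minimal sketch of loading the checkpoint with Hugging Face Transformers and comparing two sentences. The model ID used here is a hypothetical placeholder, and mean pooling with cosine similarity is only a rough proxy; inferring the specific relation labels typically requires the relation-scoring head from the RSE codebase.

```python
# Sketch only: the Hub ID below is an assumption, and the exact pooling /
# relation head used by RSE may differ from this generic approach.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "RSE-RoBERTa-large-10-rel"  # hypothetical model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

sentences = [
    "A man is playing a guitar.",
    "Someone is playing an instrument.",
]

inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token embeddings into one vector per sentence (a common choice,
# not necessarily the pooling used in the RSE paper).
mask = inputs["attention_mask"].unsqueeze(-1)
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)

# Cosine similarity as a rough indicator of sentence relatedness.
similarity = torch.nn.functional.cosine_similarity(
    embeddings[0], embeddings[1], dim=0
)
print(f"Cosine similarity: {similarity.item():.4f}")
```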