RSE-BERT-large-USEB is trained with 6 relation types:
- entailment
- duplicate_question
- paraphrase
- same_caption
- qa_entailment
- same_sent
The model is initialized from BERT-large-uncased.
It is intended primarily for use on the USEB datasets.
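
Below is a minimal sketch of how such a checkpoint could be loaded and used to compare two sentences with the Hugging Face `transformers` library. The model identifier `binwang/RSE-BERT-large-USEB` and the mean-pooling step are assumptions for illustration, not the authors' prescribed usage; check the released checkpoint and the RSE repository for the official inference code.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Hypothetical checkpoint name; replace with the actual released identifier.
MODEL_NAME = "binwang/RSE-BERT-large-USEB"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

sentences = [
    "A man is playing a guitar.",
    "Someone is playing an instrument.",
]

inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token embeddings (ignoring padding) to get sentence vectors.
mask = inputs["attention_mask"].unsqueeze(-1).float()
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)

# Cosine similarity between the two sentence embeddings.
similarity = torch.nn.functional.cosine_similarity(
    embeddings[0:1], embeddings[1:2]
).item()
print(f"cosine similarity: {similarity:.4f}")
```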