albert-small-kor-cross-encoder-v1

Training

| Model | korsts | klue-sts | glue(stsb) | stsb_multi_mt(en) |
|---|---|---|---|---|
| albert-small-kor-cross-encoder-v1 | 0.8455 | 0.8526 | 0.8513 | 0.7976 |
| klue-cross-encoder-v1 | 0.8262 | 0.8833 | 0.8512 | 0.7889 |
| kpf-cross-encoder-v1 | 0.8799 | 0.9133 | 0.8626 | 0.8027 |

Usage and Performance

The pre-trained model can be used with the sentence_transformers CrossEncoder class like this:

```python
from sentence_transformers import CrossEncoder

model = CrossEncoder('bongsoo/albert-small-kor-cross-encoder-v1')

# Score two Korean sentence pairs for semantic similarity
scores = model.predict([('오늘 날씨가 좋다', '오늘 등산을 한다'), ('오늘 날씨가 흐리다', '오늘 비가 내린다')])
print(scores)
# [0.45417202 0.6294121 ]
```

The model predicts one similarity score per (sentence A, sentence B) pair; a higher score means the two sentences are more semantically similar.

You can also use this model without sentence_transformers, loading it directly with the Transformers AutoModel classes.
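Below is a minimal sketch of that approach. It assumes the checkpoint loads as a single-logit sequence-classification model (the layout sentence_transformers uses for cross-encoders) and applies a sigmoid to mirror CrossEncoder's default activation for single-label models; these are assumptions, not details stated in this card:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = 'bongsoo/albert-small-kor-cross-encoder-v1'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

pairs = [('오늘 날씨가 좋다', '오늘 등산을 한다'),
         ('오늘 날씨가 흐리다', '오늘 비가 내린다')]

# Tokenize each pair jointly so the cross-encoder attends over both sentences at once
features = tokenizer([p[0] for p in pairs], [p[1] for p in pairs],
                     padding=True, truncation=True, return_tensors='pt')

with torch.no_grad():
    logits = model(**features).logits

# Assumption: a single regression logit per pair; sigmoid maps it to the 0-1 range
# returned by CrossEncoder.predict in the example above
scores = torch.sigmoid(logits.squeeze(-1))
print(scores)
```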