rut5-base-detox-v2
The model was fine-tuned from sberbank-ai/ruT5-base on a parallel detoxification corpus.
- Task: text2text generation
- Type: encoder-decoder
- Tokenizer: bpe
- Dict size: 32 101
- Num Parameters: 222 M
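Since this is a text2text generation model, it can be driven with the standard `transformers` seq2seq API. The sketch below is a minimal, hypothetical usage example: the Hub repo id is an assumption (the card does not state where the checkpoint is published), and the generation settings (beam search, `max_new_tokens`) are illustrative defaults, not values from the card.

```python
# Hypothetical usage sketch for rut5-base-detox-v2.
# MODEL_ID is an assumption -- replace it with the actual Hub repo id.
MODEL_ID = "rut5-base-detox-v2"

def detoxify(text: str, model_id: str = MODEL_ID) -> str:
    """Rewrite a toxic Russian sentence into a neutral paraphrase."""
    # Imports are kept inside the function so this file can be imported
    # and read without transformers installed.
    from transformers import AutoTokenizer, T5ForConditionalGeneration

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = T5ForConditionalGeneration.from_pretrained(model_id)

    inputs = tokenizer(text, return_tensors="pt")
    # Beam search usually yields more fluent paraphrases than greedy decoding.
    output_ids = model.generate(**inputs, num_beams=5, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(detoxify("Введите здесь токсичное предложение"))
```

The function returns a single detoxified paraphrase; for multiple candidates, `num_return_sequences` can be passed to `generate` alongside `num_beams`.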