# ruT5-base
The model was trained by SberDevices.
- Task: text2text generation
- Type: encoder-decoder
- Tokenizer: BPE
- Dict size: 32 101
- Num Parameters: 222 M
- Training Data Volume: 300 GB
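The card lists a BPE tokenizer with a 32 101-entry dictionary. As a toy illustration of how BPE builds such a dictionary (repeatedly merging the most frequent adjacent symbol pair), here is a minimal pure-Python sketch; it is not the model's actual tokenizer, and the tiny corpus below is invented for illustration:

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across a word-frequency corpus."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0]

def merge_pair(words, pair):
    """Replace every occurrence of `pair` with one merged symbol."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Toy corpus: word -> frequency, each word split into characters.
corpus = {tuple("low"): 5, tuple("lower"): 2,
          tuple("newest"): 6, tuple("widest"): 3}
# Three merge steps; the real ruT5 dictionary results from ~32k such merges.
for _ in range(3):
    corpus = merge_pair(corpus, most_frequent_pair(corpus))
print(sorted(corpus))
```

After three merges the corpus segments into subwords such as `est` and `lo`, showing how frequent character sequences become single dictionary entries.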
## Authors
- NLP core team RnD Telegram channel:
  - Dmitry Zmitrovich