ruT5-base
The model was trained by SberDevices.
- Task: text2text generation
- Type: encoder-decoder
- Tokenizer: BPE
- Dict size: 32 101
- Num Parameters: 222 M
- Training Data Volume: 300 GB
Authors
- NLP core team RnD Telegram channel
- Dmitry Zmitrovich