ruBert-large
Model was trained by SberDevices team.
- Task: mask filling
- Type: encoder
- Tokenizer: BPE
- Dict size: 120 138
- Num Parameters: 427 M
- Training Data Volume: 30 GB
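A minimal usage sketch for the mask-filling task listed above, using the Hugging Face `transformers` library. The model id `ai-forever/ruBert-large` is an assumption (the card itself does not state the Hub id); substitute the actual repository name if it differs.

```python
# Minimal mask-filling sketch; the model id below is an assumption,
# not confirmed by this card.
from transformers import pipeline

MODEL_ID = "ai-forever/ruBert-large"  # assumed Hub id
text = "Москва - [MASK] России."      # BERT-style mask token

# fill = pipeline("fill-mask", model=MODEL_ID)
# for pred in fill(text):
#     print(pred["token_str"], pred["score"])
```

The `fill-mask` pipeline returns the top candidate tokens for the `[MASK]` position with their scores; the pipeline call is commented out here because it downloads the full 427 M-parameter checkpoint.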
Authors
- NLP core team RnD Telegram channel:
- Dmitry Zmitrovich