ruBert-base
The model was trained by the SberDevices team.
- Task: mask filling
- Type: encoder
- Tokenizer: BPE
- Dict size: 120,138
- Num parameters: 178 M
- Training data volume: 30 GB
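Since the model is an encoder trained for mask filling, it can be queried with the Hugging Face `fill-mask` pipeline. A minimal sketch, assuming the model is published on the Hub under the id `ai-forever/ruBert-base` (verify the exact id on the model page):

```python
from transformers import pipeline

# Hypothetical usage sketch: the Hub id "ai-forever/ruBert-base" is an
# assumption; substitute the actual model id if it differs.
fill = pipeline("fill-mask", model="ai-forever/ruBert-base")

# BERT-style encoders use the [MASK] token for the position to predict.
preds = fill("Столица России — [MASK].")

# Each prediction carries the filled token and its probability.
for p in preds:
    print(p["token_str"], round(p["score"], 3))
```

By default the pipeline returns the top five candidate tokens ranked by score.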
Authors
- NLP core team RnD Telegram channel:
  - Dmitry Zmitrovich