A Romanian BERT model, initialized from bert-base-romanian-cased-v1 and further pretrained for 24 hours on the MARCELL v2.0 corpus of Romanian legal documents, following the principles of "How to Train BERT with an Academic Budget" by Peter Izsak, Moshe Berchansky, and Omer Levy (2021).
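
A minimal usage sketch with the Hugging Face transformers library, assuming the model is published on the Hub; the repository ID below is a placeholder, not the model's actual name:

```python
from transformers import pipeline

# Placeholder Hub ID -- replace with the actual model repository name.
fill_mask = pipeline("fill-mask", model="your-org/romanian-legal-bert")

# A Romanian legal sentence with the tokenizer's mask token inserted.
text = f"Prezenta {fill_mask.tokenizer.mask_token} intră în vigoare la data publicării."

# Print the top predictions for the masked token with their scores.
for pred in fill_mask(text):
    print(f"{pred['token_str']:>15}  {pred['score']:.3f}")
```

Since the model was pretrained with a masked-language-modeling objective, the fill-mask pipeline is the most direct way to probe it; for downstream legal NLP tasks it would typically be loaded with AutoModelForSequenceClassification or AutoModelForTokenClassification and fine-tuned.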