# CYBERT

A BERT-family model dedicated to the cyber security domain. The model was trained on a corpus of high-quality cyber security and computer science text and is unlikely to work well outside this domain.
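
Below is a minimal usage sketch with the Hugging Face `transformers` library. The model identifier is a placeholder (substitute the actual repository path of the published checkpoint), and the example prompt is purely illustrative.

```python
# Hedged sketch: load the checkpoint for masked-token prediction.
# "CYBERT" is a placeholder model identifier, not a confirmed repository name.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "CYBERT"  # placeholder (assumption)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# RoBERTa-style models use "<mask>" as the mask token.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill_mask("The attacker used a <mask> injection to exfiltrate data."):
    print(prediction["token_str"], prediction["score"])
```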

## Model architecture

The model uses the original RoBERTa architecture, and the training corpus is tokenized with a byte-level BPE tokenizer.
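
A hedged sketch of that setup using the `tokenizers` and `transformers` libraries is shown below; the corpus file path, vocabulary size, and other hyperparameters are illustrative assumptions, not the values used for the released model.

```python
# Sketch: train a byte-level BPE tokenizer on the corpus and pair it with
# a RoBERTa masked-LM. All file names and hyperparameters are assumptions.
from tokenizers import ByteLevelBPETokenizer
from transformers import RobertaConfig, RobertaForMaskedLM

# Train the byte-level tokenizer on the raw corpus (placeholder file path).
tokenizer = ByteLevelBPETokenizer()
tokenizer.train(
    files=["cyber_corpus.txt"],          # placeholder corpus file
    vocab_size=52_000,                   # assumed vocabulary size
    min_frequency=2,
    special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"],
)
tokenizer.save_model("cybert-tokenizer")  # writes vocab.json and merges.txt

# Instantiate a RoBERTa model with a standard configuration for pre-training.
config = RobertaConfig(vocab_size=52_000)
model = RobertaForMaskedLM(config)
```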

## Hardware

The model was trained on an NVIDIA GPU (driver version 510.54, as reported by `nvidia-smi`).