Model Description

ClinicalDistilBERT was developed by further training the BioDistilBERT-cased model in a continual-learning fashion for 3 epochs, with a total batch size of 192, on clinical notes from the MIMIC-III dataset.
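A minimal usage sketch is shown below, assuming the model is published on the Hugging Face Hub under the ID nlpie/clinical-distilbert (substitute the actual repository ID if it differs):

from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "nlpie/clinical-distilbert"  # assumed Hub ID; replace with the actual repository
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill-mask inference on a clinical-style sentence
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("The patient was given [MASK] to control the infection."))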

Initialisation

We initialise our model from the pre-trained checkpoint of the BioDistilBERT-cased model, available on the Hugging Face Hub.
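The sketch below illustrates this initialisation step, assuming the base checkpoint is hosted under the ID nlpie/bio-distilbert-cased (an assumption; substitute the actual repository ID). Continued masked-language-model pre-training on MIMIC-III notes would then start from these weights, for example with the Hugging Face Trainer and a standard MLM data collator:

from transformers import AutoTokenizer, AutoModelForMaskedLM, DataCollatorForLanguageModeling

base_id = "nlpie/bio-distilbert-cased"  # assumed Hub ID for the base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForMaskedLM.from_pretrained(base_id)  # weights used as the starting point

# Standard MLM collator (15% masking) for continued pre-training on MIMIC-III notes
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)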

Architecture

In this model, the hidden dimension and the embedding size are both set to 768, the vocabulary size is 28996, the number of transformer layers is 6, and the expansion rate of the feed-forward layer is 4. Overall, the model has around 65 million parameters.
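These settings correspond to a standard DistilBERT configuration; the sketch below reconstructs it and checks the resulting parameter count (the number of attention heads is not stated in this card and is assumed to be 12, the DistilBERT default):

from transformers import DistilBertConfig, DistilBertForMaskedLM

config = DistilBertConfig(
    vocab_size=28996,    # as stated above
    dim=768,             # hidden / embedding dimension
    n_layers=6,          # transformer layers
    n_heads=12,          # assumed DistilBERT default; not stated in this card
    hidden_dim=4 * 768,  # feed-forward expansion rate of 4
)
model = DistilBertForMaskedLM(config)
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")  # roughly 65-66M, in line with the figure above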

Citation

If you use this model, please consider citing the following paper:

@misc{rohanian2023lightweight,
  doi = {10.48550/arXiv.2302.04725},
  url = {https://arxiv.org/abs/2302.04725},
  author = {Rohanian, Omid and Nouriborji, Mohammadmahdi and Jauncey, Hannah and Kouchaki, Samaneh and {ISARIC Clinical Characterisation Group} and Clifton, Lei and Merson, Laura and Clifton, David A.},
  keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), Machine Learning (cs.LG), FOS: Computer and information sciences, I.2.7, 68T50},
  title = {Lightweight Transformers for Clinical Natural Language Processing},
  publisher = {arXiv},
  year = {2023},
  copyright = {arXiv.org perpetual, non-exclusive license}
}