
# ARMHuBERT Model Card

This repo contains the models from our INTERSPEECH 2023 paper, *Recycle-and-Distill: Universal Compression Strategy for Transformer-based Speech SSL Models with Attention Map Reusing and Masking Distillation*.

## Model details

**Model type:** ARMHuBERT is an open-source speech SSL model distilled from HuBERT-Base via attention map reusing and masking distillation. We also provide checkpoints for MaskHuBERT (masking distillation without attention map reusing) and ARMwavLM (distilled from wavLM-Base). A minimal checkpoint-loading sketch appears at the end of this card.

**License:** Apache 2.0

**Where to send questions or comments about the model:** https://github.com/sungnyun/ARMHuBERT/issues

## Training dataset

**Pretraining data:** LibriSpeech


More details are available in our GitHub repository: https://github.com/sungnyun/ARMHuBERT.
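
As a quick sanity check on a downloaded checkpoint, here is a minimal sketch using plain PyTorch. The filename `armhubert-960.ckpt` and the state-dict layout are assumptions for illustration, not the repo's documented API; refer to the GitHub repository above for the actual loading and fine-tuning code.

```python
# Minimal sketch: inspect a downloaded ARMHuBERT checkpoint with PyTorch.
# The filename and key layout below are assumptions for illustration;
# see https://github.com/sungnyun/ARMHuBERT for the actual loading code.
import torch

ckpt_path = "armhubert-960.ckpt"  # hypothetical filename

# Load on CPU so no GPU is needed just to inspect the file.
ckpt = torch.load(ckpt_path, map_location="cpu")

# Checkpoints are typically dicts; list the top-level keys and,
# if a state dict is present, the shapes of a few parameters.
if isinstance(ckpt, dict):
    print("top-level keys:", list(ckpt)[:10])
    state = ckpt.get("state_dict", ckpt)
    if isinstance(state, dict):
        for name, tensor in list(state.items())[:5]:
            shape = tuple(tensor.shape) if hasattr(tensor, "shape") else type(tensor)
            print(name, shape)
```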