German-English Code-Switching Identification

TongueSwitcher is a BERT model fine-tuned for fine-grained identification of German-English code-switching. It was introduced in the paper cited below. This model is case-sensitive.

Overview

Hyperparameters

batch_size = 16
epochs = 3
n_steps = 789
max_seq_len = 512
learning_rate = 3e-5
weight_decay = 0.01
adam_betas = (0.9, 0.999)
lr_schedule = LinearWarmup
seed = 2021
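The learning-rate schedule above can be sketched in plain Python. This is a minimal illustration of a linear-warmup-then-linear-decay schedule over the stated 789 steps with a peak rate of 3e-5; the warmup fraction is not given in this card, so the 10% used here is an assumption for illustration only.

```python
# Sketch of the LinearWarmup schedule implied by the hyperparameters above.
PEAK_LR = 3e-5
TOTAL_STEPS = 789
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)  # warmup fraction assumed, not from the card

def lr_at(step: int) -> float:
    """Linear warmup from 0 to PEAK_LR, then linear decay back to 0."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / max(1, WARMUP_STEPS)
    return PEAK_LR * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))
```

For example, `lr_at(0)` is 0, `lr_at(WARMUP_STEPS)` is the peak rate, and the rate falls back to 0 at the final step.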

Authors

Igor Sterner and Simone Teufel

BibTeX entry and citation info

@inproceedings{sterner2023tongueswitcher,
  author    = {Igor Sterner and Simone Teufel},
  title     = {TongueSwitcher: Fine-Grained Identification of German-English Code-Switching},
  booktitle = {Sixth Workshop on Computational Approaches to Linguistic Code-Switching},
  publisher = {Association for Computational Linguistics},
  year      = {2023},
}