
distilbert-clf-20newsgroups

This model is a fine-tuned version of distilbert-base-uncased for text classification on the 20 Newsgroups dataset.

Model description

Intended uses & limitations

Training and evaluation data
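The exact split and preprocessing used for this checkpoint are not documented. As a sketch of one common setup, scikit-learn ships the 20 Newsgroups corpus with its standard train/test split, which can be tokenized with the distilbert-base-uncased tokenizer; treat the details below as assumptions.

```python
# Sketch of obtaining 20 Newsgroups with its standard train/test split via
# scikit-learn; the split and preprocessing actually used for this checkpoint
# are not documented, so treat this as an assumption.
from sklearn.datasets import fetch_20newsgroups
from transformers import AutoTokenizer

train = fetch_20newsgroups(subset="train")  # 11,314 documents
test = fetch_20newsgroups(subset="test")    # 7,532 documents

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
encodings = tokenizer(
    train.data[:8],            # small slice just to show the call
    truncation=True,
    padding=True,
    max_length=512,
)

print(train.target_names[train.target[0]])  # human-readable newsgroup label
```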

Training procedure

Training hyperparameters

The hyperparameters used during training were not recorded in this card.
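As an illustration only, the sketch below shows a typical Keras fine-tuning configuration for DistilBERT in the style of the transformers TensorFlow examples; every value in it (learning rate, batch size, epoch count, warmup steps) is an assumption rather than the configuration actually used.

```python
# Illustrative only: a typical compile step for fine-tuning DistilBERT on
# 20 Newsgroups with Keras. Every numeric value here is an assumption.
import tensorflow as tf
from transformers import TFAutoModelForSequenceClassification, create_optimizer

num_labels = 20
batch_size = 16                         # assumed
num_epochs = 3                          # assumed
steps_per_epoch = 11314 // batch_size   # size of the standard training split

model = TFAutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=num_labels
)

# create_optimizer returns an AdamW-style optimizer with a linear learning
# rate decay, the default in the transformers TensorFlow examples.
optimizer, _ = create_optimizer(
    init_lr=2e-5,                               # assumed
    num_train_steps=steps_per_epoch * num_epochs,
    num_warmup_steps=0,                         # assumed
)

model.compile(
    optimizer=optimizer,
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
# model.fit(train_dataset, validation_data=eval_dataset, epochs=num_epochs)
```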

Training results

Framework versions