---
tags:
- generated_from_keras_callback
---


# distilbert_classifier_newsgroup

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the 20 Newsgroups dataset, a collection of approximately 20,000 newsgroup documents partitioned (nearly) evenly across 20 different newsgroups. It achieves the following results on the evaluation set:
- Loss: 0.5660
- Accuracy: 0.8371
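
A minimal inference sketch is shown below. The model path `"distilbert_classifier_newsgroup"` is a placeholder for wherever the checkpoint actually lives (a local directory or a Hub repo id); adjust it to your setup.

```python
# Minimal inference sketch; the model path below is a placeholder, not a confirmed Hub repo id.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

model_name = "distilbert_classifier_newsgroup"  # hypothetical local path or Hub repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = TFAutoModelForSequenceClassification.from_pretrained(model_name)

text = "NASA announced a new mission to study the outer planets."
inputs = tokenizer(text, return_tensors="tf", truncation=True, padding=True)
logits = model(**inputs).logits
predicted_class = int(tf.math.argmax(logits, axis=-1)[0])
print(predicted_class)  # index into the 20 newsgroup labels
```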

## Training and evaluation data
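
The card does not record how the corpus was prepared. A common way to obtain 20 Newsgroups is scikit-learn's built-in fetcher; the sketch below assumes the standard train/test split was used as the training and evaluation data, which is an assumption rather than a documented fact.

```python
# Loading 20 Newsgroups via scikit-learn (an assumed preparation step, not confirmed by this card).
from sklearn.datasets import fetch_20newsgroups

train = fetch_20newsgroups(subset="train", remove=("headers", "footers", "quotes"))
test = fetch_20newsgroups(subset="test", remove=("headers", "footers", "quotes"))

train_texts, train_labels = train.data, train.target  # ~11k documents
eval_texts, eval_labels = test.data, test.target      # ~7.5k documents
print(len(train_texts), len(eval_texts), len(train.target_names))  # 20 classes
```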

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
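
The hyperparameter list was not captured in this card. The sketch below shows a typical Keras fine-tuning setup for DistilBERT on this task; the learning rate, batch size, and epoch count are illustrative placeholders, not the values actually used to train this model.

```python
# Generic fine-tuning sketch; all hyperparameter values here are placeholders.
import tensorflow as tf
from sklearn.datasets import fetch_20newsgroups
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

train = fetch_20newsgroups(subset="train", remove=("headers", "footers", "quotes"))

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = TFAutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=20
)

# Tokenize and build a tf.data pipeline
encodings = tokenizer(train.data, truncation=True, padding=True, max_length=256, return_tensors="tf")
dataset = tf.data.Dataset.from_tensor_slices((dict(encodings), train.target)).batch(16)

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=5e-5),  # placeholder learning rate
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(dataset, epochs=3)  # placeholder epoch count
```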

### Training results

### Framework versions