This model card was generated automatically from a Keras training callback.

distilbert_finetuned_newsgroups

This model is a fine-tuned version of distilbert-base-uncased on the 20 Newsgroups dataset.

Training procedure

10% of the training set was held out and used as the validation set.
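The 10% validation split can be sketched with scikit-learn's train_test_split; the documents and labels below are placeholders standing in for the actual 20 Newsgroups training data.

```python
# Hold out 10% of the training set as a validation set.
from sklearn.model_selection import train_test_split

# Placeholder data: the real inputs are the 20 Newsgroups training texts/labels.
texts = [f"document {i}" for i in range(100)]
labels = [i % 20 for i in range(100)]  # 20 Newsgroups has 20 classes

train_texts, val_texts, train_labels, val_labels = train_test_split(
    texts, labels, test_size=0.1, random_state=42
)

print(len(train_texts), len(val_texts))  # 90 10
```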

Training hyperparameters

The following hyperparameters were used during training:

Training results

The model achieves 83.13% accuracy on the test set.
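The reported accuracy is simply the fraction of test examples whose predicted label matches the true label; a minimal sketch, using tiny stand-in arrays rather than the model's actual predictions:

```python
import numpy as np

# Stand-in labels; the real arrays come from the 20 Newsgroups test set
# and the model's predictions over it.
y_true = np.array([0, 1, 2, 2, 1])
y_pred = np.array([0, 1, 2, 0, 1])

# Accuracy = (number of correct predictions) / (number of test examples).
accuracy = float(np.mean(y_pred == y_true))
print(f"{accuracy:.2%}")  # 80.00%
```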

Framework versions