# Shmendel/distilbert-base-uncased-finetuned-dt-rally-speeches

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on a dataset of rally speeches that is not otherwise documented. It achieves the following results on the evaluation set:
- Train Loss: 1.7297
- Validation Loss: 1.5226
- Epoch: 9
 
## Model description
More information needed

## Intended uses & limitations
More information needed
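
As a minimal usage sketch (assuming the checkpoint carries a masked-language-modeling head, which is the typical setup for this kind of DistilBERT fine-tuning; the card does not state the task), the model can be loaded from the Hub like this:

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForMaskedLM

# Assumption: the checkpoint is a masked-language-modeling head (not stated in the card).
model_id = "Shmendel/distilbert-base-uncased-finetuned-dt-rally-speeches"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForMaskedLM.from_pretrained(model_id)

# Predict the most likely token for a single [MASK] position.
inputs = tokenizer("The crowd at the rally was [MASK].", return_tensors="tf")
logits = model(**inputs).logits
mask_pos = int(tf.where(inputs["input_ids"][0] == tokenizer.mask_token_id)[0][0])
print(tokenizer.decode([int(tf.argmax(logits[0, mask_pos]))]))
```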

## Training and evaluation data
More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'WarmUp', 'config': {'initial_learning_rate': 2e-05, 'decay_schedule_fn': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': -750, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'passive_serialization': True}, 'warmup_steps': 1000, 'power': 1.0, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
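
For reference, here is a sketch of how an equivalent optimizer and schedule could be rebuilt with `transformers.create_optimizer`. The step counts are inferred from the serialized config above; the negative `decay_steps` of -750 simply means the implied total step count (250) is smaller than the 1000 warmup steps.

```python
from transformers import create_optimizer

# Sketch only: values inferred from the serialized AdamWeightDecay/WarmUp config above.
# create_optimizer sets decay_steps = num_train_steps - num_warmup_steps,
# so decay_steps = -750 corresponds to num_train_steps = 250 with 1000 warmup steps.
optimizer, lr_schedule = create_optimizer(
    init_lr=2e-5,
    num_train_steps=250,
    num_warmup_steps=1000,
    weight_decay_rate=0.01,
)
```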
 
### Training results

| Train Loss | Validation Loss | Epoch | 
|---|---|---|
| 1.8323 | 1.7319 | 0 | 
| 1.8230 | 1.7765 | 1 | 
| 1.8213 | 1.7305 | 2 | 
| 1.8191 | 1.6551 | 3 | 
| 1.8031 | 1.6772 | 4 | 
| 1.7946 | 1.6388 | 5 | 
| 1.7777 | 1.6475 | 6 | 
| 1.7552 | 1.6849 | 7 | 
| 1.7392 | 1.6441 | 8 | 
| 1.7297 | 1.5226 | 9 | 
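
As a rough point of reference (assuming the reported values are mean token-level cross-entropy in nats, as Keras reports for masked-language-modeling loss), the final validation loss corresponds to a perplexity of about 4.6:

```python
import math

# Assumption: the loss is mean token-level cross-entropy in nats.
print(math.exp(1.5226))  # ≈ 4.58
```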

### Framework versions
- Transformers 4.25.1
- TensorFlow 2.7.0
- Datasets 2.11.0
- Tokenizers 0.13.2