# NajiAboo/prognosis-distilbert-base-uncased-finetuned-cardio-qa

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset. It achieves the following results after the final training epoch:
- Train Loss: 0.1314
- Train End Logits Accuracy: 0.9575
- Train Start Logits Accuracy: 0.9591
- Validation Loss: 1.5573
- Validation End Logits Accuracy: 0.7503
- Validation Start Logits Accuracy: 0.7457
- Epoch: 7
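
The start/end logits accuracies above measure how often the model's highest-scoring start and end tokens match the gold answer span. As a minimal illustrative sketch (not the card author's code), this is how a QA head's start and end logits are typically decoded into a single answer span:

```python
import math

def extract_answer_span(start_logits, end_logits, max_answer_len=30):
    """Pick the (start, end) token pair with the highest combined logit
    score, subject to start <= end and a maximum answer length."""
    best_score, best_span = -math.inf, (0, 0)
    for s, s_logit in enumerate(start_logits):
        # Only consider end positions at or after the start, within the cap.
        window = end_logits[s : s + max_answer_len]
        for offset, e_logit in enumerate(window):
            score = s_logit + e_logit
            if score > best_score:
                best_score, best_span = score, (s, s + offset)
    return best_span

# Toy logits: the strongest span is tokens 2..3
start = [0.1, 0.2, 5.0, 0.3]
end = [0.0, 0.1, 0.2, 4.0]
print(extract_answer_span(start, end))  # -> (2, 3)
```

In practice the selected token indices are mapped back to the original context text with the tokenizer's offset mapping.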
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 494400, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
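
With `power=1.0` and `cycle=False`, the `PolynomialDecay` schedule above is a linear ramp from 2e-05 down to 0 over 494,400 steps. A small plain-Python sketch of the schedule (values taken from the config above, not from the Keras source):

```python
def polynomial_decay(step, initial_lr=2e-05, end_lr=0.0,
                     decay_steps=494_400, power=1.0):
    """Learning rate at a given step; power=1.0 makes the decay linear."""
    step = min(step, decay_steps)  # hold at end_lr once decay is done
    frac = 1.0 - step / decay_steps
    return end_lr + (initial_lr - end_lr) * frac ** power

print(polynomial_decay(0))        # 2e-05 at the start of training
print(polynomial_decay(247_200))  # 1e-05 at the halfway point
print(polynomial_decay(494_400))  # 0.0 at the final step
```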
### Training results
| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Validation Loss | Validation End Logits Accuracy | Validation Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:---------------:|:------------------------------:|:--------------------------------:|:-----:|
| 1.4969 | 0.6659 | 0.6647 | 1.0266 | 0.7456 | 0.7428 | 0 |
| 0.8820 | 0.7667 | 0.7702 | 0.9726 | 0.7573 | 0.7542 | 1 |
| 0.6269 | 0.8266 | 0.8287 | 1.0440 | 0.7601 | 0.7528 | 2 |
| 0.4406 | 0.8711 | 0.8748 | 1.0837 | 0.7590 | 0.7540 | 3 |
| 0.2999 | 0.9087 | 0.9122 | 1.1957 | 0.7572 | 0.7510 | 4 |
| 0.2168 | 0.9317 | 0.9347 | 1.4545 | 0.7465 | 0.7428 | 5 |
| 0.1623 | 0.9485 | 0.9501 | 1.4684 | 0.7560 | 0.7529 | 6 |
| 0.1314 | 0.9575 | 0.9591 | 1.5573 | 0.7503 | 0.7457 | 7 |
### Framework versions
- Transformers 4.29.2
- TensorFlow 2.12.0
- Datasets 2.12.0
- Tokenizers 0.13.3