# Tweets disaster detection model

This model was trained on part of the Disaster Tweet Corpus 2020 (Wiegmann, M. et al., "Analysis of Filtering Models for Disaster-Related Tweets", 2020). It achieves the following results:
- Train Loss: 0.1400
- Train Accuracy: 0.9516
- Validation Loss: 0.1995
- Validation Accuracy: 0.9324
- Epoch: 2
## Model description

Labels:
- 0: not disaster
- 1: disaster
## Training hyperparameters
The following hyperparameters were used during training:
- optimizer and learning-rate schedule (created with `create_optimizer` from `transformers`):

```python
batch_size = 16
num_epochs = 5
batches_per_epoch = len(tokenized_tweet["train"]) // batch_size
total_train_steps = int(batches_per_epoch * num_epochs)
optimizer, schedule = create_optimizer(
    init_lr=2e-5,
    num_warmup_steps=0,
    num_train_steps=total_train_steps,
)
```
- training_precision: float32
## Framework versions
- Transformers 4.16.2
- TensorFlow 2.9.2
- Datasets 2.4.0
- Tokenizers 0.12.1
## How to use it

```python
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("sacculifer/dimbat_disaster_distilbert")
model = TFAutoModelForSequenceClassification.from_pretrained("sacculifer/dimbat_disaster_distilbert")
```