# NewMergedSpamModelFinal
This model is a fine-tuned version of [aubmindlab/bert-base-arabert](https://huggingface.co/aubmindlab/bert-base-arabert) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.0000
- Accuracy: 1.0
## Model description

Based on the model name and the base checkpoint, this appears to be an AraBERT-based classifier fine-tuned for Arabic spam detection. No further details were documented.
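A minimal inference sketch, assuming the model is published on the Hugging Face Hub; the repo id `your-username/NewMergedSpamModelFinal` and the output label names are placeholders, not confirmed by this card. Requires `transformers` and `torch`.

```python
# Hypothetical usage sketch for this fine-tuned AraBERT classifier.
# The repo id below is a placeholder; substitute the actual Hub path.
from transformers import pipeline


def classify(text: str, model_id: str = "your-username/NewMergedSpamModelFinal"):
    """Score a single Arabic string with the fine-tuned spam classifier.

    Returns a list like [{"label": ..., "score": ...}]; the label names
    depend on how the model's config was set up during training.
    """
    clf = pipeline("text-classification", model=model_id)
    return clf(text)
```

For batch scoring, construct the pipeline once outside the function and reuse it, since `pipeline(...)` reloads the model weights on every call.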
## Intended uses & limitations

No intended uses were documented. Note that a final evaluation loss of 0.0000 with accuracy 1.0 is unusually perfect; before relying on this model, verify that the evaluation set does not overlap the training data and validate on held-out, real-world samples.
## Training and evaluation data

More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
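The `linear` scheduler above decays the learning rate from `2e-05` to zero over the whole run. A minimal sketch of that decay, assuming no warmup and 9,400 total optimizer steps (inferred from the last step in the training log below); this mirrors the behavior of `transformers.get_linear_schedule_with_warmup` with `num_warmup_steps=0`:

```python
# Sketch of the linear learning-rate decay used during training.
# TOTAL_STEPS = 9_400 is inferred from the training log (10 epochs,
# ~940 optimizer steps per epoch); it is not stated explicitly in the card.
BASE_LR = 2e-5
TOTAL_STEPS = 9_400


def linear_lr(step: int, base_lr: float = BASE_LR, total_steps: int = TOTAL_STEPS) -> float:
    """Learning rate after `step` optimizer steps under linear decay, no warmup."""
    remaining = max(0.0, float(total_steps - step))
    return base_lr * remaining / total_steps


print(linear_lr(0))      # 2e-05 at the start
print(linear_lr(4_700))  # 1e-05 at the halfway point
print(linear_lr(9_400))  # 0.0 at the end
```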
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
0.3172 | 0.05 | 50 | 0.2393 | 0.91 |
0.1727 | 0.11 | 100 | 0.1355 | 0.95 |
0.1024 | 0.16 | 150 | 0.0967 | 0.97 |
0.1318 | 0.21 | 200 | 0.0372 | 0.99 |
0.1189 | 0.27 | 250 | 0.0906 | 0.97 |
0.1043 | 0.32 | 300 | 0.0070 | 1.0 |
0.0734 | 0.37 | 350 | 0.0127 | 1.0 |
0.077 | 0.43 | 400 | 0.0021 | 1.0 |
0.0955 | 0.48 | 450 | 0.0583 | 0.98 |
0.0836 | 0.53 | 500 | 0.0227 | 0.99 |
0.0428 | 0.58 | 550 | 0.0027 | 1.0 |
0.0348 | 0.64 | 600 | 0.0838 | 0.97 |
0.0752 | 0.69 | 650 | 0.0082 | 1.0 |
0.0525 | 0.74 | 700 | 0.0177 | 0.99 |
0.0758 | 0.8 | 750 | 0.0350 | 0.98 |
0.0711 | 0.85 | 800 | 0.0089 | 1.0 |
0.0429 | 0.9 | 850 | 0.1653 | 0.96 |
0.0787 | 0.96 | 900 | 0.0020 | 1.0 |
0.0717 | 1.01 | 950 | 0.0051 | 1.0 |
0.0283 | 1.06 | 1000 | 0.0014 | 1.0 |
0.0163 | 1.12 | 1050 | 0.0551 | 0.99 |
0.0635 | 1.17 | 1100 | 0.0009 | 1.0 |
0.0554 | 1.22 | 1150 | 0.0039 | 1.0 |
0.0242 | 1.28 | 1200 | 0.0021 | 1.0 |
0.0419 | 1.33 | 1250 | 0.0068 | 1.0 |
0.0138 | 1.38 | 1300 | 0.0004 | 1.0 |
0.0284 | 1.43 | 1350 | 0.0004 | 1.0 |
0.0334 | 1.49 | 1400 | 0.0006 | 1.0 |
0.0335 | 1.54 | 1450 | 0.0589 | 0.99 |
0.0294 | 1.59 | 1500 | 0.0178 | 0.99 |
0.0224 | 1.65 | 1550 | 0.0481 | 0.99 |
0.0454 | 1.7 | 1600 | 0.0018 | 1.0 |
0.0197 | 1.75 | 1650 | 0.0271 | 0.99 |
0.0375 | 1.81 | 1700 | 0.0650 | 0.99 |
0.0567 | 1.86 | 1750 | 0.0016 | 1.0 |
0.0312 | 1.91 | 1800 | 0.0589 | 0.98 |
0.0232 | 1.97 | 1850 | 0.0816 | 0.98 |
0.0285 | 2.02 | 1900 | 0.0014 | 1.0 |
0.0242 | 2.07 | 1950 | 0.0005 | 1.0 |
0.03 | 2.13 | 2000 | 0.0012 | 1.0 |
0.0142 | 2.18 | 2050 | 0.0007 | 1.0 |
0.0238 | 2.23 | 2100 | 0.0005 | 1.0 |
0.0069 | 2.28 | 2150 | 0.0001 | 1.0 |
0.0169 | 2.34 | 2200 | 0.0002 | 1.0 |
0.0136 | 2.39 | 2250 | 0.0055 | 1.0 |
0.0075 | 2.44 | 2300 | 0.0363 | 0.99 |
0.0132 | 2.5 | 2350 | 0.0005 | 1.0 |
0.0026 | 2.55 | 2400 | 0.0002 | 1.0 |
0.0001 | 2.6 | 2450 | 0.0001 | 1.0 |
0.0356 | 2.66 | 2500 | 0.0002 | 1.0 |
0.0238 | 2.71 | 2550 | 0.0006 | 1.0 |
0.0077 | 2.76 | 2600 | 0.0002 | 1.0 |
0.0153 | 2.82 | 2650 | 0.0011 | 1.0 |
0.0106 | 2.87 | 2700 | 0.0009 | 1.0 |
0.0216 | 2.92 | 2750 | 0.0001 | 1.0 |
0.0411 | 2.98 | 2800 | 0.0001 | 1.0 |
0.0186 | 3.03 | 2850 | 0.0001 | 1.0 |
0.0084 | 3.08 | 2900 | 0.0002 | 1.0 |
0.0051 | 3.13 | 2950 | 0.0003 | 1.0 |
0.0079 | 3.19 | 3000 | 0.0030 | 1.0 |
0.0065 | 3.24 | 3050 | 0.0002 | 1.0 |
0.0005 | 3.29 | 3100 | 0.0002 | 1.0 |
0.0036 | 3.35 | 3150 | 0.0000 | 1.0 |
0.014 | 3.4 | 3200 | 0.0001 | 1.0 |
0.0197 | 3.45 | 3250 | 0.0004 | 1.0 |
0.0316 | 3.51 | 3300 | 0.0000 | 1.0 |
0.015 | 3.56 | 3350 | 0.0001 | 1.0 |
0.0082 | 3.61 | 3400 | 0.0001 | 1.0 |
0.0237 | 3.67 | 3450 | 0.0003 | 1.0 |
0.0106 | 3.72 | 3500 | 0.0598 | 0.99 |
0.0006 | 3.77 | 3550 | 0.0698 | 0.99 |
0.0272 | 3.83 | 3600 | 0.0002 | 1.0 |
0.0261 | 3.88 | 3650 | 0.0002 | 1.0 |
0.0204 | 3.93 | 3700 | 0.0003 | 1.0 |
0.0002 | 3.99 | 3750 | 0.0001 | 1.0 |
0.0167 | 4.04 | 3800 | 0.0001 | 1.0 |
0.0001 | 4.09 | 3850 | 0.0001 | 1.0 |
0.0153 | 4.14 | 3900 | 0.0002 | 1.0 |
0.0002 | 4.2 | 3950 | 0.0000 | 1.0 |
0.0116 | 4.25 | 4000 | 0.0001 | 1.0 |
0.0001 | 4.3 | 4050 | 0.0000 | 1.0 |
0.0001 | 4.36 | 4100 | 0.0000 | 1.0 |
0.0002 | 4.41 | 4150 | 0.0000 | 1.0 |
0.0 | 4.46 | 4200 | 0.0000 | 1.0 |
0.0058 | 4.52 | 4250 | 0.0000 | 1.0 |
0.0003 | 4.57 | 4300 | 0.0000 | 1.0 |
0.0057 | 4.62 | 4350 | 0.0000 | 1.0 |
0.0139 | 4.68 | 4400 | 0.0298 | 0.99 |
0.0147 | 4.73 | 4450 | 0.0000 | 1.0 |
0.0074 | 4.78 | 4500 | 0.0000 | 1.0 |
0.0057 | 4.84 | 4550 | 0.0001 | 1.0 |
0.0111 | 4.89 | 4600 | 0.0000 | 1.0 |
0.0035 | 4.94 | 4650 | 0.0000 | 1.0 |
0.0003 | 4.99 | 4700 | 0.0000 | 1.0 |
0.0017 | 5.05 | 4750 | 0.0000 | 1.0 |
0.0227 | 5.1 | 4800 | 0.0000 | 1.0 |
0.0103 | 5.15 | 4850 | 0.0007 | 1.0 |
0.0123 | 5.21 | 4900 | 0.0000 | 1.0 |
0.0137 | 5.26 | 4950 | 0.0000 | 1.0 |
0.0104 | 5.31 | 5000 | 0.0001 | 1.0 |
0.0005 | 5.37 | 5050 | 0.0000 | 1.0 |
0.0248 | 5.42 | 5100 | 0.0288 | 0.99 |
0.0066 | 5.47 | 5150 | 0.0000 | 1.0 |
0.0082 | 5.53 | 5200 | 0.0000 | 1.0 |
0.0002 | 5.58 | 5250 | 0.0000 | 1.0 |
0.0128 | 5.63 | 5300 | 0.0000 | 1.0 |
0.0029 | 5.69 | 5350 | 0.0000 | 1.0 |
0.0119 | 5.74 | 5400 | 0.0000 | 1.0 |
0.0035 | 5.79 | 5450 | 0.0000 | 1.0 |
0.0002 | 5.84 | 5500 | 0.0000 | 1.0 |
0.0002 | 5.9 | 5550 | 0.0000 | 1.0 |
0.0 | 5.95 | 5600 | 0.0000 | 1.0 |
0.0036 | 6.0 | 5650 | 0.0000 | 1.0 |
0.0 | 6.06 | 5700 | 0.0000 | 1.0 |
0.0018 | 6.11 | 5750 | 0.0000 | 1.0 |
0.0082 | 6.16 | 5800 | 0.0000 | 1.0 |
0.0095 | 6.22 | 5850 | 0.0000 | 1.0 |
0.0135 | 6.27 | 5900 | 0.0749 | 0.99 |
0.001 | 6.32 | 5950 | 0.0000 | 1.0 |
0.0004 | 6.38 | 6000 | 0.0000 | 1.0 |
0.0 | 6.43 | 6050 | 0.0000 | 1.0 |
0.0046 | 6.48 | 6100 | 0.0000 | 1.0 |
0.0039 | 6.54 | 6150 | 0.0000 | 1.0 |
0.0028 | 6.59 | 6200 | 0.0000 | 1.0 |
0.0 | 6.64 | 6250 | 0.0000 | 1.0 |
0.0115 | 6.7 | 6300 | 0.0000 | 1.0 |
0.0047 | 6.75 | 6350 | 0.0000 | 1.0 |
0.0 | 6.8 | 6400 | 0.0000 | 1.0 |
0.0001 | 6.85 | 6450 | 0.0000 | 1.0 |
0.009 | 6.91 | 6500 | 0.0000 | 1.0 |
0.0005 | 6.96 | 6550 | 0.0001 | 1.0 |
0.0089 | 7.01 | 6600 | 0.0000 | 1.0 |
0.0002 | 7.07 | 6650 | 0.0003 | 1.0 |
0.0066 | 7.12 | 6700 | 0.0000 | 1.0 |
0.0002 | 7.17 | 6750 | 0.0000 | 1.0 |
0.0001 | 7.23 | 6800 | 0.0000 | 1.0 |
0.0001 | 7.28 | 6850 | 0.0000 | 1.0 |
0.0105 | 7.33 | 6900 | 0.0000 | 1.0 |
0.0056 | 7.39 | 6950 | 0.0000 | 1.0 |
0.0 | 7.44 | 7000 | 0.0000 | 1.0 |
0.0119 | 7.49 | 7050 | 0.0000 | 1.0 |
0.0 | 7.55 | 7100 | 0.0000 | 1.0 |
0.0051 | 7.6 | 7150 | 0.0000 | 1.0 |
0.0001 | 7.65 | 7200 | 0.0000 | 1.0 |
0.0 | 7.7 | 7250 | 0.0000 | 1.0 |
0.0 | 7.76 | 7300 | 0.0000 | 1.0 |
0.0034 | 7.81 | 7350 | 0.0000 | 1.0 |
0.0015 | 7.86 | 7400 | 0.0000 | 1.0 |
0.0 | 7.92 | 7450 | 0.0000 | 1.0 |
0.0018 | 7.97 | 7500 | 0.0000 | 1.0 |
0.0004 | 8.02 | 7550 | 0.0000 | 1.0 |
0.0004 | 8.08 | 7600 | 0.0000 | 1.0 |
0.0004 | 8.13 | 7650 | 0.0000 | 1.0 |
0.0002 | 8.18 | 7700 | 0.0000 | 1.0 |
0.0012 | 8.24 | 7750 | 0.0000 | 1.0 |
0.0009 | 8.29 | 7800 | 0.0000 | 1.0 |
0.0003 | 8.34 | 7850 | 0.0000 | 1.0 |
0.0044 | 8.4 | 7900 | 0.0000 | 1.0 |
0.0 | 8.45 | 7950 | 0.0000 | 1.0 |
0.0 | 8.5 | 8000 | 0.0000 | 1.0 |
0.0 | 8.55 | 8050 | 0.0000 | 1.0 |
0.0 | 8.61 | 8100 | 0.0000 | 1.0 |
0.0023 | 8.66 | 8150 | 0.0000 | 1.0 |
0.0 | 8.71 | 8200 | 0.0000 | 1.0 |
0.0006 | 8.77 | 8250 | 0.0000 | 1.0 |
0.0049 | 8.82 | 8300 | 0.0000 | 1.0 |
0.0012 | 8.87 | 8350 | 0.0000 | 1.0 |
0.0036 | 8.93 | 8400 | 0.0000 | 1.0 |
0.0004 | 8.98 | 8450 | 0.0000 | 1.0 |
0.0002 | 9.03 | 8500 | 0.0000 | 1.0 |
0.0003 | 9.09 | 8550 | 0.0000 | 1.0 |
0.0 | 9.14 | 8600 | 0.0000 | 1.0 |
0.0 | 9.19 | 8650 | 0.0000 | 1.0 |
0.0038 | 9.25 | 8700 | 0.0000 | 1.0 |
0.0001 | 9.3 | 8750 | 0.0000 | 1.0 |
0.0002 | 9.35 | 8800 | 0.0000 | 1.0 |
0.0 | 9.4 | 8850 | 0.0000 | 1.0 |
0.0 | 9.46 | 8900 | 0.0000 | 1.0 |
0.0035 | 9.51 | 8950 | 0.0000 | 1.0 |
0.0031 | 9.56 | 9000 | 0.0000 | 1.0 |
0.0002 | 9.62 | 9050 | 0.0000 | 1.0 |
0.0 | 9.67 | 9100 | 0.0000 | 1.0 |
0.0 | 9.72 | 9150 | 0.0000 | 1.0 |
0.0006 | 9.78 | 9200 | 0.0000 | 1.0 |
0.0013 | 9.83 | 9250 | 0.0000 | 1.0 |
0.0026 | 9.88 | 9300 | 0.0000 | 1.0 |
0.0001 | 9.94 | 9350 | 0.0000 | 1.0 |
0.0 | 9.99 | 9400 | 0.0000 | 1.0 |
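One useful reading of the log above: validation accuracy first hits 1.0 at step 300, well inside the first epoch, after which the metrics stay near-perfect. A small sanity check over the first few rows (reproduced by hand from the table) makes this easy to verify:

```python
# First rows of the training log above, as (step, validation_loss, accuracy).
rows = [
    (50, 0.2393, 0.91),
    (100, 0.1355, 0.95),
    (150, 0.0967, 0.97),
    (200, 0.0372, 0.99),
    (250, 0.0906, 0.97),
    (300, 0.0070, 1.0),
    (350, 0.0127, 1.0),
]

# Earliest evaluation step where accuracy reached 1.0.
first_perfect = next(step for step, _, acc in rows if acc == 1.0)
print(first_perfect)  # 300
```

Convergence this fast usually means the task is easy for the model, the dataset is small, or the evaluation split leaks training examples; it is worth checking which before trusting the 1.0 figure.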
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3