# bert-base-cased-financialphrasebank
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on what appears to be the Financial PhraseBank dataset. It achieves the following results on the evaluation set:
- Loss: 0.5046
- Accuracy: 0.8660
## Model description
More information needed
## Intended uses & limitations
More information needed
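
Usage is not documented, but the base model and dataset point to sentence-level sentiment classification of financial text. Below is a minimal sketch using the `transformers` pipeline API; the repository id is a hypothetical placeholder, and the label names shown are assumptions that depend on the fine-tuning config:

```python
from transformers import pipeline

# Hypothetical repository id; substitute the actual checkpoint path.
classifier = pipeline(
    "text-classification",
    model="your-username/bert-base-cased-financialphrasebank",
)

print(classifier("Operating profit rose from EUR 8.7 mn to EUR 13.1 mn."))
# e.g. [{'label': 'positive', 'score': 0.98}] -- labels may instead appear as
# LABEL_0/LABEL_1/LABEL_2 if id2label was not set during fine-tuning.
```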
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
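
For reference, these settings map directly onto `transformers.TrainingArguments` (as of Transformers 4.21). This is a sketch, not the author's actual script: the output directory, the per-device interpretation of the batch sizes, and the 5-step evaluation cadence (visible in the results table below) are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-cased-financialphrasebank",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=32,   # assumes batch sizes are per device
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    evaluation_strategy="steps",      # results table logs every 5 steps
    eval_steps=5,
    logging_steps=5,
)
```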
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
0.9921 | 0.04 | 5 | 0.9266 | 0.6082 |
0.8989 | 0.08 | 10 | 0.8833 | 0.6082 |
0.8563 | 0.12 | 15 | 0.8287 | 0.6371 |
0.8316 | 0.16 | 20 | 0.7626 | 0.6866 |
0.808 | 0.2 | 25 | 0.7398 | 0.6845 |
0.8219 | 0.25 | 30 | 0.7234 | 0.6969 |
0.7402 | 0.29 | 35 | 0.7080 | 0.6990 |
0.749 | 0.33 | 40 | 0.7261 | 0.6948 |
0.8777 | 0.37 | 45 | 0.7253 | 0.7072 |
0.7057 | 0.41 | 50 | 0.6736 | 0.7196 |
0.6968 | 0.45 | 55 | 0.6326 | 0.7320 |
0.6264 | 0.49 | 60 | 0.6121 | 0.7443 |
0.703 | 0.53 | 65 | 0.6008 | 0.7505 |
0.6788 | 0.57 | 70 | 0.6421 | 0.7278 |
0.6458 | 0.61 | 75 | 0.5801 | 0.7546 |
0.605 | 0.66 | 80 | 0.5601 | 0.7588 |
0.5428 | 0.7 | 85 | 0.5480 | 0.7546 |
0.6025 | 0.74 | 90 | 0.5632 | 0.7402 |
0.5761 | 0.78 | 95 | 0.5053 | 0.7794 |
0.4919 | 0.82 | 100 | 0.4745 | 0.8289 |
0.5217 | 0.86 | 105 | 0.4731 | 0.8268 |
0.5517 | 0.9 | 110 | 0.4512 | 0.8474 |
0.4131 | 0.94 | 115 | 0.4280 | 0.8392 |
0.5244 | 0.98 | 120 | 0.4222 | 0.8433 |
0.6407 | 1.02 | 125 | 0.4474 | 0.8392 |
0.3661 | 1.07 | 130 | 0.4101 | 0.8474 |
0.4148 | 1.11 | 135 | 0.3810 | 0.8619 |
0.296 | 1.15 | 140 | 0.4003 | 0.8371 |
0.3579 | 1.19 | 145 | 0.3738 | 0.8598 |
0.3718 | 1.23 | 150 | 0.3635 | 0.8598 |
0.307 | 1.27 | 155 | 0.3900 | 0.8515 |
0.3055 | 1.31 | 160 | 0.3881 | 0.8454 |
0.2789 | 1.35 | 165 | 0.3679 | 0.8495 |
0.2541 | 1.39 | 170 | 0.3709 | 0.8557 |
0.3263 | 1.43 | 175 | 0.3815 | 0.8557 |
0.305 | 1.48 | 180 | 0.3757 | 0.8495 |
0.2215 | 1.52 | 185 | 0.3643 | 0.8639 |
0.3642 | 1.56 | 190 | 0.3887 | 0.8495 |
0.4179 | 1.6 | 195 | 0.3736 | 0.8536 |
0.3126 | 1.64 | 200 | 0.3928 | 0.8392 |
0.2625 | 1.68 | 205 | 0.3648 | 0.8577 |
0.2155 | 1.72 | 210 | 0.3612 | 0.8577 |
0.2526 | 1.76 | 215 | 0.3781 | 0.8433 |
0.3033 | 1.8 | 220 | 0.4334 | 0.8412 |
0.2752 | 1.84 | 225 | 0.4039 | 0.8392 |
0.3466 | 1.89 | 230 | 0.3800 | 0.8412 |
0.4447 | 1.93 | 235 | 0.3575 | 0.8412 |
0.257 | 1.97 | 240 | 0.3732 | 0.8515 |
0.2647 | 2.01 | 245 | 0.3542 | 0.8454 |
0.1795 | 2.05 | 250 | 0.3599 | 0.8495 |
0.1697 | 2.09 | 255 | 0.3694 | 0.8515 |
0.1907 | 2.13 | 260 | 0.3985 | 0.8495 |
0.2515 | 2.17 | 265 | 0.3753 | 0.8660 |
0.1921 | 2.21 | 270 | 0.3674 | 0.8598 |
0.1342 | 2.25 | 275 | 0.3712 | 0.8577 |
0.1424 | 2.3 | 280 | 0.3595 | 0.8680 |
0.2071 | 2.34 | 285 | 0.3650 | 0.8680 |
0.2224 | 2.38 | 290 | 0.3881 | 0.8557 |
0.2556 | 2.42 | 295 | 0.3598 | 0.8536 |
0.1518 | 2.46 | 300 | 0.3506 | 0.8619 |
0.1427 | 2.5 | 305 | 0.3621 | 0.8639 |
0.1599 | 2.54 | 310 | 0.4495 | 0.8371 |
0.163 | 2.58 | 315 | 0.4812 | 0.8124 |
0.1462 | 2.62 | 320 | 0.4012 | 0.8680 |
0.2213 | 2.66 | 325 | 0.4047 | 0.8557 |
0.1967 | 2.7 | 330 | 0.4008 | 0.8701 |
0.2305 | 2.75 | 335 | 0.3838 | 0.8722 |
0.1097 | 2.79 | 340 | 0.3743 | 0.8577 |
0.1521 | 2.83 | 345 | 0.3700 | 0.8701 |
0.0833 | 2.87 | 350 | 0.3831 | 0.8680 |
0.2099 | 2.91 | 355 | 0.3947 | 0.8722 |
0.2086 | 2.95 | 360 | 0.4184 | 0.8515 |
0.1294 | 2.99 | 365 | 0.4214 | 0.8536 |
0.0992 | 3.03 | 370 | 0.4087 | 0.8639 |
0.0973 | 3.07 | 375 | 0.4073 | 0.8557 |
0.0519 | 3.11 | 380 | 0.4085 | 0.8639 |
0.1418 | 3.16 | 385 | 0.4109 | 0.8639 |
0.081 | 3.2 | 390 | 0.4426 | 0.8619 |
0.0506 | 3.24 | 395 | 0.4733 | 0.8454 |
0.0962 | 3.28 | 400 | 0.5100 | 0.8412 |
0.1045 | 3.32 | 405 | 0.4408 | 0.8639 |
0.1236 | 3.36 | 410 | 0.4395 | 0.8619 |
0.0962 | 3.4 | 415 | 0.4515 | 0.8557 |
0.1611 | 3.44 | 420 | 0.4819 | 0.8474 |
0.1044 | 3.48 | 425 | 0.4336 | 0.8598 |
0.0715 | 3.52 | 430 | 0.4574 | 0.8639 |
0.0302 | 3.57 | 435 | 0.4617 | 0.8619 |
0.1234 | 3.61 | 440 | 0.4710 | 0.8577 |
0.1439 | 3.65 | 445 | 0.5082 | 0.8433 |
0.1113 | 3.69 | 450 | 0.4596 | 0.8639 |
0.1336 | 3.73 | 455 | 0.4631 | 0.8598 |
0.072 | 3.77 | 460 | 0.4372 | 0.8660 |
0.0621 | 3.81 | 465 | 0.4381 | 0.8619 |
0.1703 | 3.85 | 470 | 0.4337 | 0.8680 |
0.0284 | 3.89 | 475 | 0.4321 | 0.8742 |
0.0647 | 3.93 | 480 | 0.4575 | 0.8680 |
0.0897 | 3.98 | 485 | 0.4676 | 0.8680 |
0.0646 | 4.02 | 490 | 0.4550 | 0.8722 |
0.0554 | 4.06 | 495 | 0.4721 | 0.8598 |
0.0259 | 4.1 | 500 | 0.5045 | 0.8454 |
0.0186 | 4.14 | 505 | 0.4635 | 0.8577 |
0.0391 | 4.18 | 510 | 0.5362 | 0.8639 |
0.0505 | 4.22 | 515 | 0.5122 | 0.8722 |
0.1101 | 4.26 | 520 | 0.5046 | 0.8660 |
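
The reported evaluation metrics (loss 0.5046, accuracy 0.8660) match the final logged step (520, epoch 4.26), so training evidently stopped well short of the configured 10 epochs. How accuracy was computed is not documented either; a plausible `compute_metrics` hook for the `Trainer`, assuming the standard `datasets` accuracy metric available in Datasets 2.4.0:

```python
import numpy as np
from datasets import load_metric

accuracy = load_metric("accuracy")  # assumed; the actual metric code is undocumented

def compute_metrics(eval_pred):
    # Trainer passes (logits, labels); accuracy is taken over argmax predictions.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=preds, references=labels)
```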
### Framework versions
- Transformers 4.21.1
- Pytorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1