# whisper_charsplit_new_round3__0063
This model is a fine-tuned version of [bigmorning/whisper_charsplit_new_round2__0061](https://huggingface.co/bigmorning/whisper_charsplit_new_round2__0061) on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 0.0002
- Train Accuracy: 0.0795
- Train Wermet: 7.4915
- Validation Loss: 0.5613
- Validation Accuracy: 0.0772
- Validation Wermet: 6.6315
- Epoch: 62
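The card does not include a usage example. Since the framework versions below list TensorFlow, the checkpoint can presumably be loaded through the TF Whisper classes in Transformers. The following is a minimal, illustrative sketch, assuming the repository ships the standard processor files alongside the TF weights; the silent `audio` array is a placeholder for real 16 kHz input.

```python
# Minimal usage sketch (assumes the repo contains processor files and TF weights).
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

model_id = "bigmorning/whisper_charsplit_new_round3__0063"
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

# Placeholder input: one second of silence at 16 kHz; replace with real audio.
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")

# Generate token IDs from the log-mel features and decode them to text.
generated_ids = model.generate(inputs.input_features)
transcription = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(transcription)
```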
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
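The optimizer dictionary above matches the constructor arguments of the TF `AdamWeightDecay` class in Transformers. As an illustration only (the actual training script is not part of this card), the listed configuration could be reconstructed like this:

```python
# Illustrative reconstruction of the listed optimizer config; the actual
# training script is not included in this card.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)
```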
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0009 | 0.0795 | 7.9492 | 0.5730 | 0.0769 | 7.2856 | 0 |
| 0.0015 | 0.0795 | 8.4221 | 0.5756 | 0.0769 | 7.1487 | 1 |
| 0.0012 | 0.0795 | 7.8476 | 0.5699 | 0.0769 | 6.5976 | 2 |
| 0.0010 | 0.0795 | 7.6843 | 0.5740 | 0.0769 | 6.9513 | 3 |
| 0.0014 | 0.0795 | 8.0796 | 0.5763 | 0.0768 | 7.4043 | 4 |
| 0.0019 | 0.0795 | 7.7274 | 0.5724 | 0.0769 | 6.4922 | 5 |
| 0.0008 | 0.0795 | 7.3468 | 0.5734 | 0.0769 | 6.1909 | 6 |
| 0.0009 | 0.0795 | 7.2393 | 0.5816 | 0.0769 | 6.5734 | 7 |
| 0.0010 | 0.0795 | 7.5822 | 0.5755 | 0.0769 | 6.6613 | 8 |
| 0.0004 | 0.0795 | 7.3807 | 0.5698 | 0.0770 | 7.0671 | 9 |
| 0.0001 | 0.0795 | 7.7157 | 0.5681 | 0.0771 | 6.8391 | 10 |
| 0.0001 | 0.0795 | 7.7540 | 0.5725 | 0.0771 | 6.9281 | 11 |
| 0.0001 | 0.0795 | 7.7721 | 0.5726 | 0.0771 | 6.8911 | 12 |
| 0.0000 | 0.0795 | 7.8163 | 0.5721 | 0.0771 | 6.8876 | 13 |
| 0.0000 | 0.0795 | 7.7745 | 0.5741 | 0.0771 | 6.8770 | 14 |
| 0.0000 | 0.0795 | 7.7277 | 0.5752 | 0.0771 | 6.8671 | 15 |
| 0.0000 | 0.0795 | 7.7355 | 0.5765 | 0.0771 | 6.8447 | 16 |
| 0.0000 | 0.0795 | 7.7109 | 0.5784 | 0.0771 | 6.8560 | 17 |
| 0.0000 | 0.0795 | 7.7427 | 0.5796 | 0.0771 | 6.8406 | 18 |
| 0.0003 | 0.0795 | 7.6709 | 0.6610 | 0.0762 | 7.0119 | 19 |
| 0.0115 | 0.0793 | 8.3288 | 0.5580 | 0.0769 | 7.1457 | 20 |
| 0.0013 | 0.0795 | 8.2537 | 0.5574 | 0.0770 | 6.7708 | 21 |
| 0.0004 | 0.0795 | 8.0507 | 0.5619 | 0.0770 | 7.0678 | 22 |
| 0.0003 | 0.0795 | 8.0534 | 0.5593 | 0.0771 | 7.0433 | 23 |
| 0.0002 | 0.0795 | 8.1738 | 0.5604 | 0.0771 | 7.1617 | 24 |
| 0.0001 | 0.0795 | 8.1494 | 0.5589 | 0.0771 | 7.1609 | 25 |
| 0.0000 | 0.0795 | 8.2151 | 0.5614 | 0.0771 | 7.1972 | 26 |
| 0.0000 | 0.0795 | 8.2332 | 0.5633 | 0.0771 | 7.1736 | 27 |
| 0.0000 | 0.0795 | 8.2573 | 0.5648 | 0.0771 | 7.2086 | 28 |
| 0.0000 | 0.0795 | 8.2571 | 0.5667 | 0.0771 | 7.1787 | 29 |
| 0.0000 | 0.0795 | 8.2607 | 0.5689 | 0.0771 | 7.2107 | 30 |
| 0.0000 | 0.0795 | 8.2992 | 0.5700 | 0.0772 | 7.2006 | 31 |
| 0.0000 | 0.0795 | 8.3059 | 0.5721 | 0.0772 | 7.2341 | 32 |
| 0.0000 | 0.0795 | 8.2872 | 0.5744 | 0.0772 | 7.2069 | 33 |
| 0.0080 | 0.0794 | 8.3693 | 0.5947 | 0.0762 | 7.3034 | 34 |
| 0.0063 | 0.0794 | 8.2517 | 0.5491 | 0.0769 | 7.1324 | 35 |
| 0.0008 | 0.0795 | 7.9115 | 0.5447 | 0.0771 | 6.9422 | 36 |
| 0.0002 | 0.0795 | 7.6265 | 0.5471 | 0.0771 | 6.8107 | 37 |
| 0.0001 | 0.0795 | 7.6685 | 0.5493 | 0.0771 | 6.6914 | 38 |
| 0.0001 | 0.0795 | 7.6100 | 0.5515 | 0.0771 | 6.7738 | 39 |
| 0.0000 | 0.0795 | 7.6623 | 0.5535 | 0.0771 | 6.7829 | 40 |
| 0.0000 | 0.0795 | 7.6768 | 0.5556 | 0.0771 | 6.8287 | 41 |
| 0.0000 | 0.0795 | 7.7199 | 0.5578 | 0.0772 | 6.8398 | 42 |
| 0.0000 | 0.0795 | 7.7423 | 0.5600 | 0.0772 | 6.8518 | 43 |
| 0.0000 | 0.0795 | 7.7561 | 0.5617 | 0.0772 | 6.8898 | 44 |
| 0.0000 | 0.0795 | 7.7766 | 0.5639 | 0.0772 | 6.8982 | 45 |
| 0.0000 | 0.0795 | 7.7962 | 0.5659 | 0.0772 | 6.9091 | 46 |
| 0.0000 | 0.0795 | 7.8106 | 0.5680 | 0.0772 | 6.9293 | 47 |
| 0.0000 | 0.0795 | 7.8387 | 0.5701 | 0.0772 | 6.9401 | 48 |
| 0.0000 | 0.0795 | 7.8480 | 0.5724 | 0.0772 | 6.9544 | 49 |
| 0.0000 | 0.0795 | 7.8755 | 0.5744 | 0.0772 | 6.9767 | 50 |
| 0.0000 | 0.0795 | 7.8924 | 0.5770 | 0.0772 | 6.9928 | 51 |
| 0.0000 | 0.0795 | 7.9169 | 0.5794 | 0.0772 | 7.0149 | 52 |
| 0.0000 | 0.0795 | 7.9400 | 0.5822 | 0.0772 | 7.0438 | 53 |
| 0.0000 | 0.0795 | 7.9697 | 0.5846 | 0.0772 | 7.0785 | 54 |
| 0.0000 | 0.0795 | 8.0061 | 0.5875 | 0.0772 | 7.0840 | 55 |
| 0.0000 | 0.0795 | 8.0364 | 0.5907 | 0.0772 | 7.0683 | 56 |
| 0.0113 | 0.0793 | 7.8674 | 0.5714 | 0.0768 | 6.0540 | 57 |
| 0.0030 | 0.0795 | 7.4853 | 0.5586 | 0.0770 | 6.6707 | 58 |
| 0.0009 | 0.0795 | 7.4969 | 0.5584 | 0.0771 | 6.7292 | 59 |
| 0.0004 | 0.0795 | 7.6676 | 0.5577 | 0.0771 | 6.7898 | 60 |
| 0.0002 | 0.0795 | 7.5238 | 0.5561 | 0.0772 | 6.6962 | 61 |
| 0.0002 | 0.0795 | 7.4915 | 0.5613 | 0.0772 | 6.6315 | 62 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3