# whisper_char_cv12_pad_lob100_low_sup__0060

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0117
- Train Accuracy: 0.1114
- Train Wermet: 4.3648
- Validation Loss: 0.4241
- Validation Accuracy: 0.0636
- Validation Wermet: 10.1904
- Epoch: 59
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
2.5942 | 0.0399 | 3.6402 | 1.9371 | 0.0319 | 16.1531 | 0 |
1.8766 | 0.0532 | 6.8384 | 1.7437 | 0.0343 | 15.0408 | 1 |
1.7251 | 0.0570 | 5.9150 | 1.6630 | 0.0358 | 10.5002 | 2 |
1.6457 | 0.0591 | 5.1153 | 1.5993 | 0.0369 | 10.4737 | 3 |
1.5935 | 0.0604 | 4.8231 | 1.5582 | 0.0375 | 8.5794 | 4 |
1.5526 | 0.0615 | 4.1987 | 1.5103 | 0.0385 | 9.4130 | 5 |
1.5165 | 0.0625 | 4.0179 | 1.4812 | 0.0391 | 6.6025 | 6 |
1.4868 | 0.0633 | 3.6770 | 1.4465 | 0.0399 | 6.7562 | 7 |
1.4565 | 0.0642 | 3.3851 | 1.4326 | 0.0402 | 6.3327 | 8 |
1.4271 | 0.0650 | 3.2883 | 1.3788 | 0.0413 | 6.5933 | 9 |
1.3965 | 0.0659 | 3.0822 | 1.3558 | 0.0415 | 5.7852 | 10 |
1.3541 | 0.0671 | 2.8659 | 1.2958 | 0.0429 | 5.2978 | 11 |
1.3066 | 0.0684 | 2.4942 | 1.2323 | 0.0440 | 4.9600 | 12 |
1.2401 | 0.0703 | 2.0745 | 1.1430 | 0.0456 | 3.6837 | 13 |
1.1549 | 0.0728 | 1.6202 | 1.0353 | 0.0478 | 2.9217 | 14 |
1.0653 | 0.0755 | 1.3041 | 0.9650 | 0.0492 | 2.0673 | 15 |
0.9765 | 0.0783 | 1.0922 | 0.8766 | 0.0510 | 2.7441 | 16 |
0.8977 | 0.0808 | 1.2561 | 0.8053 | 0.0524 | 3.6015 | 17 |
0.8246 | 0.0831 | 1.2955 | 0.7391 | 0.0537 | 3.2922 | 18 |
0.7591 | 0.0852 | 1.3109 | 0.7221 | 0.0541 | 3.6946 | 19 |
0.6988 | 0.0872 | 1.3303 | 0.6366 | 0.0559 | 3.8377 | 20 |
0.6424 | 0.0891 | 1.3256 | 0.5883 | 0.0569 | 4.1079 | 21 |
0.5925 | 0.0908 | 1.3637 | 0.5649 | 0.0575 | 3.7297 | 22 |
0.5405 | 0.0925 | 1.3142 | 0.5193 | 0.0584 | 3.5121 | 23 |
0.4929 | 0.0942 | 1.3157 | 0.4836 | 0.0591 | 4.8017 | 24 |
0.4523 | 0.0956 | 1.4635 | 0.4542 | 0.0598 | 4.5538 | 25 |
0.4116 | 0.0971 | 1.5118 | 0.4377 | 0.0602 | 4.9221 | 26 |
0.3759 | 0.0984 | 1.6392 | 0.4101 | 0.0608 | 5.6152 | 27 |
0.3446 | 0.0994 | 1.7744 | 0.3890 | 0.0613 | 7.0303 | 28 |
0.3176 | 0.1004 | 2.1998 | 0.3751 | 0.0616 | 8.1772 | 29 |
0.2945 | 0.1012 | 2.5525 | 0.3598 | 0.0619 | 8.2165 | 30 |
0.2739 | 0.1019 | 2.7708 | 0.3425 | 0.0623 | 9.8904 | 31 |
0.2553 | 0.1026 | 3.0620 | 0.3336 | 0.0625 | 9.8263 | 32 |
0.2380 | 0.1032 | 3.3150 | 0.3248 | 0.0627 | 10.1323 | 33 |
0.2225 | 0.1037 | 3.4188 | 0.3186 | 0.0629 | 9.8005 | 34 |
0.2074 | 0.1043 | 3.4245 | 0.3194 | 0.0629 | 10.0836 | 35 |
0.1921 | 0.1048 | 3.5998 | 0.3096 | 0.0631 | 10.9020 | 36 |
0.1795 | 0.1053 | 3.7938 | 0.3075 | 0.0632 | 11.1284 | 37 |
0.1671 | 0.1057 | 3.7413 | 0.3038 | 0.0633 | 10.9362 | 38 |
0.1546 | 0.1061 | 3.7830 | 0.3024 | 0.0634 | 10.7771 | 39 |
0.1432 | 0.1066 | 3.6808 | 0.3035 | 0.0635 | 11.4689 | 40 |
0.1319 | 0.1070 | 3.7824 | 0.3027 | 0.0635 | 10.9949 | 41 |
0.1211 | 0.1074 | 3.9301 | 0.3060 | 0.0636 | 10.8937 | 42 |
0.1113 | 0.1077 | 3.8509 | 0.3060 | 0.0636 | 10.7188 | 43 |
0.1012 | 0.1081 | 3.8780 | 0.3104 | 0.0636 | 10.6993 | 44 |
0.0922 | 0.1085 | 3.6982 | 0.3123 | 0.0637 | 10.6308 | 45 |
0.0827 | 0.1088 | 3.7227 | 0.3185 | 0.0637 | 10.8392 | 46 |
0.0741 | 0.1092 | 3.7235 | 0.3222 | 0.0637 | 10.2774 | 47 |
0.0665 | 0.1095 | 3.7106 | 0.3314 | 0.0637 | 9.5736 | 48 |
0.0589 | 0.1098 | 3.6104 | 0.3393 | 0.0636 | 9.9114 | 49 |
0.0515 | 0.1100 | 3.6150 | 0.3431 | 0.0637 | 10.1000 | 50 |
0.0453 | 0.1103 | 3.6760 | 0.3542 | 0.0636 | 9.4499 | 51 |
0.0389 | 0.1105 | 3.7376 | 0.3607 | 0.0636 | 9.6629 | 52 |
0.0335 | 0.1107 | 3.7707 | 0.3692 | 0.0637 | 9.5104 | 53 |
0.0283 | 0.1109 | 3.7655 | 0.3771 | 0.0636 | 9.6379 | 54 |
0.0246 | 0.1110 | 3.9511 | 0.3898 | 0.0636 | 9.7582 | 55 |
0.0211 | 0.1111 | 3.9487 | 0.3960 | 0.0636 | 10.0651 | 56 |
0.0191 | 0.1112 | 4.0695 | 0.4041 | 0.0636 | 9.1873 | 57 |
0.0150 | 0.1113 | 4.2329 | 0.4158 | 0.0636 | 10.5777 | 58 |
0.0117 | 0.1114 | 4.3648 | 0.4241 | 0.0636 | 10.1904 | 59 |
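The Wermet columns report a word-error-rate-style metric. For reference, a plain WER can be computed as the word-level Levenshtein edit distance divided by the reference length; this is a minimal sketch for illustration, not necessarily the exact normalization used for the numbers above.

```python
def wer(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution or match
            )
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

print(wer("the cat sat on the mat", "the cat sat on mat"))  # 1 deletion / 6 words
```

Values above 1.0, as in the table, indicate hypotheses with many more errors than reference words, which can happen early in training or when decoding degenerates.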
### Framework versions
- Transformers 4.33.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3