# whisper_char_cv12_pad_lob100_low_sup__0115
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 0.0019
- Train Accuracy: 0.1115
- Train Wermet: 5.8174
- Validation Loss: 0.5875
- Validation Accuracy: 0.0637
- Validation Wermet: 12.3093
- Epoch: 114
## Model description

More information needed
## Intended uses & limitations

More information needed
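No usage details are documented for this checkpoint. As a hedged sketch, a Whisper fine-tune trained with TensorFlow can typically be loaded through the TF Whisper classes in 🤗 Transformers. The snippet below loads the base `openai/whisper-tiny` model named in this card; the hub id of this particular fine-tune is not stated, so substitute it if and where it is published:

```python
# Sketch: loading a Whisper checkpoint with the TensorFlow classes in Transformers.
# "openai/whisper-tiny" is the base model named in this card; the hub id of this
# fine-tune is not given, so it is NOT filled in here.
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = TFWhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")

# Typical inference pattern: convert audio to log-mel features, generate, decode.
# input_features = processor(audio_array, sampling_rate=16000,
#                            return_tensors="tf").input_features
# predicted_ids = model.generate(input_features)
# text = processor.batch_decode(predicted_ids, skip_special_tokens=True)
```

Since the card reports a character-level metric (`Wermet`) and the repo name mentions `char`, the tokenizer behaviour of the actual fine-tune may differ from the base processor shown here.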
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
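The optimizer dictionary above matches the `AdamWeightDecay` optimizer that 🤗 Transformers provides for TensorFlow/Keras training. A minimal sketch of reconstructing it from those hyperparameters (assuming `transformers` with TensorFlow support is installed):

```python
# Sketch: rebuild the optimizer from the hyperparameter dict listed above.
# AdamWeightDecay is the TF/Keras optimizer shipped with Hugging Face Transformers;
# it applies decoupled weight decay on top of Adam.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,     # 'learning_rate' from the config above
    weight_decay_rate=0.01,  # 'weight_decay_rate' from the config above
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
```

Passing this object to `model.compile(optimizer=optimizer)` attaches it to the Keras training loop, which is how TF fine-tuning scripts in Transformers normally wire it up.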
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
2.5942 | 0.0399 | 3.6402 | 1.9371 | 0.0319 | 16.1531 | 0 |
1.8766 | 0.0532 | 6.8384 | 1.7437 | 0.0343 | 15.0408 | 1 |
1.7251 | 0.0570 | 5.9150 | 1.6630 | 0.0358 | 10.5002 | 2 |
1.6457 | 0.0591 | 5.1153 | 1.5993 | 0.0369 | 10.4737 | 3 |
1.5935 | 0.0604 | 4.8231 | 1.5582 | 0.0375 | 8.5794 | 4 |
1.5526 | 0.0615 | 4.1987 | 1.5103 | 0.0385 | 9.4130 | 5 |
1.5165 | 0.0625 | 4.0179 | 1.4812 | 0.0391 | 6.6025 | 6 |
1.4868 | 0.0633 | 3.6770 | 1.4465 | 0.0399 | 6.7562 | 7 |
1.4565 | 0.0642 | 3.3851 | 1.4326 | 0.0402 | 6.3327 | 8 |
1.4271 | 0.0650 | 3.2883 | 1.3788 | 0.0413 | 6.5933 | 9 |
1.3965 | 0.0659 | 3.0822 | 1.3558 | 0.0415 | 5.7852 | 10 |
1.3541 | 0.0671 | 2.8659 | 1.2958 | 0.0429 | 5.2978 | 11 |
1.3066 | 0.0684 | 2.4942 | 1.2323 | 0.0440 | 4.9600 | 12 |
1.2401 | 0.0703 | 2.0745 | 1.1430 | 0.0456 | 3.6837 | 13 |
1.1549 | 0.0728 | 1.6202 | 1.0353 | 0.0478 | 2.9217 | 14 |
1.0653 | 0.0755 | 1.3041 | 0.9650 | 0.0492 | 2.0673 | 15 |
0.9765 | 0.0783 | 1.0922 | 0.8766 | 0.0510 | 2.7441 | 16 |
0.8977 | 0.0808 | 1.2561 | 0.8053 | 0.0524 | 3.6015 | 17 |
0.8246 | 0.0831 | 1.2955 | 0.7391 | 0.0537 | 3.2922 | 18 |
0.7591 | 0.0852 | 1.3109 | 0.7221 | 0.0541 | 3.6946 | 19 |
0.6988 | 0.0872 | 1.3303 | 0.6366 | 0.0559 | 3.8377 | 20 |
0.6424 | 0.0891 | 1.3256 | 0.5883 | 0.0569 | 4.1079 | 21 |
0.5925 | 0.0908 | 1.3637 | 0.5649 | 0.0575 | 3.7297 | 22 |
0.5405 | 0.0925 | 1.3142 | 0.5193 | 0.0584 | 3.5121 | 23 |
0.4929 | 0.0942 | 1.3157 | 0.4836 | 0.0591 | 4.8017 | 24 |
0.4523 | 0.0956 | 1.4635 | 0.4542 | 0.0598 | 4.5538 | 25 |
0.4116 | 0.0971 | 1.5118 | 0.4377 | 0.0602 | 4.9221 | 26 |
0.3759 | 0.0984 | 1.6392 | 0.4101 | 0.0608 | 5.6152 | 27 |
0.3446 | 0.0994 | 1.7744 | 0.3890 | 0.0613 | 7.0303 | 28 |
0.3176 | 0.1004 | 2.1998 | 0.3751 | 0.0616 | 8.1772 | 29 |
0.2945 | 0.1012 | 2.5525 | 0.3598 | 0.0619 | 8.2165 | 30 |
0.2739 | 0.1019 | 2.7708 | 0.3425 | 0.0623 | 9.8904 | 31 |
0.2553 | 0.1026 | 3.0620 | 0.3336 | 0.0625 | 9.8263 | 32 |
0.2380 | 0.1032 | 3.3150 | 0.3248 | 0.0627 | 10.1323 | 33 |
0.2225 | 0.1037 | 3.4188 | 0.3186 | 0.0629 | 9.8005 | 34 |
0.2074 | 0.1043 | 3.4245 | 0.3194 | 0.0629 | 10.0836 | 35 |
0.1921 | 0.1048 | 3.5998 | 0.3096 | 0.0631 | 10.9020 | 36 |
0.1795 | 0.1053 | 3.7938 | 0.3075 | 0.0632 | 11.1284 | 37 |
0.1671 | 0.1057 | 3.7413 | 0.3038 | 0.0633 | 10.9362 | 38 |
0.1546 | 0.1061 | 3.7830 | 0.3024 | 0.0634 | 10.7771 | 39 |
0.1432 | 0.1066 | 3.6808 | 0.3035 | 0.0635 | 11.4689 | 40 |
0.1319 | 0.1070 | 3.7824 | 0.3027 | 0.0635 | 10.9949 | 41 |
0.1211 | 0.1074 | 3.9301 | 0.3060 | 0.0636 | 10.8937 | 42 |
0.1113 | 0.1077 | 3.8509 | 0.3060 | 0.0636 | 10.7188 | 43 |
0.1012 | 0.1081 | 3.8780 | 0.3104 | 0.0636 | 10.6993 | 44 |
0.0922 | 0.1085 | 3.6982 | 0.3123 | 0.0637 | 10.6308 | 45 |
0.0827 | 0.1088 | 3.7227 | 0.3185 | 0.0637 | 10.8392 | 46 |
0.0741 | 0.1092 | 3.7235 | 0.3222 | 0.0637 | 10.2774 | 47 |
0.0665 | 0.1095 | 3.7106 | 0.3314 | 0.0637 | 9.5736 | 48 |
0.0589 | 0.1098 | 3.6104 | 0.3393 | 0.0636 | 9.9114 | 49 |
0.0515 | 0.1100 | 3.6150 | 0.3431 | 0.0637 | 10.1000 | 50 |
0.0453 | 0.1103 | 3.6760 | 0.3542 | 0.0636 | 9.4499 | 51 |
0.0389 | 0.1105 | 3.7376 | 0.3607 | 0.0636 | 9.6629 | 52 |
0.0335 | 0.1107 | 3.7707 | 0.3692 | 0.0637 | 9.5104 | 53 |
0.0283 | 0.1109 | 3.7655 | 0.3771 | 0.0636 | 9.6379 | 54 |
0.0246 | 0.1110 | 3.9511 | 0.3898 | 0.0636 | 9.7582 | 55 |
0.0211 | 0.1111 | 3.9487 | 0.3960 | 0.0636 | 10.0651 | 56 |
0.0191 | 0.1112 | 4.0695 | 0.4041 | 0.0636 | 9.1873 | 57 |
0.0150 | 0.1113 | 4.2329 | 0.4158 | 0.0636 | 10.5777 | 58 |
0.0117 | 0.1114 | 4.3648 | 0.4241 | 0.0636 | 10.1904 | 59 |
0.0096 | 0.1115 | 4.3534 | 0.4333 | 0.0636 | 10.3831 | 60 |
0.0084 | 0.1115 | 4.4131 | 0.4417 | 0.0636 | 10.2134 | 61 |
0.0072 | 0.1115 | 4.4827 | 0.4539 | 0.0636 | 10.4537 | 62 |
0.0101 | 0.1114 | 4.6105 | 0.4701 | 0.0635 | 9.2620 | 63 |
0.0114 | 0.1113 | 4.4725 | 0.4602 | 0.0637 | 11.3443 | 64 |
0.0056 | 0.1115 | 4.6820 | 0.4678 | 0.0637 | 10.8401 | 65 |
0.0035 | 0.1115 | 4.7095 | 0.4748 | 0.0637 | 10.8410 | 66 |
0.0033 | 0.1115 | 4.5291 | 0.4831 | 0.0637 | 10.3950 | 67 |
0.0029 | 0.1115 | 4.4502 | 0.4916 | 0.0637 | 10.8216 | 68 |
0.0184 | 0.1110 | 4.2753 | 0.4987 | 0.0634 | 10.2126 | 69 |
0.0091 | 0.1113 | 4.1128 | 0.4833 | 0.0638 | 10.8605 | 70 |
0.0033 | 0.1115 | 4.1755 | 0.4911 | 0.0638 | 10.4538 | 71 |
0.0026 | 0.1115 | 4.3450 | 0.5009 | 0.0637 | 10.1961 | 72 |
0.0039 | 0.1115 | 4.6335 | 0.5079 | 0.0637 | 11.0165 | 73 |
0.0030 | 0.1115 | 4.5756 | 0.5071 | 0.0637 | 9.9384 | 74 |
0.0017 | 0.1115 | 4.6589 | 0.5090 | 0.0638 | 10.8814 | 75 |
0.0012 | 0.1115 | 4.8756 | 0.5146 | 0.0638 | 10.9099 | 76 |
0.0013 | 0.1115 | 4.9431 | 0.5220 | 0.0638 | 10.5558 | 77 |
0.0136 | 0.1111 | 4.8817 | 0.5117 | 0.0637 | 10.1668 | 78 |
0.0038 | 0.1115 | 5.1236 | 0.5118 | 0.0638 | 11.3651 | 79 |
0.0017 | 0.1115 | 5.3989 | 0.5176 | 0.0638 | 11.3609 | 80 |
0.0014 | 0.1115 | 5.5658 | 0.5231 | 0.0638 | 11.5637 | 81 |
0.0008 | 0.1115 | 5.4076 | 0.5273 | 0.0638 | 11.5293 | 82 |
0.0007 | 0.1116 | 5.5166 | 0.5325 | 0.0638 | 11.6874 | 83 |
0.0007 | 0.1115 | 5.3020 | 0.5370 | 0.0638 | 11.6410 | 84 |
0.0006 | 0.1116 | 5.3834 | 0.5424 | 0.0638 | 11.4686 | 85 |
0.0005 | 0.1115 | 5.2441 | 0.5482 | 0.0638 | 11.7770 | 86 |
0.0161 | 0.1110 | 5.8611 | 0.5310 | 0.0637 | 14.1541 | 87 |
0.0043 | 0.1115 | 6.7439 | 0.5302 | 0.0638 | 13.7884 | 88 |
0.0016 | 0.1115 | 6.4034 | 0.5337 | 0.0639 | 13.2969 | 89 |
0.0009 | 0.1115 | 6.4491 | 0.5361 | 0.0639 | 13.3960 | 90 |
0.0007 | 0.1115 | 6.4412 | 0.5412 | 0.0639 | 13.6544 | 91 |
0.0005 | 0.1115 | 6.4941 | 0.5451 | 0.0639 | 13.4296 | 92 |
0.0005 | 0.1116 | 6.4763 | 0.5493 | 0.0639 | 13.9268 | 93 |
0.0005 | 0.1115 | 6.4452 | 0.5595 | 0.0638 | 12.9971 | 94 |
0.0125 | 0.1111 | 5.7381 | 0.5505 | 0.0636 | 10.6493 | 95 |
0.0066 | 0.1114 | 5.3763 | 0.5383 | 0.0639 | 10.1229 | 96 |
0.0022 | 0.1115 | 5.4800 | 0.5424 | 0.0639 | 12.3926 | 97 |
0.0013 | 0.1115 | 5.6556 | 0.5460 | 0.0639 | 11.1784 | 98 |
0.0012 | 0.1115 | 6.1793 | 0.5467 | 0.0639 | 11.4956 | 99 |
0.0006 | 0.1115 | 6.0584 | 0.5492 | 0.0640 | 12.1496 | 100 |
0.0004 | 0.1116 | 5.8904 | 0.5531 | 0.0640 | 12.1934 | 101 |
0.0003 | 0.1116 | 5.8994 | 0.5566 | 0.0640 | 12.0296 | 102 |
0.0003 | 0.1116 | 5.8099 | 0.5608 | 0.0640 | 12.1687 | 103 |
0.0003 | 0.1116 | 5.8167 | 0.5641 | 0.0640 | 11.8858 | 104 |
0.0002 | 0.1116 | 5.7524 | 0.5681 | 0.0640 | 11.8685 | 105 |
0.0002 | 0.1116 | 5.8104 | 0.5731 | 0.0639 | 11.9771 | 106 |
0.0002 | 0.1116 | 5.7022 | 0.5770 | 0.0640 | 11.8855 | 107 |
0.0002 | 0.1116 | 5.8197 | 0.5806 | 0.0640 | 11.6167 | 108 |
0.0163 | 0.1109 | 5.0213 | 0.5551 | 0.0638 | 12.7567 | 109 |
0.0047 | 0.1114 | 5.9526 | 0.5517 | 0.0640 | 12.5943 | 110 |
0.0014 | 0.1115 | 6.1876 | 0.5544 | 0.0640 | 14.2314 | 111 |
0.0009 | 0.1115 | 6.4595 | 0.5571 | 0.0640 | 13.3475 | 112 |
0.0006 | 0.1115 | 5.5795 | 0.5598 | 0.0640 | 12.5131 | 113 |
0.0019 | 0.1115 | 5.8174 | 0.5875 | 0.0637 | 12.3093 | 114 |
### Framework versions
- Transformers 4.33.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3