# whisper_syl_noforce__0060

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 0.0098
- Train Accuracy: 0.0362
- Train Wermet: 0.0020
- Validation Loss: 0.6751
- Validation Accuracy: 0.0233
- Validation Wermet: 0.2494
- Epoch: 59
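The "Wermet" values above are a word-error-rate-style metric (lower is better). For reference, a minimal sketch of standard word error rate, i.e. word-level edit distance divided by the reference length; this is illustrative and not necessarily the exact metric implementation used during this training run:

```python
# Standard WER via word-level Levenshtein distance (dynamic programming).
# Illustrative reference implementation; the "Wermet" metric reported in
# this card may differ in normalization details (an assumption).

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

print(wer("the cat sat on the mat", "the cat sat on mat"))  # 1 deletion / 6 words
```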
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
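The optimizer is Keras `AdamWeightDecay`: Adam with *decoupled* weight decay (`weight_decay_rate=0.01`), where the decay is applied directly to the weights rather than folded into the gradient as in L2 regularization. A minimal pure-Python sketch of one such update for a single scalar parameter, using the hyperparameters above; this is illustrative only, not the actual Keras implementation (which also handles exclusion lists, dtype casting, etc.):

```python
import math

# One Adam-with-decoupled-weight-decay step for a single scalar parameter.
# Hyperparameters mirror the card above; illustrative sketch only.
def adamw_step(param, grad, m, v, t,
               lr=1e-5, beta_1=0.9, beta_2=0.999, eps=1e-7,
               weight_decay_rate=0.01):
    m = beta_1 * m + (1 - beta_1) * grad          # first-moment estimate
    v = beta_2 * v + (1 - beta_2) * grad * grad   # second-moment estimate
    m_hat = m / (1 - beta_1 ** t)                 # bias correction
    v_hat = v / (1 - beta_2 ** t)
    # Decoupled weight decay: subtracted from the weights directly,
    # not added to the gradient (this is what distinguishes AdamW from
    # Adam + L2 regularization).
    param = param - lr * (m_hat / (math.sqrt(v_hat) + eps)
                          + weight_decay_rate * param)
    return param, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adamw_step(p, grad=0.5, m=m, v=v, t=1)
```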
### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.2961 | 0.0113 | 1.9043 | 3.9402 | 0.0116 | 0.9526 | 0 |
| 4.6207 | 0.0121 | 0.8740 | 3.7957 | 0.0120 | 0.9397 | 1 |
| 4.4142 | 0.0128 | 0.8473 | 3.6045 | 0.0124 | 0.8988 | 2 |
| 4.1915 | 0.0135 | 0.8361 | 3.4445 | 0.0128 | 0.9019 | 3 |
| 4.0072 | 0.0140 | 0.8260 | 3.3268 | 0.0131 | 0.8816 | 4 |
| 3.8559 | 0.0145 | 0.8084 | 3.2440 | 0.0133 | 0.8592 | 5 |
| 3.7359 | 0.0149 | 0.7986 | 3.1751 | 0.0135 | 0.8598 | 6 |
| 3.6368 | 0.0152 | 0.7891 | 3.1298 | 0.0136 | 0.8398 | 7 |
| 3.5465 | 0.0154 | 0.7775 | 3.0736 | 0.0138 | 0.8606 | 8 |
| 3.4710 | 0.0157 | 0.7681 | 3.0318 | 0.0138 | 0.8455 | 9 |
| 3.3988 | 0.0159 | 0.7603 | 3.0159 | 0.0139 | 0.8770 | 10 |
| 3.3279 | 0.0162 | 0.7504 | 2.9672 | 0.0141 | 0.8241 | 11 |
| 3.2611 | 0.0164 | 0.7397 | 2.9541 | 0.0141 | 0.8676 | 12 |
| 3.1996 | 0.0167 | 0.7284 | 2.8913 | 0.0144 | 0.7990 | 13 |
| 3.1311 | 0.0169 | 0.7162 | 2.8671 | 0.0145 | 0.7934 | 14 |
| 3.0590 | 0.0172 | 0.7044 | 2.8241 | 0.0146 | 0.7907 | 15 |
| 2.9692 | 0.0177 | 0.6843 | 2.7517 | 0.0149 | 0.7645 | 16 |
| 2.8783 | 0.0181 | 0.6630 | 2.6682 | 0.0152 | 0.7263 | 17 |
| 2.7622 | 0.0187 | 0.6417 | 2.5586 | 0.0156 | 0.7220 | 18 |
| 2.6164 | 0.0194 | 0.6138 | 2.4121 | 0.0161 | 0.6909 | 19 |
| 2.4405 | 0.0203 | 0.5838 | 2.2417 | 0.0167 | 0.6527 | 20 |
| 2.2404 | 0.0213 | 0.5486 | 2.1401 | 0.0170 | 0.6662 | 21 |
| 2.0196 | 0.0225 | 0.5086 | 1.8907 | 0.0180 | 0.5774 | 22 |
| 1.7917 | 0.0237 | 0.4665 | 1.7073 | 0.0186 | 0.5446 | 23 |
| 1.5286 | 0.0253 | 0.4182 | 1.5139 | 0.0194 | 0.4919 | 24 |
| 1.2991 | 0.0267 | 0.3736 | 1.3605 | 0.0200 | 0.4570 | 25 |
| 1.1117 | 0.0279 | 0.3336 | 1.2304 | 0.0205 | 0.4262 | 26 |
| 0.9643 | 0.0289 | 0.2986 | 1.1387 | 0.0209 | 0.4040 | 27 |
| 0.8404 | 0.0298 | 0.2663 | 1.0514 | 0.0213 | 0.3776 | 28 |
| 0.7408 | 0.0305 | 0.2408 | 0.9883 | 0.0216 | 0.3596 | 29 |
| 0.6542 | 0.0311 | 0.2155 | 0.9281 | 0.0218 | 0.3418 | 30 |
| 0.5800 | 0.0316 | 0.1936 | 0.8801 | 0.0221 | 0.3269 | 31 |
| 0.5168 | 0.0321 | 0.1737 | 0.8401 | 0.0222 | 0.3168 | 32 |
| 0.4595 | 0.0326 | 0.1552 | 0.8071 | 0.0224 | 0.3077 | 33 |
| 0.4080 | 0.0330 | 0.1375 | 0.7825 | 0.0225 | 0.2994 | 34 |
| 0.3646 | 0.0333 | 0.1225 | 0.7550 | 0.0226 | 0.2887 | 35 |
| 0.3234 | 0.0337 | 0.1095 | 0.7369 | 0.0227 | 0.2847 | 36 |
| 0.2878 | 0.0340 | 0.0950 | 0.7270 | 0.0228 | 0.2796 | 37 |
| 0.2542 | 0.0343 | 0.0823 | 0.7096 | 0.0229 | 0.2728 | 38 |
| 0.2238 | 0.0346 | 0.0718 | 0.6963 | 0.0229 | 0.2697 | 39 |
| 0.1974 | 0.0348 | 0.0609 | 0.6857 | 0.0230 | 0.2669 | 40 |
| 0.1714 | 0.0351 | 0.0500 | 0.6843 | 0.0230 | 0.2663 | 41 |
| 0.1488 | 0.0353 | 0.0411 | 0.6770 | 0.0230 | 0.2630 | 42 |
| 0.1296 | 0.0355 | 0.0339 | 0.6754 | 0.0231 | 0.2612 | 43 |
| 0.1117 | 0.0356 | 0.0270 | 0.6702 | 0.0231 | 0.2585 | 44 |
| 0.0954 | 0.0358 | 0.0211 | 0.6695 | 0.0231 | 0.2574 | 45 |
| 0.0822 | 0.0359 | 0.0163 | 0.6711 | 0.0231 | 0.2572 | 46 |
| 0.0715 | 0.0360 | 0.0137 | 0.6685 | 0.0231 | 0.2583 | 47 |
| 0.0591 | 0.0361 | 0.0093 | 0.6696 | 0.0231 | 0.2590 | 48 |
| 0.0494 | 0.0361 | 0.0068 | 0.6663 | 0.0232 | 0.2609 | 49 |
| 0.0412 | 0.0362 | 0.0051 | 0.6726 | 0.0231 | 0.2577 | 50 |
| 0.0343 | 0.0362 | 0.0042 | 0.6756 | 0.0232 | 0.2609 | 51 |
| 0.0287 | 0.0362 | 0.0031 | 0.6700 | 0.0232 | 0.2549 | 52 |
| 0.0245 | 0.0362 | 0.0035 | 0.6796 | 0.0232 | 0.2639 | 53 |
| 0.0297 | 0.0362 | 0.0054 | 0.6695 | 0.0232 | 0.2557 | 54 |
| 0.0249 | 0.0362 | 0.0039 | 0.6700 | 0.0232 | 0.2554 | 55 |
| 0.0177 | 0.0362 | 0.0026 | 0.6673 | 0.0233 | 0.2504 | 56 |
| 0.0138 | 0.0362 | 0.0023 | 0.6763 | 0.0232 | 0.2526 | 57 |
| 0.0114 | 0.0362 | 0.0020 | 0.6770 | 0.0232 | 0.2509 | 58 |
| 0.0098 | 0.0362 | 0.0020 | 0.6751 | 0.0233 | 0.2494 | 59 |
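Note that validation loss bottoms out at epoch 49 (0.6663) and then drifts slightly upward while the train loss keeps falling, whereas validation Wermet reaches its minimum at the final epoch (0.2494). A small sketch of selecting those checkpoints from the log; only the relevant (epoch, validation loss, validation Wermet) rows are embedded, copied from the table above:

```python
# (epoch, validation_loss, validation_wermet) rows copied from the
# results table; only the candidates near the respective minima.
log = [
    (44, 0.6702, 0.2585),
    (45, 0.6695, 0.2574),
    (49, 0.6663, 0.2609),
    (56, 0.6673, 0.2504),
    (59, 0.6751, 0.2494),
]

best_loss_epoch = min(log, key=lambda row: row[1])[0]    # lowest validation loss
best_wermet_epoch = min(log, key=lambda row: row[2])[0]  # lowest validation Wermet
print(best_loss_epoch, best_wermet_epoch)  # → 49 59
```

Which checkpoint to prefer depends on the deployment goal: the epoch-49 checkpoint minimizes validation loss, while the final (epoch-59) checkpoint minimizes the word-error-style metric that usually matters for ASR.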
### Framework versions
- Transformers 4.33.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3