# whisper_4_with_init_sun__0100
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unspecified dataset. At the final epoch it achieves the following results on the training and evaluation sets:
- Train Loss: 0.1327
- Train Accuracy: 0.0351
- Train Wermet: 0.0150
- Validation Loss: 1.3217
- Validation Accuracy: 0.0208
- Validation Wermet: 0.3240
- Epoch: 99
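The card does not yet include a usage example, so the following is a minimal, untested sketch of transcribing audio with this checkpoint using the TensorFlow Whisper classes that match the framework versions listed below. The repository id and the 16 kHz mono input array are assumptions, not part of the original card.

```python
# Sketch only: assumes the checkpoint is published under this repo id and that
# `audio` is a 16 kHz mono float32 waveform.
import numpy as np
from transformers import TFWhisperForConditionalGeneration, WhisperProcessor

repo_id = "whisper_4_with_init_sun__0100"  # hypothetical repository id
processor = WhisperProcessor.from_pretrained(repo_id)
model = TFWhisperForConditionalGeneration.from_pretrained(repo_id)

audio = np.zeros(16_000, dtype=np.float32)  # placeholder: one second of silence
input_features = processor(audio, sampling_rate=16_000, return_tensors="tf").input_features

generated_ids = model.generate(input_features=input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```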
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
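For reference, a rough sketch of rebuilding this optimizer with the TensorFlow `AdamWeightDecay` implementation shipped in Transformers is shown below; the original training script is not part of this card, so treat the snippet as an assumption about how the configuration above maps onto that class.

```python
# Sketch: the optimizer configuration listed above, expressed with the
# AdamWeightDecay class from Transformers' TensorFlow utilities (assumed).
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)
# Training would then proceed in float32, e.g. model.compile(optimizer=optimizer).
```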
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
5.3333 | 0.0111 | 1.3132 | 3.9675 | 0.0114 | 0.9339 | 0 |
4.7131 | 0.0116 | 0.8607 | 3.9360 | 0.0114 | 0.9503 | 1 |
4.6717 | 0.0117 | 0.8449 | 3.9196 | 0.0113 | 0.9768 | 2 |
4.6474 | 0.0117 | 0.8338 | 3.9039 | 0.0114 | 0.9557 | 3 |
4.6273 | 0.0118 | 0.8243 | 3.8721 | 0.0115 | 0.9414 | 4 |
4.6101 | 0.0118 | 0.8167 | 3.8629 | 0.0116 | 0.9156 | 5 |
4.5912 | 0.0119 | 0.7985 | 3.8361 | 0.0116 | 0.8988 | 6 |
4.5645 | 0.0120 | 0.7753 | 3.8298 | 0.0116 | 0.9045 | 7 |
4.5386 | 0.0121 | 0.7558 | 3.7904 | 0.0118 | 0.8426 | 8 |
4.5075 | 0.0122 | 0.7405 | 3.7472 | 0.0119 | 0.8103 | 9 |
4.4586 | 0.0124 | 0.7255 | 3.7163 | 0.0120 | 0.8189 | 10 |
4.3978 | 0.0126 | 0.7174 | 3.6168 | 0.0122 | 0.8163 | 11 |
4.3031 | 0.0128 | 0.7107 | 3.4956 | 0.0125 | 0.7847 | 12 |
4.1606 | 0.0133 | 0.7025 | 3.3414 | 0.0128 | 0.7897 | 13 |
3.9636 | 0.0138 | 0.6991 | 3.1311 | 0.0133 | 0.7495 | 14 |
3.7290 | 0.0145 | 0.6827 | 2.8892 | 0.0139 | 0.7292 | 15 |
3.4993 | 0.0152 | 0.6643 | 2.7195 | 0.0143 | 0.7129 | 16 |
3.2810 | 0.0159 | 0.6448 | 2.5418 | 0.0148 | 0.6803 | 17 |
3.0604 | 0.0167 | 0.6182 | 2.3572 | 0.0153 | 0.6538 | 18 |
2.8748 | 0.0174 | 0.5946 | 2.2575 | 0.0156 | 0.6337 | 19 |
2.6889 | 0.0181 | 0.5699 | 2.0988 | 0.0162 | 0.6016 | 20 |
2.5493 | 0.0187 | 0.5449 | 1.9878 | 0.0166 | 0.5834 | 21 |
2.3921 | 0.0194 | 0.5207 | 1.9029 | 0.0168 | 0.5597 | 22 |
2.2491 | 0.0201 | 0.4987 | 1.8642 | 0.0169 | 0.5409 | 23 |
2.1254 | 0.0207 | 0.4766 | 1.7354 | 0.0175 | 0.5231 | 24 |
1.9980 | 0.0213 | 0.4552 | 1.6661 | 0.0178 | 0.5049 | 25 |
1.9147 | 0.0217 | 0.4382 | 1.6140 | 0.0180 | 0.4921 | 26 |
1.8008 | 0.0223 | 0.4196 | 1.5652 | 0.0182 | 0.4742 | 27 |
1.7185 | 0.0228 | 0.4028 | 1.5159 | 0.0184 | 0.4632 | 28 |
1.6401 | 0.0232 | 0.3867 | 1.4891 | 0.0185 | 0.4548 | 29 |
1.5786 | 0.0235 | 0.3728 | 1.5141 | 0.0183 | 0.4548 | 30 |
1.4950 | 0.0241 | 0.3582 | 1.4345 | 0.0188 | 0.4340 | 31 |
1.4323 | 0.0244 | 0.3448 | 1.3694 | 0.0191 | 0.4226 | 32 |
1.3495 | 0.0250 | 0.3319 | 1.3780 | 0.0190 | 0.4172 | 33 |
1.3007 | 0.0253 | 0.3187 | 1.3296 | 0.0193 | 0.4109 | 34 |
1.2320 | 0.0257 | 0.3074 | 1.3116 | 0.0194 | 0.4029 | 35 |
1.1836 | 0.0261 | 0.2958 | 1.3025 | 0.0195 | 0.3992 | 36 |
1.1131 | 0.0266 | 0.2842 | 1.2885 | 0.0195 | 0.3894 | 37 |
1.0630 | 0.0269 | 0.2730 | 1.2627 | 0.0197 | 0.3850 | 38 |
1.0189 | 0.0272 | 0.2628 | 1.2633 | 0.0197 | 0.3822 | 39 |
1.0025 | 0.0273 | 0.2550 | 1.2561 | 0.0197 | 0.3760 | 40 |
0.9498 | 0.0277 | 0.2445 | 1.2288 | 0.0199 | 0.3710 | 41 |
0.9027 | 0.0281 | 0.2337 | 1.2188 | 0.0199 | 0.3684 | 42 |
0.8469 | 0.0286 | 0.2240 | 1.2072 | 0.0200 | 0.3637 | 43 |
0.8056 | 0.0289 | 0.2153 | 1.2046 | 0.0201 | 0.3599 | 44 |
0.7761 | 0.0291 | 0.2070 | 1.1989 | 0.0201 | 0.3579 | 45 |
0.7369 | 0.0295 | 0.1982 | 1.1938 | 0.0202 | 0.3528 | 46 |
0.7026 | 0.0298 | 0.1902 | 1.1934 | 0.0202 | 0.3508 | 47 |
0.6976 | 0.0298 | 0.1834 | 1.1803 | 0.0203 | 0.3469 | 48 |
0.6880 | 0.0298 | 0.1765 | 1.1844 | 0.0203 | 0.3470 | 49 |
0.6674 | 0.0300 | 0.1702 | 1.1741 | 0.0203 | 0.3446 | 50 |
0.6099 | 0.0305 | 0.1606 | 1.1753 | 0.0203 | 0.3440 | 51 |
0.5972 | 0.0306 | 0.1549 | 1.1692 | 0.0204 | 0.3401 | 52 |
0.5555 | 0.0310 | 0.1475 | 1.1744 | 0.0204 | 0.3382 | 53 |
0.5275 | 0.0313 | 0.1412 | 1.1743 | 0.0204 | 0.3384 | 54 |
0.5103 | 0.0315 | 0.1344 | 1.1720 | 0.0205 | 0.3355 | 55 |
0.5268 | 0.0313 | 0.1308 | 1.1709 | 0.0205 | 0.3343 | 56 |
0.5060 | 0.0315 | 0.1251 | 1.2090 | 0.0203 | 0.3318 | 57 |
0.4696 | 0.0318 | 0.1172 | 1.1748 | 0.0205 | 0.3321 | 58 |
0.4737 | 0.0318 | 0.1136 | 1.1764 | 0.0205 | 0.3313 | 59 |
0.4749 | 0.0318 | 0.1115 | 1.1684 | 0.0206 | 0.3289 | 60 |
0.4208 | 0.0323 | 0.1015 | 1.1704 | 0.0206 | 0.3275 | 61 |
0.3895 | 0.0326 | 0.0958 | 1.1777 | 0.0206 | 0.3286 | 62 |
0.3721 | 0.0328 | 0.0909 | 1.1754 | 0.0206 | 0.3267 | 63 |
0.4037 | 0.0324 | 0.0912 | 1.1798 | 0.0206 | 0.3272 | 64 |
0.3752 | 0.0327 | 0.0851 | 1.1840 | 0.0206 | 0.3268 | 65 |
0.3589 | 0.0329 | 0.0816 | 1.1819 | 0.0206 | 0.3258 | 66 |
0.3484 | 0.0330 | 0.0781 | 1.1825 | 0.0207 | 0.3261 | 67 |
0.3335 | 0.0332 | 0.0730 | 1.1890 | 0.0207 | 0.3269 | 68 |
0.2810 | 0.0337 | 0.0644 | 1.1991 | 0.0207 | 0.3237 | 69 |
0.2753 | 0.0338 | 0.0620 | 1.2133 | 0.0207 | 0.3251 | 70 |
0.3127 | 0.0334 | 0.0651 | 1.2154 | 0.0206 | 0.3269 | 71 |
0.2941 | 0.0336 | 0.0591 | 1.2063 | 0.0207 | 0.3245 | 72 |
0.2792 | 0.0337 | 0.0564 | 1.2073 | 0.0207 | 0.3253 | 73 |
0.2419 | 0.0341 | 0.0496 | 1.2113 | 0.0207 | 0.3247 | 74 |
0.2508 | 0.0340 | 0.0492 | 1.2158 | 0.0207 | 0.3244 | 75 |
0.2655 | 0.0339 | 0.0484 | 1.2171 | 0.0207 | 0.3240 | 76 |
0.2318 | 0.0342 | 0.0428 | 1.2267 | 0.0207 | 0.3263 | 77 |
0.2063 | 0.0345 | 0.0384 | 1.2268 | 0.0207 | 0.3263 | 78 |
0.2292 | 0.0343 | 0.0394 | 1.2309 | 0.0207 | 0.3242 | 79 |
0.1941 | 0.0346 | 0.0342 | 1.2372 | 0.0207 | 0.3264 | 80 |
0.2112 | 0.0345 | 0.0362 | 1.2365 | 0.0207 | 0.3270 | 81 |
0.1866 | 0.0347 | 0.0325 | 1.2414 | 0.0207 | 0.3244 | 82 |
0.1731 | 0.0348 | 0.0295 | 1.2579 | 0.0207 | 0.3324 | 83 |
0.1888 | 0.0346 | 0.0323 | 1.2551 | 0.0207 | 0.3284 | 84 |
0.1704 | 0.0348 | 0.0268 | 1.2662 | 0.0207 | 0.3270 | 85 |
0.1662 | 0.0348 | 0.0252 | 1.2662 | 0.0207 | 0.3259 | 86 |
0.2004 | 0.0346 | 0.0280 | 1.2666 | 0.0207 | 0.3260 | 87 |
0.1514 | 0.0350 | 0.0219 | 1.2709 | 0.0207 | 0.3265 | 88 |
0.1456 | 0.0350 | 0.0215 | 1.2787 | 0.0207 | 0.3280 | 89 |
0.1475 | 0.0351 | 0.0200 | 1.2800 | 0.0208 | 0.3270 | 90 |
0.1025 | 0.0355 | 0.0149 | 1.2893 | 0.0208 | 0.3260 | 91 |
0.1094 | 0.0355 | 0.0149 | 1.2986 | 0.0207 | 0.3272 | 92 |
0.0926 | 0.0356 | 0.0129 | 1.3074 | 0.0207 | 0.3246 | 93 |
0.1614 | 0.0350 | 0.0213 | 1.3017 | 0.0208 | 0.3251 | 94 |
0.1140 | 0.0353 | 0.0150 | 1.3106 | 0.0208 | 0.3264 | 95 |
0.1600 | 0.0348 | 0.0191 | 1.3071 | 0.0207 | 0.3277 | 96 |
0.1070 | 0.0354 | 0.0123 | 1.3157 | 0.0208 | 0.3249 | 97 |
0.0928 | 0.0356 | 0.0104 | 1.3235 | 0.0208 | 0.3281 | 98 |
0.1327 | 0.0351 | 0.0150 | 1.3217 | 0.0208 | 0.3240 | 99 |
### Framework versions
- Transformers 4.34.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3