---
tags:
- generated_from_keras_callback
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. -->

# whisper_input_decoder_no_lob__0140

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results on the evaluation set (the final training epoch, 139, in the table below):
- Train Loss: 0.0844
- Train Accuracy: 0.0355
- Train Wermet: 0.0333
- Validation Loss: 1.4643
- Validation Accuracy: 0.0208
- Validation Wermet: 0.3339
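Since the card does not yet include a usage example, below is a minimal inference sketch assuming the checkpoint is loaded with the TensorFlow Whisper classes from Transformers; the repository id, the audio input, and the generation settings are placeholders rather than values taken from this card:

```python
# Hedged usage sketch -- the repo id and audio below are hypothetical placeholders.
import tensorflow as tf
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

model_id = "whisper_input_decoder_no_lob__0140"  # replace with the full Hub repo id
processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")  # base model's processor
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

# Placeholder 1-second silent clip; use a real 16 kHz waveform in practice.
audio = tf.zeros(16000).numpy()
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")

generated_ids = model.generate(inputs.input_features, max_new_tokens=64)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```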

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameters used during training were not recorded in this card.
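Because the hyperparameter list is missing, the following is only a hedged sketch of how a Keras fine-tuning run that produces this kind of auto-generated card is typically wired up (model class, `compile`, `fit` with `PushToHubCallback`). The optimizer, learning rate, and dataset objects are illustrative assumptions, not the values used for this model:

```python
# Illustrative sketch only; none of these values come from the original run.
import tensorflow as tf
from transformers import TFWhisperForConditionalGeneration
from transformers.keras_callbacks import PushToHubCallback

model = TFWhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")

# Transformers TF models compute the loss internally when batches contain "labels",
# so compile() only needs an optimizer. The learning rate here is a placeholder.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5))

# `train_ds` and `val_ds` stand in for tf.data.Dataset objects yielding
# {"input_features": ..., "labels": ...} batches; they are not defined here.
# The callback below is what writes model cards like this one to the Hub.
# model.fit(
#     train_ds,
#     validation_data=val_ds,
#     epochs=140,
#     callbacks=[PushToHubCallback(output_dir="whisper_input_decoder_no_lob__0140")],
# )
```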

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.4122 | 0.0107 | 0.9328 | 3.9759 | 0.0114 | 0.9606 | 0 |
| 4.7176 | 0.0116 | 0.8683 | 3.9404 | 0.0114 | 0.9334 | 1 |
| 4.6750 | 0.0117 | 0.8478 | 3.9211 | 0.0115 | 0.9237 | 2 |
| 4.6511 | 0.0117 | 0.8413 | 3.8864 | 0.0115 | 0.9331 | 3 |
| 4.6294 | 0.0118 | 0.8270 | 3.8729 | 0.0115 | 0.9228 | 4 |
| 4.6134 | 0.0118 | 0.8199 | 3.8690 | 0.0114 | 0.9451 | 5 |
| 4.5980 | 0.0118 | 0.8102 | 3.8491 | 0.0115 | 0.9152 | 6 |
| 4.5759 | 0.0119 | 0.7890 | 3.8366 | 0.0116 | 0.8691 | 7 |
| 4.5518 | 0.0120 | 0.7694 | 3.8081 | 0.0116 | 0.9013 | 8 |
| 4.5219 | 0.0121 | 0.7591 | 3.7734 | 0.0118 | 0.8383 | 9 |
| 4.4761 | 0.0122 | 0.7400 | 3.7156 | 0.0120 | 0.8125 | 10 |
| 4.4139 | 0.0125 | 0.7257 | 3.6311 | 0.0121 | 0.8188 | 11 |
| 4.3113 | 0.0128 | 0.7127 | 3.5089 | 0.0124 | 0.8008 | 12 |
| 4.1608 | 0.0132 | 0.7088 | 3.3587 | 0.0127 | 0.7742 | 13 |
| 3.9595 | 0.0138 | 0.7012 | 3.1493 | 0.0132 | 0.7718 | 14 |
| 3.7188 | 0.0145 | 0.6820 | 2.8784 | 0.0139 | 0.7292 | 15 |
| 3.4775 | 0.0153 | 0.6678 | 2.6716 | 0.0144 | 0.7074 | 16 |
| 3.2575 | 0.0160 | 0.6481 | 2.4980 | 0.0149 | 0.6764 | 17 |
| 3.0615 | 0.0167 | 0.6314 | 2.3456 | 0.0153 | 0.6476 | 18 |
| 2.8715 | 0.0174 | 0.6094 | 2.2090 | 0.0158 | 0.6210 | 19 |
| 2.6930 | 0.0181 | 0.5931 | 2.0918 | 0.0162 | 0.5992 | 20 |
| 2.5383 | 0.0187 | 0.5739 | 1.9769 | 0.0166 | 0.5791 | 21 |
| 2.3952 | 0.0193 | 0.5512 | 1.9042 | 0.0168 | 0.5589 | 22 |
| 2.2427 | 0.0201 | 0.5333 | 1.8028 | 0.0172 | 0.5394 | 23 |
| 2.1236 | 0.0206 | 0.5174 | 1.7434 | 0.0174 | 0.5240 | 24 |
| 2.0315 | 0.0211 | 0.4978 | 1.6755 | 0.0177 | 0.5084 | 25 |
| 1.9066 | 0.0217 | 0.4773 | 1.6534 | 0.0178 | 0.4947 | 26 |
| 1.8279 | 0.0221 | 0.4596 | 1.5606 | 0.0182 | 0.4788 | 27 |
| 1.7325 | 0.0227 | 0.4412 | 1.5173 | 0.0184 | 0.4667 | 28 |
| 1.6416 | 0.0232 | 0.4199 | 1.4733 | 0.0186 | 0.4511 | 29 |
| 1.5702 | 0.0236 | 0.4028 | 1.4519 | 0.0187 | 0.4442 | 30 |
| 1.4787 | 0.0241 | 0.3839 | 1.4213 | 0.0188 | 0.4322 | 31 |
| 1.4238 | 0.0244 | 0.3700 | 1.3971 | 0.0190 | 0.4272 | 32 |
| 1.3561 | 0.0249 | 0.3594 | 1.3499 | 0.0192 | 0.4171 | 33 |
| 1.2828 | 0.0254 | 0.3431 | 1.3555 | 0.0192 | 0.4097 | 34 |
| 1.2318 | 0.0257 | 0.3277 | 1.3183 | 0.0194 | 0.4035 | 35 |
| 1.1668 | 0.0262 | 0.3201 | 1.3068 | 0.0195 | 0.3978 | 36 |
| 1.1571 | 0.0261 | 0.3105 | 1.2901 | 0.0195 | 0.3916 | 37 |
| 1.0812 | 0.0267 | 0.2989 | 1.2720 | 0.0197 | 0.3860 | 38 |
| 1.0134 | 0.0273 | 0.2863 | 1.2593 | 0.0197 | 0.3777 | 39 |
| 0.9986 | 0.0273 | 0.2769 | 1.2629 | 0.0198 | 0.3754 | 40 |
| 0.9322 | 0.0279 | 0.2653 | 1.2320 | 0.0199 | 0.3694 | 41 |
| 0.9021 | 0.0281 | 0.2552 | 1.2308 | 0.0200 | 0.3651 | 42 |
| 0.8583 | 0.0284 | 0.2444 | 1.2199 | 0.0200 | 0.3614 | 43 |
| 0.8101 | 0.0288 | 0.2355 | 1.2120 | 0.0200 | 0.3597 | 44 |
| 0.8045 | 0.0288 | 0.2299 | 1.2023 | 0.0201 | 0.3567 | 45 |
| 0.7823 | 0.0290 | 0.2213 | 1.2075 | 0.0201 | 0.3529 | 46 |
| 0.7186 | 0.0296 | 0.2107 | 1.1917 | 0.0202 | 0.3530 | 47 |
| 0.6949 | 0.0298 | 0.2028 | 1.1926 | 0.0202 | 0.3465 | 48 |
| 0.6669 | 0.0300 | 0.1943 | 1.1902 | 0.0203 | 0.3446 | 49 |
| 0.6125 | 0.0305 | 0.1842 | 1.1892 | 0.0203 | 0.3437 | 50 |
| 0.5926 | 0.0307 | 0.1778 | 1.2058 | 0.0203 | 0.3450 | 51 |
| 0.6055 | 0.0305 | 0.1738 | 1.1859 | 0.0203 | 0.3394 | 52 |
| 0.5828 | 0.0307 | 0.1653 | 1.1921 | 0.0203 | 0.3379 | 53 |
| 0.5507 | 0.0311 | 0.1569 | 1.1906 | 0.0204 | 0.3385 | 54 |
| 0.5050 | 0.0315 | 0.1485 | 1.1834 | 0.0205 | 0.3361 | 55 |
| 0.4878 | 0.0316 | 0.1447 | 1.1815 | 0.0205 | 0.3329 | 56 |
| 0.4825 | 0.0317 | 0.1410 | 1.2096 | 0.0204 | 0.3359 | 57 |
| 0.4987 | 0.0315 | 0.1374 | 1.2000 | 0.0204 | 0.3352 | 58 |
| 0.4576 | 0.0319 | 0.1305 | 1.1868 | 0.0205 | 0.3329 | 59 |
| 0.4185 | 0.0323 | 0.1215 | 1.2043 | 0.0205 | 0.3322 | 60 |
| 0.3889 | 0.0326 | 0.1156 | 1.1853 | 0.0206 | 0.3302 | 61 |
| 0.3790 | 0.0327 | 0.1101 | 1.2028 | 0.0205 | 0.3316 | 62 |
| 0.4072 | 0.0324 | 0.1110 | 1.2502 | 0.0203 | 0.3309 | 63 |
| 0.3519 | 0.0330 | 0.1020 | 1.1959 | 0.0206 | 0.3284 | 64 |
| 0.3861 | 0.0326 | 0.1034 | 1.1885 | 0.0206 | 0.3271 | 65 |
| 0.3789 | 0.0326 | 0.0961 | 1.1969 | 0.0206 | 0.3298 | 66 |
| 0.3233 | 0.0332 | 0.0905 | 1.1922 | 0.0207 | 0.3280 | 67 |
| 0.2956 | 0.0335 | 0.0854 | 1.2003 | 0.0207 | 0.3296 | 68 |
| 0.2666 | 0.0339 | 0.0796 | 1.2141 | 0.0207 | 0.3252 | 69 |
| 0.3181 | 0.0333 | 0.0813 | 1.2133 | 0.0207 | 0.3302 | 70 |
| 0.3032 | 0.0335 | 0.0770 | 1.2170 | 0.0207 | 0.3315 | 71 |
| 0.2746 | 0.0337 | 0.0741 | 1.2180 | 0.0207 | 0.3299 | 72 |
| 0.2549 | 0.0339 | 0.0705 | 1.2496 | 0.0206 | 0.3308 | 73 |
| 0.2529 | 0.0339 | 0.0685 | 1.2239 | 0.0207 | 0.3321 | 74 |
| 0.2427 | 0.0340 | 0.0671 | 1.2351 | 0.0207 | 0.3292 | 75 |
| 0.2166 | 0.0343 | 0.0623 | 1.2361 | 0.0207 | 0.3313 | 76 |
| 0.2030 | 0.0345 | 0.0585 | 1.2462 | 0.0207 | 0.3312 | 77 |
| 0.2126 | 0.0344 | 0.0566 | 1.2441 | 0.0207 | 0.3313 | 78 |
| 0.2166 | 0.0343 | 0.0569 | 1.2506 | 0.0207 | 0.3334 | 79 |
| 0.2088 | 0.0344 | 0.0562 | 1.2557 | 0.0207 | 0.3389 | 80 |
| 0.2212 | 0.0342 | 0.0560 | 1.2652 | 0.0207 | 0.3334 | 81 |
| 0.2256 | 0.0343 | 0.0543 | 1.2543 | 0.0207 | 0.3393 | 82 |
| 0.1915 | 0.0346 | 0.0501 | 1.2540 | 0.0207 | 0.3299 | 83 |
| 0.1544 | 0.0350 | 0.0443 | 1.2676 | 0.0207 | 0.3347 | 84 |
| 0.1567 | 0.0350 | 0.0435 | 1.2740 | 0.0207 | 0.3375 | 85 |
| 0.1329 | 0.0352 | 0.0405 | 1.2833 | 0.0207 | 0.3398 | 86 |
| 0.1261 | 0.0353 | 0.0392 | 1.3088 | 0.0206 | 0.3397 | 87 |
| 0.1547 | 0.0350 | 0.0438 | 1.2933 | 0.0207 | 0.3279 | 88 |
| 0.1288 | 0.0352 | 0.0377 | 1.2985 | 0.0208 | 0.3371 | 89 |
| 0.1400 | 0.0351 | 0.0391 | 1.3020 | 0.0208 | 0.3329 | 90 |
| 0.1077 | 0.0355 | 0.0351 | 1.3118 | 0.0207 | 0.3344 | 91 |
| 0.0982 | 0.0355 | 0.0348 | 1.3274 | 0.0207 | 0.3352 | 92 |
| 0.1041 | 0.0355 | 0.0356 | 1.3251 | 0.0208 | 0.3362 | 93 |
| 0.1573 | 0.0350 | 0.0364 | 1.3173 | 0.0208 | 0.3362 | 94 |
| 0.1478 | 0.0350 | 0.0365 | 1.3125 | 0.0208 | 0.3367 | 95 |
| 0.0929 | 0.0356 | 0.0336 | 1.3256 | 0.0208 | 0.3351 | 96 |
| 0.1106 | 0.0354 | 0.0341 | 1.3289 | 0.0208 | 0.3428 | 97 |
| 0.0916 | 0.0355 | 0.0322 | 1.3384 | 0.0208 | 0.3383 | 98 |
| 0.0708 | 0.0358 | 0.0313 | 1.3525 | 0.0208 | 0.3333 | 99 |
| 0.1271 | 0.0352 | 0.0326 | 1.3447 | 0.0208 | 0.3336 | 100 |
| 0.0963 | 0.0355 | 0.0329 | 1.3481 | 0.0208 | 0.3288 | 101 |
| 0.0698 | 0.0358 | 0.0294 | 1.3748 | 0.0207 | 0.3341 | 102 |
| 0.0844 | 0.0356 | 0.0313 | 1.3615 | 0.0208 | 0.3380 | 103 |
| 0.1045 | 0.0354 | 0.0319 | 1.3519 | 0.0208 | 0.3357 | 104 |
| 0.0877 | 0.0355 | 0.0326 | 1.3760 | 0.0207 | 0.3402 | 105 |
| 0.0713 | 0.0357 | 0.0303 | 1.3742 | 0.0208 | 0.3392 | 106 |
| 0.0686 | 0.0358 | 0.0292 | 1.3920 | 0.0207 | 0.3353 | 107 |
| 0.0466 | 0.0360 | 0.0287 | 1.3964 | 0.0208 | 0.3342 | 108 |
| 0.0353 | 0.0361 | 0.0258 | 1.3977 | 0.0208 | 0.3339 | 109 |
| 0.0369 | 0.0361 | 0.0269 | 1.4132 | 0.0208 | 0.3298 | 110 |
| 0.0601 | 0.0359 | 0.0304 | 1.4104 | 0.0208 | 0.3438 | 111 |
| 0.1244 | 0.0352 | 0.0332 | 1.3939 | 0.0207 | 0.3349 | 112 |
| 0.1102 | 0.0353 | 0.0301 | 1.3826 | 0.0208 | 0.3421 | 113 |
| 0.1199 | 0.0352 | 0.0311 | 1.3783 | 0.0208 | 0.3368 | 114 |
| 0.0576 | 0.0358 | 0.0281 | 1.3972 | 0.0208 | 0.3340 | 115 |
| 0.0363 | 0.0360 | 0.0255 | 1.3984 | 0.0208 | 0.3314 | 116 |
| 0.0260 | 0.0362 | 0.0260 | 1.4108 | 0.0208 | 0.3327 | 117 |
| 0.0291 | 0.0361 | 0.0265 | 1.4222 | 0.0208 | 0.3376 | 118 |
| 0.0252 | 0.0361 | 0.0249 | 1.4481 | 0.0208 | 0.3327 | 119 |
| 0.0855 | 0.0356 | 0.0283 | 1.4200 | 0.0208 | 0.3303 | 120 |
| 0.0947 | 0.0355 | 0.0287 | 1.4148 | 0.0208 | 0.3291 | 121 |
| 0.0561 | 0.0358 | 0.0250 | 1.4208 | 0.0208 | 0.3292 | 122 |
| 0.0675 | 0.0357 | 0.0272 | 1.4145 | 0.0208 | 0.3402 | 123 |
| 0.0340 | 0.0360 | 0.0276 | 1.4617 | 0.0207 | 0.3460 | 124 |
| 0.0213 | 0.0362 | 0.0285 | 1.4502 | 0.0208 | 0.3374 | 125 |
| 0.0369 | 0.0360 | 0.0287 | 1.4512 | 0.0208 | 0.3483 | 126 |
| 0.0717 | 0.0357 | 0.0310 | 1.4271 | 0.0208 | 0.3314 | 127 |
| 0.0434 | 0.0359 | 0.0242 | 1.4545 | 0.0208 | 0.3355 | 128 |
| 0.0541 | 0.0358 | 0.0281 | 1.4461 | 0.0208 | 0.3298 | 129 |
| 0.0281 | 0.0361 | 0.0265 | 1.4912 | 0.0207 | 0.3358 | 130 |
| 0.0438 | 0.0359 | 0.0260 | 1.4805 | 0.0208 | 0.3327 | 131 |
| 0.1140 | 0.0352 | 0.0304 | 1.4342 | 0.0208 | 0.3370 | 132 |
| 0.0611 | 0.0358 | 0.0330 | 1.4374 | 0.0209 | 0.3405 | 133 |
| 0.0319 | 0.0360 | 0.0320 | 1.4453 | 0.0209 | 0.3382 | 134 |
| 0.0208 | 0.0361 | 0.0290 | 1.4533 | 0.0209 | 0.3382 | 135 |
| 0.0242 | 0.0361 | 0.0311 | 1.4630 | 0.0209 | 0.3409 | 136 |
| 0.0185 | 0.0361 | 0.0315 | 1.4696 | 0.0209 | 0.3468 | 137 |
| 0.0416 | 0.0359 | 0.0294 | 1.4782 | 0.0209 | 0.3362 | 138 |
| 0.0844 | 0.0355 | 0.0333 | 1.4643 | 0.0208 | 0.3339 | 139 |
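The Wermet columns track a word-error-rate-style metric (lower is better). As an illustration only, and not necessarily the exact metric implementation used during this training run, a WER score can be computed with the Hugging Face Evaluate library:

```python
import evaluate  # requires: pip install evaluate jiwer

wer = evaluate.load("wer")
predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]

# One substituted word out of six reference words -> WER of about 0.167.
print(wer.compute(predictions=predictions, references=references))
```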

### Framework versions