
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. -->

# whisper_input_decoder_no_lob_with_force__0110

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results on the evaluation set (final epoch of the training log below):
- Validation Loss: 1.4232
- Validation Accuracy: 0.0206
- Validation Wermet: 0.3348

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.4192 | 0.0107 | 1.1095 | 3.9799 | 0.0114 | 0.9502 | 0 |
| 4.7193 | 0.0116 | 0.8751 | 3.9335 | 0.0114 | 0.9414 | 1 |
| 4.6743 | 0.0117 | 0.8498 | 3.9949 | 0.0112 | 0.9682 | 2 |
| 4.6508 | 0.0117 | 0.8454 | 3.8900 | 0.0114 | 0.9453 | 3 |
| 4.6308 | 0.0118 | 0.8337 | 3.8853 | 0.0114 | 0.9495 | 4 |
| 4.6102 | 0.0118 | 0.8215 | 3.8884 | 0.0115 | 0.9205 | 5 |
| 4.5940 | 0.0118 | 0.8132 | 3.8409 | 0.0116 | 0.9007 | 6 |
| 4.5703 | 0.0119 | 0.7971 | 3.8224 | 0.0116 | 0.9098 | 7 |
| 4.5470 | 0.0120 | 0.7822 | 3.8013 | 0.0116 | 0.8938 | 8 |
| 4.5219 | 0.0120 | 0.7679 | 3.7776 | 0.0117 | 0.8829 | 9 |
| 4.4859 | 0.0121 | 0.7519 | 3.7360 | 0.0118 | 0.8411 | 10 |
| 4.4408 | 0.0123 | 0.7412 | 3.6972 | 0.0118 | 0.8593 | 11 |
| 4.3774 | 0.0124 | 0.7240 | 3.6035 | 0.0121 | 0.8234 | 12 |
| 4.2906 | 0.0127 | 0.7168 | 3.5057 | 0.0123 | 0.8130 | 13 |
| 4.1748 | 0.0130 | 0.7090 | 3.3528 | 0.0127 | 0.7856 | 14 |
| 4.0214 | 0.0135 | 0.7048 | 3.2086 | 0.0130 | 0.7786 | 15 |
| 3.8434 | 0.0140 | 0.6918 | 3.0436 | 0.0134 | 0.7466 | 16 |
| 3.6564 | 0.0146 | 0.6797 | 2.8693 | 0.0138 | 0.7348 | 17 |
| 3.4565 | 0.0152 | 0.6658 | 2.6967 | 0.0143 | 0.7131 | 18 |
| 3.2849 | 0.0158 | 0.6496 | 2.5221 | 0.0148 | 0.6792 | 19 |
| 3.0761 | 0.0165 | 0.6273 | 2.3796 | 0.0153 | 0.6550 | 20 |
| 2.9131 | 0.0171 | 0.6028 | 2.2468 | 0.0156 | 0.6282 | 21 |
| 2.7468 | 0.0178 | 0.5812 | 2.1322 | 0.0160 | 0.6123 | 22 |
| 2.6133 | 0.0183 | 0.5606 | 2.1131 | 0.0160 | 0.5950 | 23 |
| 2.4732 | 0.0189 | 0.5367 | 2.0006 | 0.0164 | 0.5730 | 24 |
| 2.3339 | 0.0196 | 0.5164 | 1.8895 | 0.0168 | 0.5532 | 25 |
| 2.2300 | 0.0200 | 0.4985 | 1.8183 | 0.0172 | 0.5430 | 26 |
| 2.1215 | 0.0206 | 0.4787 | 1.7413 | 0.0174 | 0.5230 | 27 |
| 2.0359 | 0.0210 | 0.4608 | 1.6732 | 0.0177 | 0.5099 | 28 |
| 1.9219 | 0.0216 | 0.4433 | 1.6287 | 0.0179 | 0.4979 | 29 |
| 1.8477 | 0.0220 | 0.4274 | 1.6129 | 0.0180 | 0.4896 | 30 |
| 1.7778 | 0.0224 | 0.4123 | 1.5705 | 0.0182 | 0.4769 | 31 |
| 1.6924 | 0.0228 | 0.3980 | 1.5096 | 0.0185 | 0.4666 | 32 |
| 1.6389 | 0.0231 | 0.3847 | 1.4837 | 0.0186 | 0.4558 | 33 |
| 1.5687 | 0.0235 | 0.3710 | 1.4419 | 0.0188 | 0.4475 | 34 |
| 1.4742 | 0.0242 | 0.3566 | 1.4365 | 0.0188 | 0.4385 | 35 |
| 1.4321 | 0.0244 | 0.3452 | 1.4011 | 0.0190 | 0.4300 | 36 |
| 1.3712 | 0.0248 | 0.3328 | 1.4145 | 0.0189 | 0.4245 | 37 |
| 1.3249 | 0.0251 | 0.3231 | 1.3604 | 0.0192 | 0.4188 | 38 |
| 1.2787 | 0.0253 | 0.3113 | 1.3424 | 0.0193 | 0.4133 | 39 |
| 1.1978 | 0.0259 | 0.2997 | 1.3275 | 0.0194 | 0.4068 | 40 |
| 1.1643 | 0.0261 | 0.2893 | 1.3122 | 0.0195 | 0.4019 | 41 |
| 1.1043 | 0.0266 | 0.2785 | 1.3011 | 0.0195 | 0.3964 | 42 |
| 1.0828 | 0.0267 | 0.2712 | 1.3235 | 0.0195 | 0.3932 | 43 |
| 1.0231 | 0.0272 | 0.2601 | 1.2947 | 0.0196 | 0.3873 | 44 |
| 0.9817 | 0.0275 | 0.2511 | 1.2785 | 0.0198 | 0.3855 | 45 |
| 0.9612 | 0.0276 | 0.2430 | 1.2755 | 0.0198 | 0.3831 | 46 |
| 0.9191 | 0.0280 | 0.2330 | 1.2522 | 0.0199 | 0.3744 | 47 |
| 0.8629 | 0.0284 | 0.2234 | 1.2679 | 0.0199 | 0.3759 | 48 |
| 0.8454 | 0.0285 | 0.2159 | 1.2575 | 0.0199 | 0.3732 | 49 |
| 0.7928 | 0.0290 | 0.2064 | 1.2505 | 0.0200 | 0.3693 | 50 |
| 0.7814 | 0.0291 | 0.2003 | 1.2450 | 0.0200 | 0.3669 | 51 |
| 0.7775 | 0.0291 | 0.1948 | 1.2965 | 0.0198 | 0.3662 | 52 |
| 0.7274 | 0.0295 | 0.1858 | 1.2472 | 0.0201 | 0.3663 | 53 |
| 0.7132 | 0.0296 | 0.1787 | 1.2389 | 0.0201 | 0.3591 | 54 |
| 0.6941 | 0.0298 | 0.1731 | 1.2386 | 0.0201 | 0.3564 | 55 |
| 0.6327 | 0.0304 | 0.1633 | 1.2437 | 0.0201 | 0.3557 | 56 |
| 0.6090 | 0.0306 | 0.1569 | 1.2333 | 0.0202 | 0.3532 | 57 |
| 0.6347 | 0.0303 | 0.1561 | 1.2425 | 0.0202 | 0.3564 | 58 |
| 0.5639 | 0.0310 | 0.1447 | 1.2261 | 0.0202 | 0.3526 | 59 |
| 0.5470 | 0.0311 | 0.1383 | 1.2476 | 0.0202 | 0.3497 | 60 |
| 0.5209 | 0.0314 | 0.1324 | 1.2304 | 0.0203 | 0.3489 | 61 |
| 0.5326 | 0.0313 | 0.1300 | 1.2410 | 0.0203 | 0.3482 | 62 |
| 0.5106 | 0.0314 | 0.1236 | 1.2508 | 0.0203 | 0.3462 | 63 |
| 0.4984 | 0.0315 | 0.1197 | 1.2628 | 0.0202 | 0.3489 | 64 |
| 0.4676 | 0.0319 | 0.1129 | 1.2352 | 0.0204 | 0.3446 | 65 |
| 0.4612 | 0.0319 | 0.1091 | 1.2369 | 0.0204 | 0.3441 | 66 |
| 0.4367 | 0.0321 | 0.1030 | 1.2393 | 0.0204 | 0.3440 | 67 |
| 0.4165 | 0.0323 | 0.0984 | 1.2412 | 0.0204 | 0.3413 | 68 |
| 0.4154 | 0.0323 | 0.0955 | 1.2445 | 0.0204 | 0.3423 | 69 |
| 0.4130 | 0.0323 | 0.0922 | 1.2361 | 0.0205 | 0.3408 | 70 |
| 0.3758 | 0.0327 | 0.0857 | 1.2439 | 0.0204 | 0.3406 | 71 |
| 0.3566 | 0.0329 | 0.0812 | 1.2406 | 0.0205 | 0.3392 | 72 |
| 0.3232 | 0.0333 | 0.0750 | 1.2572 | 0.0205 | 0.3386 | 73 |
| 0.3755 | 0.0327 | 0.0776 | 1.2465 | 0.0205 | 0.3372 | 74 |
| 0.3485 | 0.0330 | 0.0733 | 1.2488 | 0.0205 | 0.3366 | 75 |
| 0.2972 | 0.0335 | 0.0651 | 1.2797 | 0.0204 | 0.3408 | 76 |
| 0.3179 | 0.0334 | 0.0649 | 1.2714 | 0.0205 | 0.3368 | 77 |
| 0.2738 | 0.0338 | 0.0579 | 1.2693 | 0.0205 | 0.3384 | 78 |
| 0.2658 | 0.0339 | 0.0544 | 1.2769 | 0.0205 | 0.3385 | 79 |
| 0.2337 | 0.0342 | 0.0493 | 1.2862 | 0.0205 | 0.3363 | 80 |
| 0.2667 | 0.0338 | 0.0513 | 1.2851 | 0.0205 | 0.3379 | 81 |
| 0.2639 | 0.0338 | 0.0488 | 1.2911 | 0.0205 | 0.3382 | 82 |
| 0.2682 | 0.0338 | 0.0498 | 1.2893 | 0.0205 | 0.3370 | 83 |
| 0.2325 | 0.0341 | 0.0442 | 1.2965 | 0.0205 | 0.3380 | 84 |
| 0.2534 | 0.0339 | 0.0452 | 1.2888 | 0.0205 | 0.3366 | 85 |
| 0.2503 | 0.0340 | 0.0429 | 1.3043 | 0.0205 | 0.3373 | 86 |
| 0.1962 | 0.0346 | 0.0367 | 1.3363 | 0.0205 | 0.3396 | 87 |
| 0.1735 | 0.0348 | 0.0329 | 1.3139 | 0.0205 | 0.3372 | 88 |
| 0.1664 | 0.0349 | 0.0316 | 1.3461 | 0.0204 | 0.3397 | 89 |
| 0.1891 | 0.0346 | 0.0335 | 1.3275 | 0.0206 | 0.3378 | 90 |
| 0.2370 | 0.0343 | 0.0369 | 1.3073 | 0.0206 | 0.3356 | 91 |
| 0.2429 | 0.0341 | 0.0380 | 1.3116 | 0.0205 | 0.3351 | 92 |
| 0.1635 | 0.0349 | 0.0272 | 1.3342 | 0.0205 | 0.3385 | 93 |
| 0.1326 | 0.0352 | 0.0235 | 1.3294 | 0.0206 | 0.3370 | 94 |
| 0.1322 | 0.0352 | 0.0224 | 1.3418 | 0.0206 | 0.3357 | 95 |
| 0.1357 | 0.0352 | 0.0224 | 1.3476 | 0.0206 | 0.3372 | 96 |
| 0.1394 | 0.0351 | 0.0230 | 1.3578 | 0.0206 | 0.3364 | 97 |
| 0.1370 | 0.0351 | 0.0224 | 1.3559 | 0.0206 | 0.3366 | 98 |
| 0.1700 | 0.0349 | 0.0259 | 1.3587 | 0.0205 | 0.3371 | 99 |
| 0.1102 | 0.0354 | 0.0183 | 1.3664 | 0.0206 | 0.3364 | 100 |
| 0.1039 | 0.0355 | 0.0175 | 1.4739 | 0.0203 | 0.3433 | 101 |
| 0.1094 | 0.0354 | 0.0184 | 1.3800 | 0.0206 | 0.3378 | 102 |
| 0.1426 | 0.0350 | 0.0203 | 1.3797 | 0.0206 | 0.3370 | 103 |
| 0.1132 | 0.0353 | 0.0175 | 1.4130 | 0.0205 | 0.3384 | 104 |
| 0.1029 | 0.0354 | 0.0160 | 1.4131 | 0.0205 | 0.3381 | 105 |
| 0.1241 | 0.0352 | 0.0183 | 1.3905 | 0.0206 | 0.3361 | 106 |
| 0.0869 | 0.0356 | 0.0133 | 1.4050 | 0.0206 | 0.3369 | 107 |
| 0.0805 | 0.0356 | 0.0138 | 1.4236 | 0.0206 | 0.3361 | 108 |
| 0.0842 | 0.0357 | 0.0154 | 1.4232 | 0.0206 | 0.3348 | 109 |
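The `Wermet` columns above track a word-error-rate-style metric. The exact metric implementation used during training is not documented in this card, but a minimal sketch of the standard word-level WER (Levenshtein edit distance between reference and hypothesis, normalized by reference length) looks like this:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count.

    Standard definition; the metric used for the 'Wermet' columns in this
    card's training log may differ in normalization or tokenization.
    """
    ref = reference.split()
    hyp = hypothesis.split()
    if not ref:
        return 0.0 if not hyp else float("inf")
    # prev[j] holds the edit distance between ref[:i-1] and hyp[:j]
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        cur = [i] + [0] * len(hyp)
        for j, h in enumerate(hyp, start=1):
            cur[j] = min(
                prev[j - 1] + (r != h),  # substitution (free if words match)
                prev[j] + 1,             # deletion from reference
                cur[j - 1] + 1,          # insertion into hypothesis
            )
        prev = cur
    return prev[len(hyp)] / len(ref)
```

For example, `wer("the cat sat on the mat", "the cat sat on mat")` counts one deletion against six reference words. Note that a WER above 1.0 (as in epoch 0's Train Wermet of 1.1095) is possible when the hypothesis contains more errors than the reference has words.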

### Framework versions