
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. -->

# whisper_input_decoder_shift_r_labels_no_force__0090

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. Per-epoch results on the evaluation set are listed in the training results table below.

## Model description

More information needed

## Intended uses & limitations

More information needed
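
No usage details were recorded for this checkpoint, so the snippet below is only a minimal inference sketch. It assumes the fine-tuned weights load the same way as the base `openai/whisper-tiny` checkpoint via the TensorFlow Whisper classes in 🤗 Transformers; the model id shown is a placeholder, not this model's actual repository id.

```python
# Minimal inference sketch (assumption: the fine-tuned checkpoint loads like
# the base openai/whisper-tiny model; replace the id with the actual repo).
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

model_id = "openai/whisper-tiny"  # placeholder; substitute the fine-tuned repo id
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

# 16 kHz mono audio as a float array (here: one second of silence as a stand-in).
audio = np.zeros(16_000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16_000, return_tensors="tf")

# Generate token ids and decode them back to text.
generated_ids = model.generate(input_features=inputs.input_features, max_new_tokens=128)
transcription = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(transcription)
```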

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.6348 | 0.0091 | 1.5865 | 4.2935 | 0.0093 | 0.9579 | 0 |
| 4.9212 | 0.0099 | 0.9054 | 4.1262 | 0.0097 | 0.9390 | 1 |
| 4.6819 | 0.0107 | 0.8319 | 3.9071 | 0.0103 | 0.8966 | 2 |
| 4.4443 | 0.0114 | 0.8310 | 3.7367 | 0.0106 | 0.8939 | 3 |
| 4.2479 | 0.0119 | 0.8226 | 3.6101 | 0.0109 | 0.8696 | 4 |
| 4.0911 | 0.0124 | 0.8103 | 3.5364 | 0.0110 | 0.8946 | 5 |
| 3.9590 | 0.0127 | 0.7913 | 3.4556 | 0.0113 | 0.8388 | 6 |
| 3.8513 | 0.0130 | 0.7794 | 3.4106 | 0.0114 | 0.8515 | 7 |
| 3.7607 | 0.0133 | 0.7657 | 3.3507 | 0.0115 | 0.8261 | 8 |
| 3.6757 | 0.0136 | 0.7548 | 3.3141 | 0.0116 | 0.8400 | 9 |
| 3.6023 | 0.0138 | 0.7454 | 3.2711 | 0.0117 | 0.8006 | 10 |
| 3.5261 | 0.0140 | 0.7348 | 3.2391 | 0.0119 | 0.8101 | 11 |
| 3.4534 | 0.0143 | 0.7212 | 3.2070 | 0.0120 | 0.7870 | 12 |
| 3.3814 | 0.0146 | 0.7080 | 3.1505 | 0.0122 | 0.7826 | 13 |
| 3.3069 | 0.0148 | 0.6961 | 3.1102 | 0.0124 | 0.7609 | 14 |
| 3.2229 | 0.0152 | 0.6781 | 3.0542 | 0.0125 | 0.7532 | 15 |
| 3.1334 | 0.0156 | 0.6614 | 2.9840 | 0.0127 | 0.7448 | 16 |
| 3.0313 | 0.0160 | 0.6425 | 2.9032 | 0.0130 | 0.7123 | 17 |
| 2.9122 | 0.0166 | 0.6202 | 2.7986 | 0.0134 | 0.6930 | 18 |
| 2.7559 | 0.0173 | 0.5940 | 2.6337 | 0.0139 | 0.6673 | 19 |
| 2.5649 | 0.0182 | 0.5674 | 2.4490 | 0.0145 | 0.6383 | 20 |
| 2.3414 | 0.0193 | 0.5299 | 2.2785 | 0.0150 | 0.6183 | 21 |
| 2.0966 | 0.0206 | 0.4903 | 2.0460 | 0.0158 | 0.5649 | 22 |
| 1.8283 | 0.0220 | 0.4459 | 1.8369 | 0.0165 | 0.5306 | 23 |
| 1.5547 | 0.0235 | 0.3996 | 1.6356 | 0.0172 | 0.4848 | 24 |
| 1.3218 | 0.0249 | 0.3581 | 1.4682 | 0.0179 | 0.4510 | 25 |
| 1.1383 | 0.0260 | 0.3211 | 1.3465 | 0.0183 | 0.4226 | 26 |
| 0.9876 | 0.0270 | 0.2920 | 1.2323 | 0.0188 | 0.3966 | 27 |
| 0.8635 | 0.0278 | 0.2651 | 1.1482 | 0.0191 | 0.3749 | 28 |
| 0.7620 | 0.0284 | 0.2435 | 1.0816 | 0.0194 | 0.3565 | 29 |
| 0.6749 | 0.0290 | 0.2234 | 1.0187 | 0.0196 | 0.3433 | 30 |
| 0.5998 | 0.0295 | 0.2025 | 0.9761 | 0.0198 | 0.3319 | 31 |
| 0.5325 | 0.0300 | 0.1827 | 0.9326 | 0.0200 | 0.3213 | 32 |
| 0.4735 | 0.0305 | 0.1665 | 0.8942 | 0.0201 | 0.3110 | 33 |
| 0.4228 | 0.0308 | 0.1466 | 0.8735 | 0.0202 | 0.3026 | 34 |
| 0.3747 | 0.0312 | 0.1293 | 0.8408 | 0.0203 | 0.2931 | 35 |
| 0.3331 | 0.0316 | 0.1111 | 0.8253 | 0.0204 | 0.2891 | 36 |
| 0.2947 | 0.0319 | 0.0962 | 0.8084 | 0.0205 | 0.2849 | 37 |
| 0.2601 | 0.0322 | 0.0817 | 0.7906 | 0.0205 | 0.2783 | 38 |
| 0.2291 | 0.0324 | 0.0706 | 0.7876 | 0.0206 | 0.2755 | 39 |
| 0.2009 | 0.0327 | 0.0596 | 0.7723 | 0.0207 | 0.2712 | 40 |
| 0.1750 | 0.0329 | 0.0504 | 0.7629 | 0.0207 | 0.2692 | 41 |
| 0.1510 | 0.0331 | 0.0410 | 0.7650 | 0.0207 | 0.2684 | 42 |
| 0.1319 | 0.0333 | 0.0367 | 0.7533 | 0.0207 | 0.2655 | 43 |
| 0.1121 | 0.0335 | 0.0292 | 0.7589 | 0.0207 | 0.2647 | 44 |
| 0.0956 | 0.0336 | 0.0253 | 0.7579 | 0.0208 | 0.2642 | 45 |
| 0.0812 | 0.0337 | 0.0254 | 0.7584 | 0.0208 | 0.2625 | 46 |
| 0.0694 | 0.0338 | 0.0332 | 0.7555 | 0.0208 | 0.2693 | 47 |
| 0.0592 | 0.0339 | 0.0319 | 0.7534 | 0.0208 | 0.2629 | 48 |
| 0.0499 | 0.0339 | 0.0487 | 0.7587 | 0.0208 | 0.3030 | 49 |
| 0.0409 | 0.0339 | 0.0615 | 0.7577 | 0.0208 | 0.2810 | 50 |
| 0.0347 | 0.0340 | 0.0859 | 0.7603 | 0.0208 | 0.3534 | 51 |
| 0.0286 | 0.0340 | 0.1928 | 0.7554 | 0.0209 | 0.5822 | 52 |
| 0.0267 | 0.0340 | 0.3131 | 0.7664 | 0.0208 | 1.7372 | 53 |
| 0.0243 | 0.0340 | 1.3154 | 0.7525 | 0.0209 | 0.7770 | 54 |
| 0.0206 | 0.0340 | 0.8121 | 0.7532 | 0.0209 | 0.9253 | 55 |
| 0.0174 | 0.0340 | 0.9253 | 0.7574 | 0.0209 | 1.4865 | 56 |
| 0.0135 | 0.0340 | 1.1761 | 0.7592 | 0.0209 | 1.5813 | 57 |
| 0.0111 | 0.0340 | 1.7125 | 0.7631 | 0.0209 | 1.8950 | 58 |
| 0.0096 | 0.0340 | 1.9230 | 0.7664 | 0.0209 | 2.4432 | 59 |
| 0.0082 | 0.0340 | 2.5718 | 0.7693 | 0.0209 | 3.3565 | 60 |
| 0.0073 | 0.0340 | 3.5489 | 0.7747 | 0.0209 | 3.7191 | 61 |
| 0.0063 | 0.0340 | 3.7801 | 0.7756 | 0.0209 | 4.4728 | 62 |
| 0.0054 | 0.0340 | 4.0145 | 0.7795 | 0.0209 | 5.0058 | 63 |
| 0.0048 | 0.0340 | 4.9652 | 0.7821 | 0.0210 | 4.9937 | 64 |
| 0.0042 | 0.0340 | 5.5984 | 0.7914 | 0.0209 | 8.3869 | 65 |
| 0.0205 | 0.0339 | 9.9212 | 0.7811 | 0.0209 | 21.1156 | 66 |
| 0.0184 | 0.0339 | 8.3175 | 0.7619 | 0.0210 | 0.5360 | 67 |
| 0.0080 | 0.0340 | 0.6373 | 0.7554 | 0.0211 | 0.4090 | 68 |
| 0.0052 | 0.0340 | 0.5550 | 0.7528 | 0.0211 | 0.3938 | 69 |
| 0.0038 | 0.0340 | 0.4678 | 0.7551 | 0.0211 | 0.7911 | 70 |
| 0.0032 | 0.0340 | 1.1632 | 0.7617 | 0.0211 | 0.5495 | 71 |
| 0.0028 | 0.0340 | 0.7869 | 0.7643 | 0.0211 | 1.4089 | 72 |
| 0.0025 | 0.0340 | 1.5997 | 0.7681 | 0.0211 | 1.1413 | 73 |
| 0.0023 | 0.0340 | 1.7042 | 0.7719 | 0.0211 | 1.7576 | 74 |
| 0.0021 | 0.0340 | 2.3363 | 0.7750 | 0.0211 | 2.2434 | 75 |
| 0.0019 | 0.0340 | 2.9550 | 0.7777 | 0.0211 | 2.3071 | 76 |
| 0.0017 | 0.0340 | 3.1713 | 0.7831 | 0.0211 | 3.3338 | 77 |
| 0.0015 | 0.0340 | 3.9077 | 0.7852 | 0.0211 | 3.6442 | 78 |
| 0.0014 | 0.0340 | 4.3375 | 0.7900 | 0.0211 | 4.0113 | 79 |
| 0.0013 | 0.0340 | 4.9777 | 0.7946 | 0.0211 | 5.1689 | 80 |
| 0.0011 | 0.0340 | 5.9846 | 0.7968 | 0.0211 | 5.6006 | 81 |
| 0.0010 | 0.0340 | 6.6595 | 0.8033 | 0.0211 | 6.1998 | 82 |
| 0.0009 | 0.0340 | 7.3520 | 0.8058 | 0.0211 | 7.6034 | 83 |
| 0.0008 | 0.0340 | 8.1210 | 0.8138 | 0.0211 | 7.8284 | 84 |
| 0.0007 | 0.0340 | 8.9352 | 0.8170 | 0.0211 | 9.1346 | 85 |
| 0.0006 | 0.0340 | 10.2307 | 0.8185 | 0.0211 | 10.8739 | 86 |
| 0.0006 | 0.0340 | 12.2734 | 0.8245 | 0.0211 | 12.5682 | 87 |
| 0.0005 | 0.0340 | 13.1276 | 0.8314 | 0.0211 | 14.4535 | 88 |
| 0.0124 | 0.0339 | 14.3527 | 0.8265 | 0.0209 | 32.3895 | 89 |
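
In the table above, validation Wermet is lowest around epoch 46 and validation loss around epoch 54, while later epochs diverge sharply. As a convenience, here is a small sketch, assuming the table has been exported to a hypothetical `training_results.csv` with the same column names, for locating the best-validation epochs:

```python
# Sketch (not part of the original card): load the epoch metrics above into
# pandas and report the epochs with the lowest validation loss / Wermet.
import pandas as pd

# Assumes the table was exported to CSV with the column names shown above.
df = pd.read_csv("training_results.csv")

best_loss = df.loc[df["Validation Loss"].idxmin()]
best_wermet = df.loc[df["Validation Wermet"].idxmin()]
print("Lowest validation loss at epoch", int(best_loss["Epoch"]))
print("Lowest validation Wermet at epoch", int(best_wermet["Epoch"]))
```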

### Framework versions