# whisper-tiny-cb
This model is a fine-tuned version of [openai/whisper-tiny.en](https://huggingface.co/openai/whisper-tiny.en) on the audiofolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.2940
- Wer (word error rate): 0.1396
- Mer (match error rate): 0.1346
- Wil (word information lost): 0.2111
- Wip (word information preserved): 0.7889
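The four error measures above follow the jiwer-style definitions (the exact evaluation library is not documented in this card, so that is an assumption). As a reference point, the core metric, word error rate, is the word-level edit distance between reference and hypothesis divided by the reference length. A minimal sketch:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two word lists via dynamic programming."""
    m, n = len(ref), len(hyp)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i  # i deletions to reach empty hypothesis
    for j in range(n + 1):
        d[0][j] = j  # j insertions from empty reference
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # match / substitution
    return d[m][n]


def wer(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + deletions + insertions) / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    return edit_distance(ref, hyp) / len(ref)
```

For example, dropping one word out of a six-word reference yields a WER of 1/6 ≈ 0.167. MER, WIL, and WIP are derived from the same alignment counts.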
## Model description

whisper-tiny-cb is a fine-tuned version of `openai/whisper-tiny.en`, the English-only "tiny" variant of OpenAI's Whisper automatic speech recognition model, adapted to a custom audio dataset loaded in the audiofolder format. Further details about the fine-tuning domain are not yet documented.
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-06
- train_batch_size: 48
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 1500
- mixed_precision_training: Native AMP
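The learning-rate schedule implied by these settings (assuming the usual Hugging Face `get_linear_schedule_with_warmup` behavior, which the `linear` scheduler type with warmup steps suggests) ramps linearly from 0 to 3e-06 over the first 500 steps, then decays linearly back to 0 at step 1500. A minimal sketch:

```python
def linear_warmup_decay_lr(step, base_lr=3e-6, warmup_steps=500, total_steps=1500):
    """Learning rate at a given step: linear warmup, then linear decay to zero."""
    if step < warmup_steps:
        # warmup phase: 0 -> base_lr over warmup_steps
        return base_lr * step / warmup_steps
    # decay phase: base_lr -> 0 between warmup_steps and total_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

So the peak rate of 3e-06 is reached exactly at step 500, and the rate at, say, step 1000 is half that.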
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer | Mer | Wil | Wip |
---|---|---|---|---|---|---|---|
No log | 0.1 | 10 | 3.5619 | 0.4169 | 0.3723 | 0.5496 | 0.4504 |
No log | 0.21 | 20 | 3.5594 | 0.4169 | 0.3723 | 0.5496 | 0.4504 |
No log | 0.31 | 30 | 3.5458 | 0.4159 | 0.3714 | 0.5483 | 0.4517 |
No log | 0.41 | 40 | 3.5232 | 0.4130 | 0.3694 | 0.5466 | 0.4534 |
3.2935 | 0.52 | 50 | 3.4551 | 0.4120 | 0.3705 | 0.5481 | 0.4519 |
3.2935 | 0.62 | 60 | 3.3467 | 0.4110 | 0.3693 | 0.5459 | 0.4541 |
3.2935 | 0.72 | 70 | 3.2663 | 0.4130 | 0.3710 | 0.5485 | 0.4515 |
3.2935 | 0.82 | 80 | 3.1465 | 0.4100 | 0.3687 | 0.5455 | 0.4545 |
3.2935 | 0.93 | 90 | 3.0399 | 0.4031 | 0.3638 | 0.5396 | 0.4604 |
2.9491 | 1.03 | 100 | 2.9119 | 0.4002 | 0.3608 | 0.5345 | 0.4655 |
2.9491 | 1.13 | 110 | 2.8013 | 0.3982 | 0.3594 | 0.5332 | 0.4668 |
2.9491 | 1.24 | 120 | 2.6663 | 0.4051 | 0.3643 | 0.5379 | 0.4621 |
2.9491 | 1.34 | 130 | 2.5040 | 0.4022 | 0.3623 | 0.5341 | 0.4659 |
2.9491 | 1.44 | 140 | 2.3220 | 0.3972 | 0.3604 | 0.5254 | 0.4746 |
2.3522 | 1.55 | 150 | 2.1172 | 0.3638 | 0.3342 | 0.4918 | 0.5082 |
2.3522 | 1.65 | 160 | 1.8729 | 0.3609 | 0.3315 | 0.4867 | 0.5133 |
2.3522 | 1.75 | 170 | 1.6150 | 0.3707 | 0.3409 | 0.4917 | 0.5083 |
2.3522 | 1.86 | 180 | 1.3932 | 0.4464 | 0.4131 | 0.5460 | 0.4540 |
2.3522 | 1.96 | 190 | 1.2081 | 0.3658 | 0.3454 | 0.4773 | 0.5227 |
1.4367 | 2.06 | 200 | 0.9976 | 0.3333 | 0.3127 | 0.4403 | 0.5597 |
1.4367 | 2.16 | 210 | 0.8126 | 0.2852 | 0.2698 | 0.4117 | 0.5883 |
1.4367 | 2.27 | 220 | 0.6996 | 0.2753 | 0.2610 | 0.3962 | 0.6038 |
1.4367 | 2.37 | 230 | 0.6348 | 0.2380 | 0.2264 | 0.3540 | 0.6460 |
1.4367 | 2.47 | 240 | 0.5850 | 0.2262 | 0.2156 | 0.3414 | 0.6586 |
0.7217 | 2.58 | 250 | 0.5463 | 0.2153 | 0.2049 | 0.3260 | 0.6740 |
0.7217 | 2.68 | 260 | 0.5130 | 0.2055 | 0.1957 | 0.3103 | 0.6897 |
0.7217 | 2.78 | 270 | 0.4870 | 0.1898 | 0.1819 | 0.2911 | 0.7089 |
0.7217 | 2.89 | 280 | 0.4659 | 0.1809 | 0.1739 | 0.2792 | 0.7208 |
0.7217 | 2.99 | 290 | 0.4492 | 0.1819 | 0.1747 | 0.2778 | 0.7222 |
0.5116 | 3.09 | 300 | 0.4375 | 0.1770 | 0.1698 | 0.2706 | 0.7294 |
0.5116 | 3.2 | 310 | 0.4252 | 0.1799 | 0.1722 | 0.2720 | 0.7280 |
0.5116 | 3.3 | 320 | 0.4112 | 0.1780 | 0.1704 | 0.2704 | 0.7296 |
0.5116 | 3.4 | 330 | 0.3990 | 0.1721 | 0.1653 | 0.2626 | 0.7374 |
0.5116 | 3.51 | 340 | 0.3887 | 0.1583 | 0.1523 | 0.2417 | 0.7583 |
0.4371 | 3.61 | 350 | 0.3777 | 0.1593 | 0.1534 | 0.2429 | 0.7571 |
0.4371 | 3.71 | 360 | 0.3625 | 0.1563 | 0.1503 | 0.2366 | 0.7634 |
0.4371 | 3.81 | 370 | 0.3523 | 0.1504 | 0.1449 | 0.2283 | 0.7717 |
0.4371 | 3.92 | 380 | 0.3488 | 0.1485 | 0.1430 | 0.2256 | 0.7744 |
0.4371 | 4.02 | 390 | 0.3443 | 0.1475 | 0.1420 | 0.2239 | 0.7761 |
0.3618 | 4.12 | 400 | 0.3383 | 0.1465 | 0.1411 | 0.2230 | 0.7770 |
0.3618 | 4.23 | 410 | 0.3323 | 0.1455 | 0.1404 | 0.2209 | 0.7791 |
0.3618 | 4.33 | 420 | 0.3284 | 0.1514 | 0.1461 | 0.2297 | 0.7703 |
0.3618 | 4.43 | 430 | 0.3201 | 0.1475 | 0.1419 | 0.2215 | 0.7785 |
0.3618 | 4.54 | 440 | 0.3140 | 0.1436 | 0.1383 | 0.2163 | 0.7837 |
0.3404 | 4.64 | 450 | 0.3087 | 0.1436 | 0.1384 | 0.2158 | 0.7842 |
0.3404 | 4.74 | 460 | 0.3040 | 0.1386 | 0.1342 | 0.2095 | 0.7905 |
0.3404 | 4.85 | 470 | 0.3006 | 0.1347 | 0.1305 | 0.2050 | 0.7950 |
0.3404 | 4.95 | 480 | 0.2963 | 0.1347 | 0.1304 | 0.2040 | 0.7960 |
0.3404 | 5.05 | 490 | 0.2944 | 0.1426 | 0.1372 | 0.2136 | 0.7864 |
0.3062 | 5.15 | 500 | 0.2955 | 0.1396 | 0.1345 | 0.2094 | 0.7906 |
0.3062 | 5.26 | 510 | 0.2940 | 0.1396 | 0.1346 | 0.2111 | 0.7889 |
### Framework versions
- Transformers 4.29.0.dev0
- PyTorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3