# kwav2vec-er-7-10-10000-long-16
This model is a fine-tuned version of [facebook/wav2vec2-large-robust](https://huggingface.co/facebook/wav2vec2-large-robust) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.4867
- Accuracy: 0.8596
## Model description
More information needed
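The checkpoint name ("er") and the accuracy metric suggest a wav2vec2 sequence-classification model (e.g. emotion recognition). Below is a minimal, hedged preprocessing sketch: it uses the standard wav2vec2 feature-extractor defaults (raw 16 kHz mono waveforms, as expected by `facebook/wav2vec2-large-robust`); the model-loading line at the end is an assumption about where this checkpoint lives and is left commented out.

```python
import numpy as np
from transformers import Wav2Vec2FeatureExtractor

# wav2vec2 models consume raw 16 kHz mono waveforms; the default
# extractor settings (feature_size=1, zero-mean/unit-variance
# normalization) match the wav2vec2 family.
feature_extractor = Wav2Vec2FeatureExtractor(sampling_rate=16000)

# One second of dummy audio standing in for a real recording.
waveform = np.zeros(16000, dtype=np.float32)
inputs = feature_extractor(waveform, sampling_rate=16000, return_tensors="np")
print(inputs.input_values.shape)  # (1, 16000)

# To classify real audio, load the fine-tuned head on top
# (assumption: the id below is a local path or Hub repo for this model):
# from transformers import AutoModelForAudioClassification
# model = AutoModelForAudioClassification.from_pretrained("kwav2vec-er-7-10-10000-long-16")
```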
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 10
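The `linear` scheduler with 1,000 warmup steps ramps the learning rate from 0 up to 1e-4, then decays it linearly back to 0 over the remaining steps (roughly 35,000 total here, inferred from the step/epoch log below). A minimal sketch of that schedule, assuming the semantics of transformers' linear warmup scheduler:

```python
def linear_warmup_lr(step, base_lr=1e-4, warmup_steps=1000, total_steps=35000):
    """Learning rate at a given optimizer step under a linear
    warmup-then-decay schedule (transformers' `linear` lr_scheduler_type).
    total_steps is approximate, inferred from the training log."""
    if step < warmup_steps:
        # Ramp up proportionally to the step count during warmup.
        return base_lr * step / warmup_steps
    # Decay linearly from base_lr at the end of warmup to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_lr(500))    # halfway through warmup: ~5e-05
print(linear_warmup_lr(1000))   # peak learning rate: ~1e-04
```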
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
1.9453 | 0.09 | 300 | 1.9461 | 0.1429 |
1.7814 | 0.17 | 600 | 1.7560 | 0.2804 |
1.6983 | 0.26 | 900 | 1.6549 | 0.33 |
1.6332 | 0.34 | 1200 | 1.6167 | 0.35 |
1.4671 | 0.43 | 1500 | 1.4799 | 0.4219 |
1.3819 | 0.51 | 1800 | 1.4087 | 0.4621 |
1.4801 | 0.6 | 2100 | 1.3409 | 0.4834 |
1.3156 | 0.69 | 2400 | 1.2844 | 0.5129 |
1.2603 | 0.77 | 2700 | 1.3334 | 0.4906 |
1.338 | 0.86 | 3000 | 1.2715 | 0.5219 |
1.1885 | 0.94 | 3300 | 1.2114 | 0.557 |
1.1623 | 1.03 | 3600 | 1.1729 | 0.566 |
1.2912 | 1.11 | 3900 | 1.1854 | 0.5459 |
1.259 | 1.2 | 4200 | 1.1487 | 0.5757 |
1.0818 | 1.29 | 4500 | 1.1181 | 0.591 |
1.2465 | 1.37 | 4800 | 1.0425 | 0.6049 |
1.2297 | 1.46 | 5100 | 1.0993 | 0.5879 |
1.037 | 1.54 | 5400 | 1.0155 | 0.6141 |
0.9652 | 1.63 | 5700 | 0.9698 | 0.635 |
1.0562 | 1.71 | 6000 | 0.9940 | 0.6206 |
0.9646 | 1.8 | 6300 | 0.9526 | 0.6434 |
1.0492 | 1.89 | 6600 | 0.9540 | 0.6444 |
0.9303 | 1.97 | 6900 | 0.9079 | 0.6619 |
0.8979 | 2.06 | 7200 | 0.9684 | 0.6567 |
1.0744 | 2.14 | 7500 | 0.9537 | 0.6546 |
0.7805 | 2.23 | 7800 | 0.9472 | 0.659 |
0.9588 | 2.31 | 8100 | 0.8777 | 0.683 |
1.0313 | 2.4 | 8400 | 0.8792 | 0.6824 |
0.7308 | 2.49 | 8700 | 0.9444 | 0.6753 |
0.782 | 2.57 | 9000 | 0.7901 | 0.7159 |
0.5972 | 2.66 | 9300 | 0.7908 | 0.7179 |
0.9377 | 2.74 | 9600 | 0.7825 | 0.7186 |
0.7549 | 2.83 | 9900 | 0.7933 | 0.7166 |
0.9246 | 2.91 | 10200 | 0.8237 | 0.7059 |
0.723 | 3.0 | 10500 | 0.7964 | 0.7184 |
0.6826 | 3.09 | 10800 | 0.7832 | 0.7204 |
0.48 | 3.17 | 11100 | 0.7602 | 0.7326 |
0.9544 | 3.26 | 11400 | 0.7496 | 0.7356 |
0.6225 | 3.34 | 11700 | 0.7251 | 0.7416 |
0.6864 | 3.43 | 12000 | 0.6893 | 0.7477 |
0.6533 | 3.51 | 12300 | 0.7032 | 0.7449 |
0.6506 | 3.6 | 12600 | 0.6915 | 0.7531 |
0.719 | 3.69 | 12900 | 0.7114 | 0.7534 |
0.7465 | 3.77 | 13200 | 0.6245 | 0.7683 |
0.5865 | 3.86 | 13500 | 0.6307 | 0.7686 |
0.763 | 3.94 | 13800 | 0.6793 | 0.764 |
0.6104 | 4.03 | 14100 | 0.6804 | 0.7606 |
0.609 | 4.11 | 14400 | 0.6740 | 0.7663 |
0.4399 | 4.2 | 14700 | 0.7068 | 0.7687 |
0.5307 | 4.29 | 15000 | 0.6423 | 0.7751 |
0.5585 | 4.37 | 15300 | 0.6267 | 0.7829 |
0.5531 | 4.46 | 15600 | 0.5915 | 0.789 |
0.5812 | 4.54 | 15900 | 0.5750 | 0.7893 |
0.6432 | 4.63 | 16200 | 0.5935 | 0.7899 |
0.4807 | 4.71 | 16500 | 0.6042 | 0.7836 |
0.6689 | 4.8 | 16800 | 0.5938 | 0.7901 |
0.4702 | 4.89 | 17100 | 0.5929 | 0.7963 |
0.5037 | 4.97 | 17400 | 0.5796 | 0.8023 |
0.3308 | 5.06 | 17700 | 0.5446 | 0.8074 |
0.3866 | 5.14 | 18000 | 0.6131 | 0.7953 |
0.4683 | 5.23 | 18300 | 0.5761 | 0.8041 |
0.639 | 5.31 | 18600 | 0.5644 | 0.8009 |
0.4903 | 5.4 | 18900 | 0.6093 | 0.7934 |
0.3845 | 5.49 | 19200 | 0.5327 | 0.8139 |
0.4039 | 5.57 | 19500 | 0.6041 | 0.7983 |
0.5043 | 5.66 | 19800 | 0.5686 | 0.813 |
0.42 | 5.74 | 20100 | 0.5237 | 0.823 |
0.5379 | 5.83 | 20400 | 0.5690 | 0.8059 |
0.4729 | 5.91 | 20700 | 0.5232 | 0.8227 |
0.4964 | 6.0 | 21000 | 0.5268 | 0.8211 |
0.4058 | 6.09 | 21300 | 0.5601 | 0.8093 |
0.2957 | 6.17 | 21600 | 0.5131 | 0.8283 |
0.3008 | 6.26 | 21900 | 0.6069 | 0.8029 |
0.474 | 6.34 | 22200 | 0.5515 | 0.8177 |
0.3078 | 6.43 | 22500 | 0.5689 | 0.8187 |
0.361 | 6.51 | 22800 | 0.5430 | 0.8243 |
0.4732 | 6.6 | 23100 | 0.5465 | 0.8251 |
0.4066 | 6.69 | 23400 | 0.5511 | 0.8179 |
0.366 | 6.77 | 23700 | 0.5905 | 0.8173 |
0.3134 | 6.86 | 24000 | 0.5393 | 0.8254 |
0.3057 | 6.94 | 24300 | 0.5397 | 0.8261 |
0.3149 | 7.03 | 24600 | 0.5020 | 0.8319 |
0.2606 | 7.11 | 24900 | 0.4888 | 0.8439 |
0.3088 | 7.2 | 25200 | 0.6309 | 0.8099 |
0.2557 | 7.29 | 25500 | 0.5074 | 0.8406 |
0.3217 | 7.37 | 25800 | 0.5336 | 0.833 |
0.3753 | 7.46 | 26100 | 0.5225 | 0.8377 |
0.3168 | 7.54 | 26400 | 0.4932 | 0.8414 |
0.2784 | 7.63 | 26700 | 0.5103 | 0.8343 |
0.2869 | 7.71 | 27000 | 0.5633 | 0.8284 |
0.3714 | 7.8 | 27300 | 0.5300 | 0.8373 |
0.2503 | 7.89 | 27600 | 0.4961 | 0.8481 |
0.4205 | 7.97 | 27900 | 0.4790 | 0.8509 |
0.4278 | 8.06 | 28200 | 0.4962 | 0.8427 |
0.378 | 8.14 | 28500 | 0.5262 | 0.8416 |
0.4221 | 8.23 | 28800 | 0.5033 | 0.8481 |
0.4499 | 8.31 | 29100 | 0.5152 | 0.8446 |
0.269 | 8.4 | 29400 | 0.5285 | 0.8451 |
0.2446 | 8.49 | 29700 | 0.5129 | 0.8484 |
0.264 | 8.57 | 30000 | 0.5310 | 0.844 |
0.2721 | 8.66 | 30300 | 0.4810 | 0.8541 |
0.2618 | 8.74 | 30600 | 0.4979 | 0.8503 |
0.2559 | 8.83 | 30900 | 0.4783 | 0.8567 |
0.2825 | 8.91 | 31200 | 0.4857 | 0.8547 |
0.2502 | 9.0 | 31500 | 0.5035 | 0.853 |
0.2508 | 9.09 | 31800 | 0.5179 | 0.8501 |
0.3484 | 9.17 | 32100 | 0.4996 | 0.8559 |
0.2507 | 9.26 | 32400 | 0.4832 | 0.8549 |
0.346 | 9.34 | 32700 | 0.4954 | 0.8563 |
0.1667 | 9.43 | 33000 | 0.4910 | 0.8579 |
0.2654 | 9.51 | 33300 | 0.4944 | 0.8567 |
0.2123 | 9.6 | 33600 | 0.4905 | 0.857 |
0.2625 | 9.69 | 33900 | 0.4918 | 0.8596 |
0.2404 | 9.77 | 34200 | 0.4813 | 0.8593 |
0.2144 | 9.86 | 34500 | 0.4851 | 0.8619 |
0.3287 | 9.94 | 34800 | 0.4867 | 0.8596 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3