---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# retrain5_oneTimeTraining_MTL-1epoch

This model is a fine-tuned version of [alexziweiwang/exp21-uaspeech-foundation](https://huggingface.co/alexziweiwang/exp21-uaspeech-foundation) on an unknown dataset.
It achieves the following results on the evaluation set:

## Model description

More information needed

## Intended uses & limitations

More information needed
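
While the intended uses are not yet documented, the checkpoint can in principle be loaded like any other 🤗 Transformers audio model. The sketch below is only a starting point: it assumes a Wav2Vec2-style CTC interface and an assumed repository id (`alexziweiwang/retrain5_oneTimeTraining_MTL-1epoch`); the actual multi-task heads and hosting namespace are not documented in this card.

```python
# Minimal loading sketch -- assumes a Wav2Vec2-style CTC checkpoint; adjust the
# classes and repository id to match the actual multi-task model layout.
import torch
import soundfile as sf
from transformers import AutoProcessor, AutoModelForCTC

model_id = "alexziweiwang/retrain5_oneTimeTraining_MTL-1epoch"  # assumed repository id
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)

speech, sample_rate = sf.read("example.wav")  # 16 kHz mono audio expected by Wav2Vec2-style models
inputs = processor(speech, sampling_rate=sample_rate, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```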

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
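
The hyperparameter list itself was not captured in this card. Purely as an illustration of how such settings are typically passed to the 🤗 `Trainer`, the sketch below uses placeholder values; none of them are the values actually used for this run.

```python
# Illustrative only -- every value below is a placeholder, not the setting used
# for this training run (those were not recorded in this card).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="retrain5_oneTimeTraining_MTL-1epoch",
    learning_rate=1e-4,                 # placeholder
    per_device_train_batch_size=8,      # placeholder
    per_device_eval_batch_size=8,       # placeholder
    num_train_epochs=1,                 # the model name suggests a single epoch
    evaluation_strategy="steps",        # matches the per-step evaluation table below
    eval_steps=5,                       # matches the 5-step eval interval in the table
    logging_steps=10,
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)
```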

### Training results

| Training Loss | Epoch | Step | Validation Loss | Acc   | Wer    | Correct | Total | Strlen |
|:-------------:|:-----:|:----:|:---------------:|:-----:|:------:|:-------:|:-----:|:------:|
| No log        | 0.02  | 5    | 13.9337         | 0.01  | 1.2925 | 2       | 200   | 200    |
| 12.4373       | 0.04  | 10   | 13.7513         | 0.08  | 1.5296 | 16      | 200   | 200    |
| 12.4373       | 0.06  | 15   | 13.5517         | 0.125 | 2.1126 | 25      | 200   | 200    |
| 12.6667       | 0.08  | 20   | 13.3400         | 0.165 | 2.5791 | 33      | 200   | 200    |
| 12.6667       | 0.11  | 25   | 13.1141         | 0.205 | 3.6561 | 41      | 200   | 200    |
| 11.1856       | 0.13  | 30   | 12.8805         | 0.22  | 2.7451 | 44      | 200   | 200    |
| 11.1856       | 0.15  | 35   | 12.6423         | 0.245 | 2.5178 | 49      | 200   | 200    |
| 10.6635       | 0.17  | 40   | 12.4028         | 0.27  | 2.4308 | 54      | 200   | 200    |
| 10.6635       | 0.19  | 45   | 12.1660         | 0.3   | 2.1818 | 60      | 200   | 200    |
| 10.7952       | 0.21  | 50   | 11.9291         | 0.305 | 1.9348 | 61      | 200   | 200    |
| 10.7952       | 0.23  | 55   | 11.6945         | 0.31  | 1.6858 | 62      | 200   | 200    |
| 10.3867       | 0.25  | 60   | 11.4608         | 0.315 | 1.5237 | 63      | 200   | 200    |
| 10.3867       | 0.27  | 65   | 11.2313         | 0.315 | 1.3953 | 63      | 200   | 200    |
| 10.252        | 0.3   | 70   | 11.0102         | 0.315 | 1.3162 | 63      | 200   | 200    |
| 10.252        | 0.32  | 75   | 10.7918         | 0.315 | 1.2826 | 63      | 200   | 200    |
| 10.1788       | 0.34  | 80   | 10.5736         | 0.315 | 1.2628 | 63      | 200   | 200    |
| 10.1788       | 0.36  | 85   | 10.3607         | 0.32  | 1.2391 | 64      | 200   | 200    |
| 9.1361        | 0.38  | 90   | 10.1527         | 0.31  | 1.2253 | 62      | 200   | 200    |
| 9.1361        | 0.4   | 95   | 9.9507          | 0.31  | 1.2036 | 62      | 200   | 200    |
| 9.5447        | 0.42  | 100  | 9.7553          | 0.315 | 1.2095 | 63      | 200   | 200    |
| 9.5447        | 0.44  | 105  | 9.5599          | 0.31  | 1.2016 | 62      | 200   | 200    |
| 9.1579        | 0.46  | 110  | 9.3711          | 0.295 | 1.1996 | 59      | 200   | 200    |
| 9.1579        | 0.48  | 115  | 9.1892          | 0.295 | 1.1897 | 59      | 200   | 200    |
| 7.9217        | 0.51  | 120  | 9.0143          | 0.3   | 1.1858 | 60      | 200   | 200    |
| 7.9217        | 0.53  | 125  | 8.8493          | 0.305 | 1.1719 | 61      | 200   | 200    |
| 8.4439        | 0.55  | 130  | 8.6946          | 0.305 | 1.1739 | 61      | 200   | 200    |
| 8.4439        | 0.57  | 135  | 8.5492          | 0.31  | 1.1581 | 62      | 200   | 200    |
| 8.0639        | 0.59  | 140  | 8.4153          | 0.315 | 1.1502 | 63      | 200   | 200    |
| 8.0639        | 0.61  | 145  | 8.2872          | 0.32  | 1.1482 | 64      | 200   | 200    |
| 8.4173        | 0.63  | 150  | 8.1649          | 0.33  | 1.1443 | 66      | 200   | 200    |
| 8.4173        | 0.65  | 155  | 8.0500          | 0.325 | 1.1403 | 65      | 200   | 200    |
| 7.8991        | 0.67  | 160  | 7.9422          | 0.33  | 1.1364 | 66      | 200   | 200    |
| 7.8991        | 0.7   | 165  | 7.8410          | 0.32  | 1.1344 | 64      | 200   | 200    |
| 6.9206        | 0.72  | 170  | 7.7469          | 0.32  | 1.1304 | 64      | 200   | 200    |
| 6.9206        | 0.74  | 175  | 7.6601          | 0.325 | 1.1285 | 65      | 200   | 200    |
| 7.1911        | 0.76  | 180  | 7.5832          | 0.305 | 1.1206 | 61      | 200   | 200    |
| 7.1911        | 0.78  | 185  | 7.5163          | 0.305 | 1.1225 | 61      | 200   | 200    |
| 7.201         | 0.8   | 190  | 7.4565          | 0.305 | 1.1245 | 61      | 200   | 200    |
| 7.201         | 0.82  | 195  | 7.4049          | 0.295 | 1.1245 | 59      | 200   | 200    |
| 7.1507        | 0.84  | 200  | 7.3568          | 0.295 | 1.1225 | 59      | 200   | 200    |
| 7.1507        | 0.86  | 205  | 7.3139          | 0.3   | 1.1206 | 60      | 200   | 200    |
| 6.6223        | 0.89  | 210  | 7.2774          | 0.295 | 1.1186 | 59      | 200   | 200    |
| 6.6223        | 0.91  | 215  | 7.2469          | 0.295 | 1.1186 | 59      | 200   | 200    |
| 7.1645        | 0.93  | 220  | 7.2220          | 0.295 | 1.1166 | 59      | 200   | 200    |
| 7.1645        | 0.95  | 225  | 7.2041          | 0.29  | 1.1146 | 58      | 200   | 200    |
| 6.2562        | 0.97  | 230  | 7.1921          | 0.29  | 1.1146 | 58      | 200   | 200    |
| 6.2562        | 0.99  | 235  | 7.1861          | 0.285 | 1.1126 | 57      | 200   | 200    |
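
In the table above, the Acc column is simply Correct / Total over the 200-utterance evaluation set (for example, 63 / 200 = 0.315), and Wer is word error rate, which can exceed 1.0 when hypotheses contain many insertions. The sketch below shows one way to reproduce these two numbers from decoded predictions, assuming the `jiwer` package for WER; the exact metric code used during training is not shown in this card.

```python
# Illustrative metric computation -- assumes jiwer for WER; the training script's
# actual metric implementation is not reproduced here.
import jiwer

references = ["turn the lamp off", "play music"]        # ground-truth transcripts (toy examples)
predictions = ["turn the lamp of", "play some music"]   # decoded model outputs (toy examples)

correct = sum(ref == hyp for ref, hyp in zip(references, predictions))
total = len(references)
acc = correct / total                       # corresponds to the Correct / Total / Acc columns
wer = jiwer.wer(references, predictions)    # can exceed 1.0 when hypotheses are longer than references

print(f"Acc: {acc:.3f} ({correct}/{total}), WER: {wer:.4f}")
```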

### Framework versions