---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# pure-start-epoch2

This model is a fine-tuned version of [alexziweiwang/pure-start-epoch1](https://huggingface.co/alexziweiwang/pure-start-epoch1) on an unknown dataset. The final logged evaluation (epoch 0.99, step 236) reported the following results on the evaluation set:
- Loss: 7.7447
- Acc: 0.24
- Wer: 1.0
- Correct: 48
- Total: 200
- Strlen: 200

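Since no usage snippet is provided, the sketch below shows one way the checkpoint could be loaded with 🤗 Transformers. This is an assumption rather than the author's documented usage: the repo id `alexziweiwang/pure-start-epoch2` is inferred from the card title and the base model's namespace, and the generic `AutoModel` class is used because the task head is not stated in the card.

```python
# Minimal sketch (assumptions): repo id inferred from the card title; the
# architecture/task head is unknown, so the generic AutoModel class is used.
from transformers import AutoConfig, AutoModel

repo_id = "alexziweiwang/pure-start-epoch2"  # assumed Hub repo id

config = AutoConfig.from_pretrained(repo_id)  # inspect the stored architecture
print(config.architectures)                   # concrete model class(es), if recorded

model = AutoModel.from_pretrained(repo_id)    # loads weights without a task-specific head
model.eval()
```

If the printed architecture turns out to be a task-specific class (the Wer metric hints at a speech or sequence-transcription head), the matching task-specific Auto class should be used instead of `AutoModel`.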
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Acc | Wer | Correct | Total | Strlen |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 0.01 | 2 | 20.4002 | 0.095 | 1.0 | 19 | 200 | 200 |
| No log | 0.02 | 4 | 19.9080 | 0.095 | 1.0 | 19 | 200 | 200 |
| No log | 0.03 | 6 | 19.4711 | 0.095 | 1.0 | 19 | 200 | 200 |
| No log | 0.03 | 8 | 19.1535 | 0.095 | 1.0 | 19 | 200 | 200 |
| 46.6007 | 0.04 | 10 | 18.6684 | 0.095 | 1.0 | 19 | 200 | 200 |
| 46.6007 | 0.05 | 12 | 18.1640 | 0.095 | 1.0 | 19 | 200 | 200 |
| 46.6007 | 0.06 | 14 | 17.6937 | 0.095 | 1.0 | 19 | 200 | 200 |
| 46.6007 | 0.07 | 16 | 17.2710 | 0.095 | 1.0 | 19 | 200 | 200 |
| 46.6007 | 0.08 | 18 | 16.8469 | 0.095 | 1.0 | 19 | 200 | 200 |
| 49.1547 | 0.08 | 20 | 16.4418 | 0.095 | 1.0 | 19 | 200 | 200 |
| 49.1547 | 0.09 | 22 | 16.0409 | 0.095 | 1.0 | 19 | 200 | 200 |
| 49.1547 | 0.1 | 24 | 15.6677 | 0.095 | 1.0 | 19 | 200 | 200 |
| 49.1547 | 0.11 | 26 | 15.3291 | 0.095 | 1.0 | 19 | 200 | 200 |
| 49.1547 | 0.12 | 28 | 15.0097 | 0.095 | 1.0 | 19 | 200 | 200 |
| 35.1416 | 0.13 | 30 | 14.6776 | 0.095 | 1.0 | 19 | 200 | 200 |
| 35.1416 | 0.13 | 32 | 14.3788 | 0.095 | 1.0 | 19 | 200 | 200 |
| 35.1416 | 0.14 | 34 | 14.0924 | 0.095 | 1.0 | 19 | 200 | 200 |
| 35.1416 | 0.15 | 36 | 13.8133 | 0.095 | 1.0 | 19 | 200 | 200 |
| 35.1416 | 0.16 | 38 | 13.5539 | 0.095 | 1.0 | 19 | 200 | 200 |
| 34.4057 | 0.17 | 40 | 13.3095 | 0.095 | 1.0 | 19 | 200 | 200 |
| 34.4057 | 0.18 | 42 | 13.0804 | 0.095 | 1.0 | 19 | 200 | 200 |
| 34.4057 | 0.19 | 44 | 12.8580 | 0.105 | 1.0 | 21 | 200 | 200 |
| 34.4057 | 0.19 | 46 | 12.6532 | 0.115 | 1.0 | 23 | 200 | 200 |
| 34.4057 | 0.2 | 48 | 12.4532 | 0.13 | 1.0 | 26 | 200 | 200 |
| 33.2759 | 0.21 | 50 | 12.2452 | 0.14 | 1.0 | 28 | 200 | 200 |
| 33.2759 | 0.22 | 52 | 12.0666 | 0.13 | 1.0 | 26 | 200 | 200 |
| 33.2759 | 0.23 | 54 | 11.8976 | 0.165 | 1.0 | 33 | 200 | 200 |
| 33.2759 | 0.24 | 56 | 11.7373 | 0.175 | 1.0 | 35 | 200 | 200 |
| 33.2759 | 0.24 | 58 | 11.5933 | 0.17 | 1.0 | 34 | 200 | 200 |
| 29.8129 | 0.25 | 60 | 11.4281 | 0.15 | 1.0 | 30 | 200 | 200 |
| 29.8129 | 0.26 | 62 | 11.2665 | 0.14 | 1.0 | 28 | 200 | 200 |
| 29.8129 | 0.27 | 64 | 11.1158 | 0.145 | 1.0 | 29 | 200 | 200 |
| 29.8129 | 0.28 | 66 | 10.9840 | 0.135 | 1.0 | 27 | 200 | 200 |
| 29.8129 | 0.29 | 68 | 10.8502 | 0.15 | 1.0 | 30 | 200 | 200 |
| 38.792 | 0.3 | 70 | 10.7341 | 0.15 | 1.0 | 30 | 200 | 200 |
| 38.792 | 0.3 | 72 | 10.6082 | 0.165 | 1.0 | 33 | 200 | 200 |
| 38.792 | 0.31 | 74 | 10.4944 | 0.18 | 1.0 | 36 | 200 | 200 |
| 38.792 | 0.32 | 76 | 10.3818 | 0.21 | 1.0 | 42 | 200 | 200 |
| 38.792 | 0.33 | 78 | 10.2719 | 0.235 | 1.0 | 47 | 200 | 200 |
| 28.0092 | 0.34 | 80 | 10.1636 | 0.235 | 1.0 | 47 | 200 | 200 |
| 28.0092 | 0.35 | 82 | 10.0709 | 0.24 | 1.0 | 48 | 200 | 200 |
| 28.0092 | 0.35 | 84 | 9.9797 | 0.24 | 1.0 | 48 | 200 | 200 |
| 28.0092 | 0.36 | 86 | 9.8958 | 0.24 | 1.0 | 48 | 200 | 200 |
| 28.0092 | 0.37 | 88 | 9.7977 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.6175 | 0.38 | 90 | 9.7015 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.6175 | 0.39 | 92 | 9.6150 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.6175 | 0.4 | 94 | 9.5304 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.6175 | 0.4 | 96 | 9.4521 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.6175 | 0.41 | 98 | 9.3832 | 0.24 | 1.0 | 48 | 200 | 200 |
| 26.3434 | 0.42 | 100 | 9.3148 | 0.24 | 1.0 | 48 | 200 | 200 |
| 26.3434 | 0.43 | 102 | 9.2563 | 0.24 | 1.0 | 48 | 200 | 200 |
| 26.3434 | 0.44 | 104 | 9.1944 | 0.24 | 1.0 | 48 | 200 | 200 |
| 26.3434 | 0.45 | 106 | 9.1323 | 0.24 | 1.0 | 48 | 200 | 200 |
| 26.3434 | 0.46 | 108 | 9.0717 | 0.24 | 1.0 | 48 | 200 | 200 |
| 24.4387 | 0.46 | 110 | 9.0245 | 0.24 | 1.0 | 48 | 200 | 200 |
| 24.4387 | 0.47 | 112 | 8.9772 | 0.24 | 1.0 | 48 | 200 | 200 |
| 24.4387 | 0.48 | 114 | 8.9390 | 0.24 | 1.0 | 48 | 200 | 200 |
| 24.4387 | 0.49 | 116 | 8.9013 | 0.24 | 1.0 | 48 | 200 | 200 |
| 24.4387 | 0.5 | 118 | 8.8605 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.7305 | 0.51 | 120 | 8.8126 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.7305 | 0.51 | 122 | 8.7503 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.7305 | 0.52 | 124 | 8.6921 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.7305 | 0.53 | 126 | 8.6378 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.7305 | 0.54 | 128 | 8.5927 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.5989 | 0.55 | 130 | 8.5520 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.5989 | 0.56 | 132 | 8.5126 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.5989 | 0.56 | 134 | 8.4743 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.5989 | 0.57 | 136 | 8.4369 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.5989 | 0.58 | 138 | 8.3993 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.8372 | 0.59 | 140 | 8.3636 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.8372 | 0.6 | 142 | 8.3311 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.8372 | 0.61 | 144 | 8.2983 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.8372 | 0.62 | 146 | 8.2652 | 0.24 | 1.0 | 48 | 200 | 200 |
| 21.8372 | 0.62 | 148 | 8.2345 | 0.24 | 1.0 | 48 | 200 | 200 |
| 20.1716 | 0.63 | 150 | 8.2064 | 0.24 | 1.0 | 48 | 200 | 200 |
| 20.1716 | 0.64 | 152 | 8.1818 | 0.24 | 1.0 | 48 | 200 | 200 |
| 20.1716 | 0.65 | 154 | 8.1603 | 0.24 | 1.0 | 48 | 200 | 200 |
| 20.1716 | 0.66 | 156 | 8.1403 | 0.24 | 1.0 | 48 | 200 | 200 |
| 20.1716 | 0.67 | 158 | 8.1180 | 0.24 | 1.0 | 48 | 200 | 200 |
| 24.5655 | 0.67 | 160 | 8.0997 | 0.24 | 1.0 | 48 | 200 | 200 |
| 24.5655 | 0.68 | 162 | 8.0791 | 0.24 | 1.0 | 48 | 200 | 200 |
| 24.5655 | 0.69 | 164 | 8.0563 | 0.24 | 1.0 | 48 | 200 | 200 |
| 24.5655 | 0.7 | 166 | 8.0342 | 0.24 | 1.0 | 48 | 200 | 200 |
| 24.5655 | 0.71 | 168 | 8.0130 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.3768 | 0.72 | 170 | 7.9936 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.3768 | 0.72 | 172 | 7.9756 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.3768 | 0.73 | 174 | 7.9594 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.3768 | 0.74 | 176 | 7.9439 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.3768 | 0.75 | 178 | 7.9298 | 0.24 | 1.0 | 48 | 200 | 200 |
| 19.7473 | 0.76 | 180 | 7.9157 | 0.24 | 1.0 | 48 | 200 | 200 |
| 19.7473 | 0.77 | 182 | 7.9021 | 0.24 | 1.0 | 48 | 200 | 200 |
| 19.7473 | 0.78 | 184 | 7.8899 | 0.24 | 1.0 | 48 | 200 | 200 |
| 19.7473 | 0.78 | 186 | 7.8796 | 0.24 | 1.0 | 48 | 200 | 200 |
| 19.7473 | 0.79 | 188 | 7.8697 | 0.24 | 1.0 | 48 | 200 | 200 |
| 15.7279 | 0.8 | 190 | 7.8598 | 0.24 | 1.0 | 48 | 200 | 200 |
| 15.7279 | 0.81 | 192 | 7.8490 | 0.24 | 1.0 | 48 | 200 | 200 |
| 15.7279 | 0.82 | 194 | 7.8390 | 0.24 | 1.0 | 48 | 200 | 200 |
| 15.7279 | 0.83 | 196 | 7.8293 | 0.24 | 1.0 | 48 | 200 | 200 |
| 15.7279 | 0.83 | 198 | 7.8211 | 0.24 | 1.0 | 48 | 200 | 200 |
| 18.5034 | 0.84 | 200 | 7.8135 | 0.24 | 1.0 | 48 | 200 | 200 |
| 18.5034 | 0.85 | 202 | 7.8064 | 0.24 | 1.0 | 48 | 200 | 200 |
| 18.5034 | 0.86 | 204 | 7.7991 | 0.24 | 1.0 | 48 | 200 | 200 |
| 18.5034 | 0.87 | 206 | 7.7924 | 0.24 | 1.0 | 48 | 200 | 200 |
| 18.5034 | 0.88 | 208 | 7.7862 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.1983 | 0.89 | 210 | 7.7803 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.1983 | 0.89 | 212 | 7.7749 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.1983 | 0.9 | 214 | 7.7701 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.1983 | 0.91 | 216 | 7.7657 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.1983 | 0.92 | 218 | 7.7628 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.7276 | 0.93 | 220 | 7.7595 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.7276 | 0.94 | 222 | 7.7567 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.7276 | 0.94 | 224 | 7.7541 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.7276 | 0.95 | 226 | 7.7518 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.7276 | 0.96 | 228 | 7.7497 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.8692 | 0.97 | 230 | 7.7479 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.8692 | 0.98 | 232 | 7.7463 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.8692 | 0.99 | 234 | 7.7453 | 0.24 | 1.0 | 48 | 200 | 200 |
| 17.8692 | 0.99 | 236 | 7.7447 | 0.24 | 1.0 | 48 | 200 | 200 |
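In the table, Acc appears to equal Correct / Total (for example, 48 / 200 = 0.24 in the later rows), while Wer stays at 1.0 throughout. The snippet below is a minimal sketch of metrics in that style; the Trainer's actual metric code is not included in this card, so the element-wise accuracy and the `jiwer`-based word error rate here are assumptions.

```python
# Minimal sketch (assumption): table-style metrics where Acc = Correct / Total
# over aligned prediction/reference items and Wer is a word error rate.
from jiwer import wer  # pip install jiwer; assumed WER backend

def compute_metrics(predictions, references):
    correct = sum(p == r for p, r in zip(predictions, references))  # exact matches
    total = len(references)
    return {
        "acc": correct / total,                           # e.g. 48 / 200 = 0.24
        "wer": wer(list(references), list(predictions)),  # reference first, then hypothesis
        "correct": correct,
        "total": total,
    }
```

With 48 exact matches out of 200 items, this yields Acc = 0.24, the plateau seen from step 82 onward.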

### Framework versions