---
tags:
- generated_from_trainer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# comp_4070

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset. It achieves the following results on the evaluation set (final logged evaluation, step 9800):
- Loss: 4.0419
- Cer: 0.9957
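
As a quick, hedged usage sketch: the base model and the CER metric suggest this is a CTC speech-recognition checkpoint, so it can be loaded with `Wav2Vec2ForCTC` as shown below. The repo id, audio path, and the assumption that a processor is bundled with the checkpoint are illustrative, not confirmed by this card.

```python
# Minimal inference sketch. Assumes a CTC speech-recognition checkpoint with a
# bundled Wav2Vec2Processor; the repo id and audio path are placeholders.
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "comp_4070"  # replace with the actual Hub repo id or local path
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-large-xlsr-53 expects 16 kHz mono audio
speech, _ = librosa.load("example.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits  # (batch, time, vocab)

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```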

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
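
The list of values did not survive in this card. As a rough illustration only, such hyperparameters are typically passed to the Hugging Face `Trainer` via `TrainingArguments`; the evaluation cadence (every 100 steps) and the epoch count (~30) match the results table below, while every other value here is a placeholder, not the setting used for this run.

```python
# Illustrative TrainingArguments sketch; NOT the hyperparameters of this run.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="comp_4070",
    num_train_epochs=30,            # consistent with the epoch column below
    evaluation_strategy="steps",    # "eval_strategy" in newer transformers releases
    eval_steps=100,                 # consistent with the step column below
    logging_steps=100,
    per_device_train_batch_size=8,  # placeholder
    learning_rate=3e-4,             # placeholder
    fp16=True,                      # placeholder
)
# These arguments would then be handed to transformers.Trainer together with the
# Wav2Vec2ForCTC model, a CTC padding data collator, and a compute_metrics
# function that reports CER.
```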

### Training results

| Training Loss | Epoch | Step | Validation Loss | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 46.3138       | 0.31  | 100  | 52.4882         | 1.0565 |
| 46.8456       | 0.61  | 200  | 52.3881         | 1.0195 |
| 45.1111       | 0.92  | 300  | 52.2131         | 1.0098 |
| 42.1887       | 1.22  | 400  | 51.9607         | 1.0074 |
| 39.8973       | 1.53  | 500  | 51.6479         | 1.0044 |
| 40.3098       | 1.83  | 600  | 51.2534         | 0.9954 |
| 43.0428       | 2.14  | 700  | 50.7402         | 1.0144 |
| 41.0924       | 2.45  | 800  | 49.0088         | 0.9957 |
| 23.9534       | 2.75  | 900  | 22.1701         | 0.9957 |
| 13.6958       | 3.06  | 1000 | 13.5678         | 0.9957 |
| 10.6346       | 3.36  | 1100 | 10.6896         | 0.9957 |
| 9.2081        | 3.67  | 1200 | 9.2513          | 0.9957 |
| 8.1931        | 3.98  | 1300 | 8.3136          | 0.9957 |
| 7.411         | 4.28  | 1400 | 7.6162          | 0.9957 |
| 7.0001        | 4.59  | 1500 | 7.0569          | 0.9957 |
| 6.4564        | 4.89  | 1600 | 6.6027          | 0.9957 |
| 5.9763        | 5.2   | 1700 | 6.2126          | 0.9957 |
| 5.7513        | 5.5   | 1800 | 5.8822          | 0.9957 |
| 5.3524        | 5.81  | 1900 | 5.5997          | 0.9957 |
| 5.1859        | 6.12  | 2000 | 5.3638          | 0.9957 |
| 5.0409        | 6.42  | 2100 | 5.1582          | 0.9957 |
| 4.8325        | 6.73  | 2200 | 4.9859          | 0.9957 |
| 4.7159        | 7.03  | 2300 | 4.8386          | 0.9957 |
| 4.5141        | 7.34  | 2400 | 4.7151          | 0.9957 |
| 4.5168        | 7.65  | 2500 | 4.6124          | 0.9957 |
| 4.391         | 7.95  | 2600 | 4.5273          | 0.9957 |
| 4.3214        | 8.26  | 2700 | 4.4586          | 0.9957 |
| 4.2589        | 8.56  | 2800 | 4.4045          | 0.9957 |
| 4.2121        | 8.87  | 2900 | 4.3628          | 0.9957 |
| 4.1945        | 9.17  | 3000 | 4.3291          | 0.9957 |
| 4.1998        | 9.48  | 3100 | 4.3051          | 0.9957 |
| 4.1552        | 9.79  | 3200 | 4.2893          | 0.9957 |
| 4.1489        | 10.09 | 3300 | 4.2744          | 0.9957 |
| 4.1448        | 10.4  | 3400 | 4.2620          | 0.9957 |
| 4.1337        | 10.7  | 3500 | 4.2535          | 0.9957 |
| 4.0886        | 11.01 | 3600 | 4.2446          | 0.9957 |
| 4.1142        | 11.31 | 3700 | 4.2376          | 0.9957 |
| 4.0867        | 11.62 | 3800 | 4.2323          | 0.9957 |
| 4.0829        | 11.93 | 3900 | 4.2275          | 0.9957 |
| 4.1394        | 12.23 | 4000 | 4.2231          | 0.9957 |
| 4.1104        | 12.54 | 4100 | 4.2181          | 0.9957 |
| 4.1217        | 12.84 | 4200 | 4.2162          | 0.9957 |
| 4.0601        | 13.15 | 4300 | 4.2118          | 0.9957 |
| 4.0797        | 13.46 | 4400 | 4.2086          | 0.9957 |
| 4.0236        | 13.76 | 4500 | 4.2027          | 0.9957 |
| 4.1162        | 14.07 | 4600 | 4.1970          | 0.9957 |
| 4.1238        | 14.37 | 4700 | 4.1884          | 0.9957 |
| 4.1031        | 14.68 | 4800 | 4.1753          | 0.9957 |
| 4.1089        | 14.98 | 4900 | 4.1638          | 0.9957 |
| 4.02          | 15.29 | 5000 | 4.1517          | 0.9957 |
| 3.9624        | 15.6  | 5100 | 4.1411          | 0.9957 |
| 3.9731        | 15.9  | 5200 | 4.1312          | 0.9957 |
| 4.0816        | 16.21 | 5300 | 4.1222          | 0.9957 |
| 4.0718        | 16.51 | 5400 | 4.1139          | 0.9957 |
| 4.0561        | 16.82 | 5500 | 4.1086          | 0.9957 |
| 3.9548        | 17.13 | 5600 | 4.1050          | 0.9957 |
| 3.972         | 17.43 | 5700 | 4.0987          | 0.9957 |
| 3.9433        | 17.74 | 5800 | 4.0931          | 0.9957 |
| 4.0043        | 18.04 | 5900 | 4.0871          | 0.9957 |
| 4.0505        | 18.35 | 6000 | 4.0863          | 0.9957 |
| 4.0064        | 18.65 | 6100 | 4.0859          | 0.9957 |
| 3.9995        | 18.96 | 6200 | 4.0803          | 0.9957 |
| 3.9386        | 19.27 | 6300 | 4.0712          | 0.9957 |
| 3.9644        | 19.57 | 6400 | 4.0683          | 0.9957 |
| 3.9286        | 19.88 | 6500 | 4.0667          | 0.9957 |
| 3.9873        | 20.18 | 6600 | 4.0676          | 0.9957 |
| 3.9687        | 20.49 | 6700 | 4.0658          | 0.9957 |
| 3.9949        | 20.8  | 6800 | 4.0623          | 0.9957 |
| 3.9831        | 21.1  | 6900 | 4.0570          | 0.9957 |
| 3.9596        | 21.41 | 7000 | 4.0561          | 0.9957 |
| 3.9585        | 21.71 | 7100 | 4.0533          | 0.9957 |
| 3.97          | 22.02 | 7200 | 4.0537          | 0.9957 |
| 3.9665        | 22.32 | 7300 | 4.0550          | 0.9957 |
| 3.9263        | 22.63 | 7400 | 4.0547          | 0.9957 |
| 3.9257        | 22.94 | 7500 | 4.0538          | 0.9957 |
| 3.9945        | 23.24 | 7600 | 4.0493          | 0.9957 |
| 3.9555        | 23.55 | 7700 | 4.0479          | 0.9957 |
| 3.9997        | 23.85 | 7800 | 4.0472          | 0.9957 |
| 3.8985        | 24.16 | 7900 | 4.0477          | 0.9957 |
| 3.9051        | 24.46 | 8000 | 4.0449          | 0.9957 |
| 3.8753        | 24.77 | 8100 | 4.0449          | 0.9957 |
| 4.0476        | 25.08 | 8200 | 4.0437          | 0.9957 |
| 4.0331        | 25.38 | 8300 | 4.0447          | 0.9957 |
| 4.0025        | 25.69 | 8400 | 4.0442          | 0.9957 |
| 3.9211        | 25.99 | 8500 | 4.0440          | 0.9957 |
| 3.8675        | 26.3  | 8600 | 4.0429          | 0.9957 |
| 3.9078        | 26.61 | 8700 | 4.0447          | 0.9957 |
| 3.8914        | 26.91 | 8800 | 4.0428          | 0.9957 |
| 4.0395        | 27.22 | 8900 | 4.0421          | 0.9957 |
| 4.006         | 27.52 | 9000 | 4.0425          | 0.9957 |
| 4.0034        | 27.83 | 9100 | 4.0417          | 0.9957 |
| 3.89          | 28.13 | 9200 | 4.0419          | 0.9957 |
| 3.8887        | 28.44 | 9300 | 4.0423          | 0.9957 |
| 3.8958        | 28.75 | 9400 | 4.0422          | 0.9957 |
| 3.9894        | 29.05 | 9500 | 4.0419          | 0.9957 |
| 4.0323        | 29.36 | 9600 | 4.0419          | 0.9957 |
| 3.9905        | 29.66 | 9700 | 4.0419          | 0.9957 |
| 3.9555        | 29.97 | 9800 | 4.0419          | 0.9957 |
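
The Cer column is the character error rate: the character-level edit distance between the decoded predictions and the references, divided by the number of reference characters, so values near 1.0 mean almost every reference character would need to be changed. A minimal sketch of computing it with the `evaluate` library (the example strings are illustrative):

```python
# Character error rate = character-level edit distance / reference length.
# Requires the `evaluate` and `jiwer` packages.
import evaluate

cer_metric = evaluate.load("cer")

predictions = ["hallo wereld"]  # decoded transcription (illustrative)
references = ["hello world"]    # ground-truth transcription (illustrative)

cer = cer_metric.compute(predictions=predictions, references=references)
print(f"CER: {cer:.4f}")
```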

### Framework versions