<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xlsr-ft-enc-cy
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the CUSTOM_COMMON_VOICE.PY - CY dataset (a custom Common Voice loading script, Welsh configuration). It achieves the following results on the evaluation set (a brief usage sketch follows the results):
- Loss: 0.1049
- Wer: 0.0550
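
A usage example is not yet included in this card; the following is a minimal inference sketch, assuming the model is hosted on the Hugging Face Hub (the namespace in `MODEL_ID` below is a placeholder, not the actual repository path) and that `transformers` and `torch` are installed.

```python
from transformers import pipeline

# Placeholder Hub ID -- replace with the actual repository path of this model.
MODEL_ID = "<namespace>/wav2vec2-xlsr-ft-enc-cy"

# The ASR pipeline wraps feature extraction and CTC decoding for wav2vec2 models.
asr = pipeline("automatic-speech-recognition", model=MODEL_ID)

# Input should be 16 kHz mono speech (the sampling rate XLSR-53 was pretrained on);
# when given a file path, the pipeline decodes it with ffmpeg if available.
result = asr("example_welsh_audio.wav")
print(result["text"])
```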
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 32
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 800
- num_epochs: 30.0
- mixed_precision_training: Native AMP
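
For context, the list above maps roughly onto the following `TrainingArguments`. This is an illustrative reconstruction under the assumption that the standard `Trainer` was used; it is not the exact training script, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the run configuration listed above;
# the original training script may have set additional options.
training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-ft-enc-cy",  # placeholder output path
    learning_rate=3e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=2,   # effective train batch size of 64 on one device
    num_train_epochs=30.0,
    lr_scheduler_type="linear",
    warmup_steps=800,
    seed=42,
    fp16=True,                       # "Native AMP" mixed-precision training
    # Adam betas (0.9, 0.999) and epsilon 1e-8 are the defaults, matching the list above.
)
```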
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
5.4163 | 0.34 | 400 | 1.0371 | 0.7924 |
0.5964 | 0.68 | 800 | 0.3492 | 0.3699 |
0.3473 | 1.01 | 1200 | 0.2285 | 0.2397 |
0.2526 | 1.35 | 1600 | 0.1932 | 0.1882 |
0.2275 | 1.69 | 2000 | 0.1749 | 0.1742 |
0.2028 | 2.03 | 2400 | 0.1674 | 0.1603 |
0.1624 | 2.37 | 2800 | 0.1548 | 0.1544 |
0.1565 | 2.71 | 3200 | 0.1505 | 0.1444 |
0.1487 | 3.04 | 3600 | 0.1565 | 0.1441 |
0.1298 | 3.38 | 4000 | 0.1472 | 0.1453 |
0.1268 | 3.72 | 4400 | 0.1349 | 0.1328 |
0.1197 | 4.06 | 4800 | 0.1344 | 0.1168 |
0.1081 | 4.4 | 5200 | 0.1344 | 0.1234 |
0.1104 | 4.74 | 5600 | 0.1269 | 0.1184 |
0.1046 | 5.07 | 6000 | 0.1317 | 0.1141 |
0.0925 | 5.41 | 6400 | 0.1321 | 0.1176 |
0.0924 | 5.75 | 6800 | 0.1230 | 0.1068 |
0.0889 | 6.09 | 7200 | 0.1322 | 0.1140 |
0.0808 | 6.43 | 7600 | 0.1174 | 0.1062 |
0.0821 | 6.77 | 8000 | 0.1300 | 0.1133 |
0.0788 | 7.1 | 8400 | 0.1148 | 0.0993 |
0.0711 | 7.44 | 8800 | 0.1157 | 0.0986 |
0.0739 | 7.78 | 9200 | 0.1178 | 0.0979 |
0.0683 | 8.12 | 9600 | 0.1175 | 0.0984 |
0.0651 | 8.46 | 10000 | 0.1085 | 0.0968 |
0.0657 | 8.79 | 10400 | 0.1180 | 0.0975 |
0.0638 | 9.13 | 10800 | 0.1164 | 0.1047 |
0.0598 | 9.47 | 11200 | 0.1182 | 0.0969 |
0.0591 | 9.81 | 11600 | 0.1095 | 0.0907 |
0.0579 | 10.15 | 12000 | 0.1155 | 0.0901 |
0.0551 | 10.49 | 12400 | 0.1083 | 0.0906 |
0.0532 | 10.82 | 12800 | 0.1123 | 0.0874 |
0.0508 | 11.16 | 13200 | 0.1137 | 0.0896 |
0.0506 | 11.5 | 13600 | 0.1115 | 0.0853 |
0.0502 | 11.84 | 14000 | 0.1136 | 0.0942 |
0.0478 | 12.18 | 14400 | 0.1076 | 0.0876 |
0.0464 | 12.52 | 14800 | 0.1095 | 0.0839 |
0.0455 | 12.85 | 15200 | 0.1060 | 0.0823 |
0.0427 | 13.19 | 15600 | 0.1114 | 0.0822 |
0.0427 | 13.53 | 16000 | 0.1086 | 0.0814 |
0.0441 | 13.87 | 16400 | 0.1046 | 0.0837 |
0.0417 | 14.21 | 16800 | 0.1197 | 0.0854 |
0.0401 | 14.55 | 17200 | 0.1133 | 0.0835 |
0.0399 | 14.88 | 17600 | 0.1086 | 0.0796 |
0.0376 | 15.22 | 18000 | 0.1090 | 0.0791 |
0.0378 | 15.56 | 18400 | 0.1084 | 0.0808 |
0.0366 | 15.9 | 18800 | 0.1104 | 0.0790 |
0.0361 | 16.24 | 19200 | 0.1130 | 0.0790 |
0.0337 | 16.58 | 19600 | 0.1044 | 0.0779 |
0.0339 | 16.91 | 20000 | 0.1037 | 0.0753 |
0.032 | 17.25 | 20400 | 0.1064 | 0.0765 |
0.0316 | 17.59 | 20800 | 0.1076 | 0.0743 |
0.032 | 17.93 | 21200 | 0.1086 | 0.0752 |
0.0294 | 18.27 | 21600 | 0.1066 | 0.0750 |
0.029 | 18.6 | 22000 | 0.1110 | 0.0723 |
0.03 | 18.94 | 22400 | 0.1061 | 0.0728 |
0.0286 | 19.28 | 22800 | 0.1090 | 0.0705 |
0.0272 | 19.62 | 23200 | 0.1064 | 0.0711 |
0.0275 | 19.96 | 23600 | 0.1055 | 0.0705 |
0.0255 | 20.3 | 24000 | 0.1078 | 0.0707 |
0.0253 | 20.63 | 24400 | 0.1121 | 0.0699 |
0.0249 | 20.97 | 24800 | 0.1060 | 0.0689 |
0.0239 | 21.31 | 25200 | 0.1087 | 0.0688 |
0.024 | 21.65 | 25600 | 0.1056 | 0.0693 |
0.024 | 21.99 | 26000 | 0.1049 | 0.0685 |
0.022 | 22.33 | 26400 | 0.1090 | 0.0666 |
0.021 | 22.66 | 26800 | 0.1049 | 0.0650 |
0.0204 | 23.0 | 27200 | 0.1032 | 0.0655 |
0.0205 | 23.34 | 27600 | 0.1145 | 0.0661 |
0.0203 | 23.68 | 28000 | 0.1121 | 0.0650 |
0.0198 | 24.02 | 28400 | 0.1094 | 0.0644 |
0.0191 | 24.36 | 28800 | 0.1063 | 0.0616 |
0.0191 | 24.69 | 29200 | 0.1080 | 0.0618 |
0.0185 | 25.03 | 29600 | 0.1055 | 0.0607 |
0.0179 | 25.37 | 30000 | 0.1080 | 0.0606 |
0.0176 | 25.71 | 30400 | 0.1054 | 0.0596 |
0.0171 | 26.05 | 30800 | 0.1069 | 0.0610 |
0.0162 | 26.38 | 31200 | 0.1096 | 0.0594 |
0.0158 | 26.72 | 31600 | 0.1050 | 0.0585 |
0.0154 | 27.06 | 32000 | 0.1009 | 0.0571 |
0.015 | 27.4 | 32400 | 0.1047 | 0.0570 |
0.0144 | 27.74 | 32800 | 0.1035 | 0.0567 |
0.0148 | 28.08 | 33200 | 0.1033 | 0.0569 |
0.0143 | 28.41 | 33600 | 0.1044 | 0.0570 |
0.014 | 28.75 | 34000 | 0.1035 | 0.0559 |
0.0133 | 29.09 | 34400 | 0.1050 | 0.0555 |
0.013 | 29.43 | 34800 | 0.1053 | 0.0552 |
0.0129 | 29.77 | 35200 | 0.1048 | 0.0550 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu117
- Datasets 2.11.0
- Tokenizers 0.13.3