# asd_pron_w2v_acc_balanced_500_79_converse
This model is a fine-tuned version of [slplab/wav2vec2-xls-r-300m_phone-mfa_korean](https://huggingface.co/slplab/wav2vec2-xls-r-300m_phone-mfa_korean) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 5.5991
- Accuracy: 0.4400
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
- mixed_precision_training: Native AMP
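The listed per-device batch size and gradient accumulation combine to give the total train batch size above, and the warmup ratio translates into a concrete number of warmup steps. A minimal sketch of that arithmetic, plus the shape of the `linear` scheduler (warmup then linear decay to zero); the 354 optimizer steps per epoch are taken from the results table below:

```python
# Derived training quantities from the hyperparameters in this model card.

train_batch_size = 16           # per-device batch size
gradient_accumulation_steps = 8
learning_rate = 3e-05
warmup_ratio = 0.1
num_epochs = 30
steps_per_epoch = 354           # optimizer steps per epoch (from the results table)

# Effective (total) train batch size
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)   # 128, matching "total_train_batch_size" above

# Total optimizer steps and linear-warmup length
total_steps = steps_per_epoch * num_epochs       # 10620
warmup_steps = int(warmup_ratio * total_steps)   # 1062

def linear_lr(step: int) -> float:
    """Linear warmup to the peak learning rate, then linear decay to 0
    (the shape of the `linear` scheduler named above)."""
    if step < warmup_steps:
        return learning_rate * step / warmup_steps
    return learning_rate * (total_steps - step) / (total_steps - warmup_steps)

print(linear_lr(warmup_steps))  # peak LR: 3e-05
```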
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
1.0316 | 1.0 | 354 | 1.5232 | 0.4617 |
0.2701 | 2.0 | 708 | 2.5421 | 0.5117 |
0.0942 | 3.0 | 1062 | 3.5480 | 0.4100 |
0.0405 | 4.0 | 1416 | 3.8218 | 0.4550 |
0.025 | 5.0 | 1770 | 4.1046 | 0.4100 |
0.0146 | 6.0 | 2124 | 4.6770 | 0.3917 |
0.0139 | 7.0 | 2478 | 4.4233 | 0.4117 |
0.0106 | 8.0 | 2832 | 4.3989 | 0.4367 |
0.0092 | 9.0 | 3186 | 4.1386 | 0.5050 |
0.0084 | 10.0 | 3540 | 4.8677 | 0.4167 |
0.0068 | 11.0 | 3894 | 4.1889 | 0.4483 |
0.0064 | 12.0 | 4248 | 4.4231 | 0.4567 |
0.0045 | 13.0 | 4602 | 5.2198 | 0.4317 |
0.0049 | 14.0 | 4956 | 4.9798 | 0.4017 |
0.0046 | 15.0 | 5310 | 4.8323 | 0.4700 |
0.0042 | 16.0 | 5664 | 4.6532 | 0.4467 |
0.004 | 17.0 | 6018 | 4.8191 | 0.4350 |
0.0018 | 18.0 | 6372 | 4.7262 | 0.4550 |
0.002 | 19.0 | 6726 | 5.0305 | 0.4633 |
0.0019 | 20.0 | 7080 | 5.1902 | 0.4333 |
0.0018 | 21.0 | 7434 | 5.2230 | 0.4433 |
0.0014 | 22.0 | 7788 | 4.8535 | 0.4783 |
0.0016 | 23.0 | 8142 | 5.2069 | 0.4383 |
0.0012 | 24.0 | 8496 | 5.1960 | 0.4483 |
0.0013 | 25.0 | 8850 | 5.4950 | 0.4483 |
0.0018 | 26.0 | 9204 | 5.4242 | 0.4533 |
0.0012 | 27.0 | 9558 | 5.4133 | 0.4483 |
0.0003 | 28.0 | 9912 | 5.5061 | 0.4567 |
0.0009 | 29.0 | 10266 | 5.6092 | 0.4500 |
0.0005 | 30.0 | 10620 | 5.5991 | 0.4400 |
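Validation accuracy in the table peaks well before the final epoch while validation loss keeps rising, so the last checkpoint is not the strongest one. A small sketch for picking the best epoch by validation accuracy, using a subset of rows copied from the table above:

```python
# (epoch, validation_loss, accuracy) — values copied from the results table.
results = [
    (1, 1.5232, 0.4617),
    (2, 2.5421, 0.5117),
    (9, 4.1386, 0.5050),
    (22, 4.8535, 0.4783),
    (30, 5.5991, 0.4400),
]

# Select the checkpoint with the highest validation accuracy.
best_epoch, best_loss, best_acc = max(results, key=lambda row: row[2])
print(best_epoch, best_acc)  # epoch 2, accuracy 0.5117
```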
### Framework versions
- Transformers 4.13.0
- Pytorch 2.0.0+cu118
- Datasets 2.14.4
- Tokenizers 0.10.3