# akhisreelibra/bert-malayalam-pos-tagger
This model is a fine-tuned version of [bert-base-multilingual-uncased](https://huggingface.co/bert-base-multilingual-uncased) on a tagged Malayalam sentence dataset. It achieves the following results on the evaluation set:
- Loss: 0.4383
- Precision: 0.7380
- Recall: 0.7767
- F1: 0.7569
- Accuracy: 0.8552
## Model description
More information needed
Intended uses & limitations
More information needed
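
While full usage notes are pending, the checkpoint can be tried out with the Transformers token-classification pipeline. The sketch below is a minimal illustration, not the author's documented usage: the example sentence is arbitrary, and `aggregation_strategy="simple"` is an assumption made to get readable word-level output.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as a token-classification (POS tagging) pipeline.
tagger = pipeline(
    "token-classification",
    model="akhisreelibra/bert-malayalam-pos-tagger",
    aggregation_strategy="simple",  # assumption: merge word pieces into whole words
)

# Arbitrary Malayalam example sentence ("I know Malayalam").
for token in tagger("എനിക്ക് മലയാളം അറിയാം"):
    print(token["word"], token["entity_group"], round(token["score"], 3))
```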
## Training and evaluation data

The training data uses the following tagset:
| Tag abbreviation | Tag name |
|---|---|
| `CC_CCD` | Coordinator |
| `CC_CCS` | Subordinator |
| `CC_CCS_UT` | Quotative |
| `DM_DMD` | Deictic demonstrative |
| `DM_DMQ` | Wh-word (demonstrative) |
| `DM_DMR` | Relative demonstrative |
| `JJ` | Adjective |
| `N_NN` | Common noun |
| `N_NNP` | Proper noun |
| `N_NST` | Locative noun |
| `PR_PRC` | Reciprocal pronoun |
| `PR_PRF` | Reflexive pronoun |
| `PR_PRL` | Relative pronoun |
| `PR_PRP` | Personal pronoun |
| `PR_PRQ` | Wh-word (pronoun) |
| `PSP` | Postposition |
| `QT_QTC` | Cardinal |
| `QT_QTF` | General quantifier |
| `QT_QTO` | Ordinal |
| `RB` | Adverb |
| `RD_ECH` | Echo word |
| `RD_RDF` | Foreign word |
| `RD_SYM` | Symbol |
| `RD_UNK` | Unknown |
| `RP_CL` | Classifier particle |
| `RP_INJ` | Interjection particle |
| `RP_INTF` | Intensifier particle |
| `RP_NEG` | Negation particle |
| `RP_RPD` | Default particle |
| `V_VAUX` | Auxiliary verb |
| `V_VM` | Main verb |
| `V_VM_VF` | Finite verb |
| `V_VM_VINF` | Infinitive verb |
| `V_VM_VNF` | Non-finite verb |
| `V_VN` | Verbal noun |
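
The same label inventory should be recoverable directly from the uploaded checkpoint; a quick hedged check, assuming the labels were stored in the config's `id2label` map as is standard for token-classification models:

```python
from transformers import AutoConfig

# Print the id-to-label mapping stored with the checkpoint.
config = AutoConfig.from_pretrained("akhisreelibra/bert-malayalam-pos-tagger")
print(config.id2label)
```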
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
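
For reference, the list above maps onto Hugging Face `TrainingArguments` roughly as sketched below. This is a hedged reconstruction, not the original training script: `output_dir` is a placeholder, and the Adam betas and epsilon match the library defaults, so they are left implicit.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the reported hyperparameters.
training_args = TrainingArguments(
    output_dir="bert-malayalam-pos-tagger",  # placeholder, not the original path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",  # linear decay schedule
    num_train_epochs=3,
    # Optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-8, the defaults.
)
```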
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.518 | 1.0 | 2692 | 0.4987 | 0.7124 | 0.7415 | 0.7267 | 0.8374 |
| 0.4415 | 2.0 | 5384 | 0.4515 | 0.7221 | 0.7679 | 0.7443 | 0.8481 |
| 0.3645 | 3.0 | 8076 | 0.4383 | 0.7380 | 0.7767 | 0.7569 | 0.8552 |
### Framework versions
- Transformers 4.18.0
- Pytorch 1.11.0
- Datasets 2.1.0
- Tokenizers 0.12.1