DebertaV3ForAIS


Model Description

The model is based on the DeBERTa-v3 architecture, a transformer encoder used here for text classification tasks. It has been fine-tuned on a task-specific dataset; the evaluation metrics below (MSE, RMSE, MAE, R², Cronbach's alpha) indicate that the head predicts continuous scores rather than discrete class labels.
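A minimal inference sketch with the Hugging Face `transformers` library is shown below. The repository id `your-org/DebertaV3ForAIS` is a placeholder, not the model's actual Hub id, and the single-logit regression head is an assumption based on the metrics reported in this card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder repository id -- replace with the model's actual id on the Hub.
MODEL_ID = "your-org/DebertaV3ForAIS"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)

# Tokenize one input and run a forward pass without gradients.
inputs = tokenizer("Example input text.", return_tensors="pt", truncation=True)
with torch.no_grad():
    outputs = model(**inputs)

# Assuming a single-label regression head, the raw logit is the predicted score.
score = outputs.logits.squeeze().item()
print(score)
```

This snippet requires downloading the checkpoint, so treat it as a usage template rather than a runnable test.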

Model Configuration

Model Parameters

Training Details

The model was trained on a task-specific dataset for 38 epochs.
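For readers reproducing a similar fine-tune, the training setup might be expressed with `transformers.TrainingArguments` as below. Apart from the 38 epochs stated above, every hyperparameter here is an assumption for illustration, not the authors' actual configuration.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta-v3-ais",        # placeholder output path
    num_train_epochs=38,                # stated in this card
    per_device_train_batch_size=16,     # assumption
    learning_rate=2e-5,                 # assumption
    evaluation_strategy="epoch",        # assumption
)
```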

Evaluation Results

Metric            Score
MSE               0.0111
RMSE              0.1055
MAE               0.0776
R²                0.6485
Cronbach's alpha  0.8937
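The metrics above can be reproduced from raw predictions with standard formulas. The sketch below uses only the Python standard library; the data passed to it here is synthetic, and Cronbach's alpha is computed from per-item score lists (it measures internal consistency across items, not prediction error).

```python
import math
from statistics import pvariance

def regression_metrics(y_true, y_pred):
    """MSE, RMSE, MAE, and R^2 for paired lists of gold and predicted scores."""
    n = len(y_true)
    errs = [t - p for t, p in zip(y_true, y_pred)]
    mse = sum(e * e for e in errs) / n
    mae = sum(abs(e) for e in errs) / n
    mean_t = sum(y_true) / n
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)  # total sum of squares
    return {
        "MSE": mse,
        "RMSE": math.sqrt(mse),
        "MAE": mae,
        "R2": 1 - (mse * n) / ss_tot,
    }

def cronbach_alpha(item_scores):
    """Cronbach's alpha from a list of per-item score lists
    (one list per item, same respondents in the same order)."""
    k = len(item_scores)
    totals = [sum(vals) for vals in zip(*item_scores)]
    item_var = sum(pvariance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))
```

For example, `regression_metrics([1, 2, 3, 4], [1.1, 1.9, 3.2, 3.8])` yields an MSE of 0.025 and an R² of 0.98 on that toy data.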

Acknowledgments

The base model was pretrained by the authors of DeBERTa-v3 and adapted here for text classification. We thank them for their contributions to the field of NLP, and the Hugging Face team for providing the base DeBERTa-v3 model.

Disclaimer

This model card describes the model's specific configuration and training. Performance may vary with the use case and input data, so evaluate the model in your own context before deploying it to production.