Tags: text-classification, emotion, pytorch

This is nateraw/bert-base-uncased-emotion converted to ONNX and quantized using optimum.


bert-base-uncased-emotion

Model description

bert-base-uncased fine-tuned on the emotion dataset using PyTorch Lightning. Hyperparameters: sequence length 128, learning rate 2e-5, batch size 32, 2 GPUs, 4 epochs.
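The fine-tuning setup above can be sketched as a plain-PyTorch loop with the card's hyperparameters (learning rate 2e-5, batch size 32, 4 epochs); Lightning wraps a loop of this shape. TinyEncoder is a hypothetical stand-in for bert-base-uncased, and the random batches stand in for tokenized emotion-dataset examples (6 labels).

```python
import torch
import torch.nn as nn

SEQ_LEN, HIDDEN, NUM_LABELS = 128, 32, 6  # emotion has 6 classes

class TinyEncoder(nn.Module):
    """Hypothetical stand-in for bert-base-uncased plus a classification head."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(1000, HIDDEN)
        self.head = nn.Linear(HIDDEN, NUM_LABELS)

    def forward(self, input_ids):
        # Mean-pool token embeddings, then classify.
        return self.head(self.embed(input_ids).mean(dim=1))

model = TinyEncoder()
opt = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(4):  # 4 epochs, as in the card
    input_ids = torch.randint(0, 1000, (32, SEQ_LEN))  # one random batch of 32
    labels = torch.randint(0, NUM_LABELS, (32,))
    opt.zero_grad()
    loss = loss_fn(model(input_ids), labels)
    loss.backward()
    opt.step()
```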

For more details, please see the emotion dataset on nlp viewer.

Limitations and bias

Training data

The data came from HuggingFace's datasets package and can be viewed on nlp viewer.

Training procedure

...

Eval results

val_acc - 0.931 (accuracy alone is of limited use; precision/recall/F1 would be more informative)

The score was calculated using PyTorch Lightning metrics.
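As the note above says, precision/recall/F1 would be more informative than accuracy alone. A plain-Python sketch of macro-averaged precision, recall, and F1 from predictions (the function name and toy labels below are illustrative; libraries such as scikit-learn and torchmetrics provide equivalents):

```python
from collections import Counter

def macro_prf1(y_true, y_pred, num_classes):
    """Macro-averaged precision, recall, and F1 over integer class labels."""
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1  # predicted p, but true class was t
            fn[t] += 1
    precisions, recalls, f1s = [], [], []
    for c in range(num_classes):
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(f1)
    n = num_classes
    return sum(precisions) / n, sum(recalls) / n, sum(f1s) / n
```

Macro averaging weights every class equally, which matters for the emotion dataset since its 6 classes are imbalanced.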