
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# training

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the cynthiachan/FeedRef_10pct dataset. It achieves the following results on the evaluation set (final in-training evaluation, epoch 3.0 / step 4000; see the full table under Training results):

- Loss: 0.1033
- Overall Precision: 0.8383
- Overall Recall: 0.9201
- Overall F1: 0.8773
- Overall Accuracy: 0.9816
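The card does not state the published repository id for this checkpoint. A minimal inference sketch, assuming the model has been pushed to the Hugging Face Hub (the repo id below is a placeholder, not the actual path):

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Placeholder repo id -- substitute the actual Hub path of this fine-tuned checkpoint.
model_id = "your-username/roberta-base-feedref-ner"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Group sub-word predictions into whole entity spans (filepaths, hashes, IPs, ...).
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(ner("The dropper writes C:\\Windows\\Temp\\payload.exe and beacons to 192.168.10.4."))
```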

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameters used during training are not recorded in this card.
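As a rough sketch only, such a run is typically configured through the `transformers` `TrainingArguments` API. Every value below is an illustrative assumption, not the setting actually used, apart from the 3-epoch length and 500-step evaluation cadence visible in the results table:

```python
from transformers import TrainingArguments

# Illustrative values only -- the hyperparameters actually used for this run are not listed in this card.
training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,               # common starting point for RoBERTa fine-tuning (assumed)
    per_device_train_batch_size=16,   # assumed
    num_train_epochs=3,               # the results table below ends at epoch 3.0
    evaluation_strategy="steps",
    eval_steps=500,                   # matches the 500-step evaluation interval in the table
    weight_decay=0.01,                # assumed
)
```

These arguments would then be passed to a `Trainer` together with the tokenized cynthiachan/FeedRef_10pct splits and a token-classification metrics function.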

### Training results

| Training Loss | Epoch | Step | Validation Loss | Attackid Precision | Attackid Recall | Attackid F1 | Attackid Number | Cve Precision | Cve Recall | Cve F1 | Cve Number | Defenderthreat Precision | Defenderthreat Recall | Defenderthreat F1 | Defenderthreat Number | Domain Precision | Domain Recall | Domain F1 | Domain Number | Email Precision | Email Recall | Email F1 | Email Number | Filepath Precision | Filepath Recall | Filepath F1 | Filepath Number | Hostname Precision | Hostname Recall | Hostname F1 | Hostname Number | Ipv4 Precision | Ipv4 Recall | Ipv4 F1 | Ipv4 Number | Md5 Precision | Md5 Recall | Md5 F1 | Md5 Number | Sha1 Precision | Sha1 Recall | Sha1 F1 | Sha1 Number | Sha256 Precision | Sha256 Recall | Sha256 F1 | Sha256 Number | Uri Precision | Uri Recall | Uri F1 | Uri Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.4353 | 0.37 | 500 | 0.3525 | 0.0 | 0.0 | 0.0 | 6 | 0.0 | 0.0 | 0.0 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 23 | 0.0 | 0.0 | 0.0 | 3 | 0.3984 | 0.6182 | 0.4846 | 165 | 0.0714 | 0.3333 | 0.1176 | 12 | 0.0 | 0.0 | 0.0 | 12 | 0.8936 | 0.8077 | 0.8485 | 52 | 0.0 | 0.0 | 0.0 | 7 | 0.4937 | 0.8864 | 0.6341 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.4156 | 0.5533 | 0.4746 | 0.9459 |
| 0.2089 | 0.75 | 1000 | 0.1812 | 0.0 | 0.0 | 0.0 | 6 | 0.9 | 0.8182 | 0.8571 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.15 | 0.2609 | 0.1905 | 23 | 0.0 | 0.0 | 0.0 | 3 | 0.6432 | 0.7758 | 0.7033 | 165 | 0.0 | 0.0 | 0.0 | 12 | 0.6471 | 0.9167 | 0.7586 | 12 | 0.7143 | 0.8654 | 0.7826 | 52 | 0.0 | 0.0 | 0.0 | 7 | 0.5286 | 0.8409 | 0.6491 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.5315 | 0.6982 | 0.6036 | 0.9626 |
| 0.1453 | 1.12 | 1500 | 0.1374 | 0.75 | 0.5 | 0.6 | 6 | 0.9167 | 1.0 | 0.9565 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.5135 | 0.8261 | 0.6333 | 23 | 0.0 | 0.0 | 0.0 | 3 | 0.6863 | 0.8485 | 0.7588 | 165 | 0.7 | 0.5833 | 0.6364 | 12 | 0.6667 | 0.6667 | 0.6667 | 12 | 0.8167 | 0.9423 | 0.8750 | 52 | 0.0 | 0.0 | 0.0 | 7 | 0.8333 | 0.9091 | 0.8696 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.7048 | 0.8195 | 0.7579 | 0.9745 |
| 0.1277 | 1.5 | 2000 | 0.1400 | 1.0 | 1.0 | 1.0 | 6 | 1.0 | 1.0 | 1.0 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.7273 | 0.6957 | 0.7111 | 23 | 0.2 | 0.3333 | 0.25 | 3 | 0.7181 | 0.8182 | 0.7649 | 165 | 0.9167 | 0.9167 | 0.9167 | 12 | 0.7857 | 0.9167 | 0.8462 | 12 | 0.8167 | 0.9423 | 0.8750 | 52 | 0.0 | 0.0 | 0.0 | 7 | 0.8302 | 1.0 | 0.9072 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.7634 | 0.8402 | 0.8000 | 0.9735 |
| 0.1074 | 1.87 | 2500 | 0.1101 | 1.0 | 1.0 | 1.0 | 6 | 1.0 | 1.0 | 1.0 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.72 | 0.7826 | 0.7500 | 23 | 0.2857 | 0.6667 | 0.4 | 3 | 0.7554 | 0.8424 | 0.7966 | 165 | 0.8571 | 1.0 | 0.9231 | 12 | 0.8182 | 0.75 | 0.7826 | 12 | 0.9259 | 0.9615 | 0.9434 | 52 | 0.0 | 0.0 | 0.0 | 7 | 0.6833 | 0.9318 | 0.7885 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.7660 | 0.8521 | 0.8067 | 0.9762 |
| 0.0758 | 2.25 | 3000 | 0.1161 | 1.0 | 1.0 | 1.0 | 6 | 1.0 | 1.0 | 1.0 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.9091 | 0.8696 | 0.8889 | 23 | 0.5 | 0.6667 | 0.5714 | 3 | 0.8251 | 0.9152 | 0.8678 | 165 | 1.0 | 1.0 | 1.0 | 12 | 1.0 | 0.6667 | 0.8 | 12 | 0.9259 | 0.9615 | 0.9434 | 52 | 1.0 | 0.5714 | 0.7273 | 7 | 0.8958 | 0.9773 | 0.9348 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.8722 | 0.9083 | 0.8899 | 0.9814 |
| 0.064 | 2.62 | 3500 | 0.1275 | 1.0 | 1.0 | 1.0 | 6 | 0.8333 | 0.9091 | 0.8696 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.8947 | 0.7391 | 0.8095 | 23 | 1.0 | 1.0 | 1.0 | 3 | 0.8418 | 0.9030 | 0.8713 | 165 | 0.8571 | 1.0 | 0.9231 | 12 | 1.0 | 0.75 | 0.8571 | 12 | 0.9245 | 0.9423 | 0.9333 | 52 | 0.6667 | 0.5714 | 0.6154 | 7 | 0.8113 | 0.9773 | 0.8866 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.8580 | 0.8935 | 0.8754 | 0.9793 |
| 0.0522 | 3.0 | 4000 | 0.1033 | 1.0 | 1.0 | 1.0 | 6 | 1.0 | 1.0 | 1.0 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.8636 | 0.8261 | 0.8444 | 23 | 1.0 | 1.0 | 1.0 | 3 | 0.8108 | 0.9091 | 0.8571 | 165 | 0.9231 | 1.0 | 0.9600 | 12 | 0.9167 | 0.9167 | 0.9167 | 12 | 0.875 | 0.9423 | 0.9074 | 52 | 0.75 | 0.8571 | 0.8000 | 7 | 0.8 | 1.0 | 0.8889 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.8383 | 0.9201 | 0.8773 | 0.9816 |
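The per-entity Precision/Recall/F1/Number columns and the overall metrics above appear to follow the entity-level output format of seqeval, as used in the Transformers token-classification examples. A minimal sketch of computing such metrics with the `evaluate` library (the tags below are toy IOB2 labels, not the model's full tag set):

```python
import evaluate

seqeval = evaluate.load("seqeval")

# Toy example: one predicted sequence vs. one reference sequence of IOB2 tags.
predictions = [["O", "B-MD5", "I-MD5", "O", "B-FILEPATH"]]
references  = [["O", "B-MD5", "I-MD5", "O", "B-SHA256"]]

results = seqeval.compute(predictions=predictions, references=references)

# Per-entity scores (precision/recall/f1/number) plus overall_* keys, as in the table above.
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```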

### Framework versions