
# detect-femicide-news-xlmr-nl-fft-freeze2

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unknown dataset. It achieves the following results on the evaluation set (values from the final epoch of the training-results table below):

- Loss: 0.4119
- Accuracy: 0.8571
- Precision Neg: 0.85
- Precision Pos: 0.875
- Recall Neg: 0.9444
- Recall Pos: 0.7
- F1 Score Neg: 0.8947
- F1 Score Pos: 0.7778

## Model description

More information needed

## Intended uses & limitations

More information needed
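
No usage guidance is documented, so the snippet below is only a minimal sketch of how a checkpoint like this could be loaded for inference with the `transformers` library. The repository id, the label mapping, and the example headline are placeholders, not documented properties of this model.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder id: replace with the actual Hub path or a local directory
# that contains the fine-tuned weights.
model_id = "detect-femicide-news-xlmr-nl-fft-freeze2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical Dutch news headline; the expected input format is not documented.
text = "Vrouw om het leven gebracht door ex-partner, politie doet onderzoek."

inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# The per-class metrics in this card suggest a binary negative/positive label
# scheme, but the actual id2label mapping is not recorded here.
probs = torch.softmax(logits, dim=-1).squeeze()
print({i: round(p.item(), 3) for i, p in enumerate(probs)})
```

The `Auto*` classes resolve to the XLM-RoBERTa tokenizer and sequence-classification head automatically, so no model-specific imports are needed.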

## Training and evaluation data

More information needed

## Training procedure
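
The training procedure itself is not documented. The `fft-freeze2` suffix in the checkpoint name hints at full fine-tuning with some lower layers frozen; purely as an illustration of that reading (an assumption, not a documented detail), freezing the embeddings and the first two encoder layers of `xlm-roberta-base` could look like this:

```python
from transformers import AutoModelForSequenceClassification

# Start from the base checkpoint with a two-class classification head.
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=2
)

# Assumed reading of "freeze2": keep the embeddings and the first two encoder
# layers fixed and fine-tune everything else. Illustrative only.
for param in model.roberta.embeddings.parameters():
    param.requires_grad = False
for layer in model.roberta.encoder.layer[:2]:
    for param in layer.parameters():
        param.requires_grad = False

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"trainable parameters: {trainable}")
```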

### Training hyperparameters

The hyperparameter values used during training were not captured in this card (more information needed).

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision Neg | Precision Pos | Recall Neg | Recall Pos | F1 Score Neg | F1 Score Pos |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:-------------:|:----------:|:----------:|:------------:|:------------:|
| 1.3215 | 1.0 | 23 | 1.0782 | 0.75 | 0.7391 | 0.8 | 0.9444 | 0.4 | 0.8293 | 0.5333 |
| 1.0955 | 2.0 | 46 | 0.9057 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.9344 | 3.0 | 69 | 0.7420 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.8303 | 4.0 | 92 | 0.5952 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.6728 | 5.0 | 115 | 0.5078 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.649 | 6.0 | 138 | 0.4546 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.6008 | 7.0 | 161 | 0.4454 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5439 | 8.0 | 184 | 0.4495 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5557 | 9.0 | 207 | 0.4479 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5637 | 10.0 | 230 | 0.4470 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5709 | 11.0 | 253 | 0.4500 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5496 | 12.0 | 276 | 0.4456 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5586 | 13.0 | 299 | 0.4484 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.562 | 14.0 | 322 | 0.4435 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5555 | 15.0 | 345 | 0.4427 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5449 | 16.0 | 368 | 0.4404 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.555 | 17.0 | 391 | 0.4384 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5541 | 18.0 | 414 | 0.4383 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5463 | 19.0 | 437 | 0.4379 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5548 | 20.0 | 460 | 0.4357 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5365 | 21.0 | 483 | 0.4342 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5473 | 22.0 | 506 | 0.4308 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5467 | 23.0 | 529 | 0.4309 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.543 | 24.0 | 552 | 0.4312 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.543 | 25.0 | 575 | 0.4289 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5309 | 26.0 | 598 | 0.4290 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5406 | 27.0 | 621 | 0.4246 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5295 | 28.0 | 644 | 0.4248 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.535 | 29.0 | 667 | 0.4247 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5401 | 30.0 | 690 | 0.4265 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5285 | 31.0 | 713 | 0.4262 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5492 | 32.0 | 736 | 0.4247 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5473 | 33.0 | 759 | 0.4224 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.547 | 34.0 | 782 | 0.4250 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5394 | 35.0 | 805 | 0.4280 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5361 | 36.0 | 828 | 0.4247 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5294 | 37.0 | 851 | 0.4238 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5302 | 38.0 | 874 | 0.4236 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5384 | 39.0 | 897 | 0.4215 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5396 | 40.0 | 920 | 0.4209 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5305 | 41.0 | 943 | 0.4192 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5241 | 42.0 | 966 | 0.4204 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5433 | 43.0 | 989 | 0.4190 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5246 | 44.0 | 1012 | 0.4169 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.525 | 45.0 | 1035 | 0.4177 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5306 | 46.0 | 1058 | 0.4169 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5228 | 47.0 | 1081 | 0.4167 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5094 | 48.0 | 1104 | 0.4176 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5207 | 49.0 | 1127 | 0.4170 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5087 | 50.0 | 1150 | 0.4169 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5229 | 51.0 | 1173 | 0.4163 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5221 | 52.0 | 1196 | 0.4160 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5147 | 53.0 | 1219 | 0.4166 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.524 | 54.0 | 1242 | 0.4157 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5171 | 55.0 | 1265 | 0.4149 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5116 | 56.0 | 1288 | 0.4138 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5373 | 57.0 | 1311 | 0.4139 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5274 | 58.0 | 1334 | 0.4131 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5375 | 59.0 | 1357 | 0.4133 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.528 | 60.0 | 1380 | 0.4136 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5282 | 61.0 | 1403 | 0.4147 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.528 | 62.0 | 1426 | 0.4142 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5357 | 63.0 | 1449 | 0.4132 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5177 | 64.0 | 1472 | 0.4132 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5358 | 65.0 | 1495 | 0.4133 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5224 | 66.0 | 1518 | 0.4124 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5121 | 67.0 | 1541 | 0.4125 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5394 | 68.0 | 1564 | 0.4137 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.52 | 69.0 | 1587 | 0.4140 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5103 | 70.0 | 1610 | 0.4131 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5224 | 71.0 | 1633 | 0.4134 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5351 | 72.0 | 1656 | 0.4129 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5181 | 73.0 | 1679 | 0.4138 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.532 | 74.0 | 1702 | 0.4139 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5216 | 75.0 | 1725 | 0.4142 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5083 | 76.0 | 1748 | 0.4138 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.531 | 77.0 | 1771 | 0.4132 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5245 | 78.0 | 1794 | 0.4125 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5191 | 79.0 | 1817 | 0.4127 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.516 | 80.0 | 1840 | 0.4126 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5098 | 81.0 | 1863 | 0.4128 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5173 | 82.0 | 1886 | 0.4127 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5119 | 83.0 | 1909 | 0.4129 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5296 | 84.0 | 1932 | 0.4125 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5105 | 85.0 | 1955 | 0.4131 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5108 | 86.0 | 1978 | 0.4124 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5156 | 87.0 | 2001 | 0.4125 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5143 | 88.0 | 2024 | 0.4124 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5099 | 89.0 | 2047 | 0.4122 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5163 | 90.0 | 2070 | 0.4120 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5224 | 91.0 | 2093 | 0.4118 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.4936 | 92.0 | 2116 | 0.4120 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5236 | 93.0 | 2139 | 0.4118 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5261 | 94.0 | 2162 | 0.4118 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5134 | 95.0 | 2185 | 0.4119 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5064 | 96.0 | 2208 | 0.4118 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5072 | 97.0 | 2231 | 0.4118 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5264 | 98.0 | 2254 | 0.4118 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5344 | 99.0 | 2277 | 0.4118 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.522 | 100.0 | 2300 | 0.4119 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
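
How the per-class metrics above were computed is not documented. A plausible `compute_metrics` function that would produce this breakdown with scikit-learn is sketched below; the label order (0 = negative class, 1 = positive class) and the use of scikit-learn are assumptions, not details recorded in this card.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Hugging Face Trainer passes in.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)

    # Per-class scores; assumes label 0 = negative class, 1 = positive class.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, labels=[0, 1], zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision_neg": precision[0],
        "precision_pos": precision[1],
        "recall_neg": recall[0],
        "recall_pos": recall[1],
        "f1_neg": f1[0],
        "f1_pos": f1[1],
    }
```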

### Framework versions