
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# roberta-finetuned-token-reqadjinsiders

This model is a fine-tuned version of [PlanTL-GOB-ES/roberta-base-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne) on an unknown dataset. Its results on the evaluation set are reported per epoch in the training results table below.
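
The per-tag F1 columns in the training results (B-cadj, I-cadj, B-peso, I-peso) indicate a token-classification (sequence-labelling) head on top of the Spanish roberta-base-bne encoder. Below is a minimal usage sketch, not part of the original card; the repository id and the example sentence are assumed placeholders.

```python
# Minimal usage sketch (assumption: the checkpoint is published on the Hugging Face
# Hub; "<user>/roberta-finetuned-token-reqadjinsiders" is a placeholder repo id).
from transformers import pipeline

token_classifier = pipeline(
    "token-classification",
    model="<user>/roberta-finetuned-token-reqadjinsiders",  # hypothetical repo id
    aggregation_strategy="simple",  # merge B-/I- pieces into entity spans
)

# The base model was pretrained on Spanish text, so Spanish input is expected.
print(token_classifier("El sistema debe soportar al menos 500 usuarios concurrentes."))
```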

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
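
The concrete values are not recorded in this card. As a rough, hedged illustration of how such hyperparameters are passed to the `Trainer`, the sketch below uses `TrainingArguments`; only the 200 training epochs (visible in the results table, which logs one evaluation per epoch) are taken from this card, and every other value is a hypothetical placeholder.

```python
# Illustrative sketch only: apart from num_train_epochs=200 (the results table logs
# epochs 1.0 through 200.0), every value here is a hypothetical placeholder, not a
# hyperparameter actually recorded for this model.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-finetuned-token-reqadjinsiders",
    num_train_epochs=200,            # from the results table
    learning_rate=2e-5,              # placeholder
    per_device_train_batch_size=16,  # placeholder
    per_device_eval_batch_size=16,   # placeholder
    evaluation_strategy="epoch",     # one evaluation per epoch, as logged below
)
```

Note that `evaluation_strategy` is renamed to `eval_strategy` in recent transformers releases.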

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 B-cadj | F1 I-cadj | F1 B-peso | F1 I-peso | Macro F1 |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.2188 | 1.0 | 10 | 0.0758 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0711 | 2.0 | 20 | 0.0678 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0656 | 3.0 | 30 | 0.0639 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0634 | 4.0 | 40 | 0.0629 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0624 | 5.0 | 50 | 0.0615 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0602 | 6.0 | 60 | 0.0598 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0573 | 7.0 | 70 | 0.0628 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0537 | 8.0 | 80 | 0.0531 | 0 | 0.1373 | 0 | 0 | 0.0343 |
| 0.0595 | 9.0 | 90 | 0.0565 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0435 | 10.0 | 100 | 0.0659 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0507 | 11.0 | 110 | 0.0558 | 0 | 0.4549 | 0 | 0 | 0.1137 |
| 0.0313 | 12.0 | 120 | 0.0561 | 0 | 0.3955 | 0 | 0 | 0.0989 |
| 0.0278 | 13.0 | 130 | 0.0629 | 0 | 0.3756 | 0 | 0 | 0.0939 |
| 0.0248 | 14.0 | 140 | 0.0634 | 0 | 0.3726 | 0 | 0 | 0.0932 |
| 0.0282 | 15.0 | 150 | 0.0607 | 0 | 0.3303 | 0 | 0 | 0.0826 |
| 0.0302 | 16.0 | 160 | 0.0628 | 0 | 0.4428 | 0 | 0 | 0.1107 |
| 0.0205 | 17.0 | 170 | 0.0551 | 0 | 0.3855 | 0 | 0 | 0.0964 |
| 0.0186 | 18.0 | 180 | 0.0627 | 0 | 0.4419 | 0 | 0 | 0.1105 |
| 0.0171 | 19.0 | 190 | 0.0721 | 0 | 0.3524 | 0 | 0 | 0.0881 |
| 0.0152 | 20.0 | 200 | 0.0574 | 0 | 0.3281 | 0 | 0 | 0.0820 |
| 0.0152 | 21.0 | 210 | 0.0597 | 0 | 0.1515 | 0 | 0 | 0.0379 |
| 0.0157 | 22.0 | 220 | 0.0675 | 0 | 0.3633 | 0 | 0 | 0.0908 |
| 0.0135 | 23.0 | 230 | 0.0728 | 0 | 0.3135 | 0 | 0 | 0.0784 |
| 0.0128 | 24.0 | 240 | 0.0703 | 0 | 0.4114 | 0 | 0 | 0.1028 |
| 0.0126 | 25.0 | 250 | 0.0605 | 0 | 0.3695 | 0 | 0 | 0.0924 |
| 0.0228 | 26.0 | 260 | 0.0490 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0266 | 27.0 | 270 | 0.0819 | 0 | 0.2214 | 0 | 0 | 0.0554 |
| 0.0512 | 28.0 | 280 | 0.0598 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0625 | 29.0 | 290 | 0.0595 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0601 | 30.0 | 300 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 31.0 | 310 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 32.0 | 320 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 33.0 | 330 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0599 | 34.0 | 340 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0601 | 35.0 | 350 | 0.0599 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 36.0 | 360 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 37.0 | 370 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 38.0 | 380 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 39.0 | 390 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 40.0 | 400 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 41.0 | 410 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 42.0 | 420 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 43.0 | 430 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 44.0 | 440 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 45.0 | 450 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 46.0 | 460 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 47.0 | 470 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 48.0 | 480 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 49.0 | 490 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 50.0 | 500 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 51.0 | 510 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 52.0 | 520 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 53.0 | 530 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 54.0 | 540 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 55.0 | 550 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0601 | 56.0 | 560 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 57.0 | 570 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 58.0 | 580 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 59.0 | 590 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 60.0 | 600 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 61.0 | 610 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 62.0 | 620 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 63.0 | 630 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 64.0 | 640 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 65.0 | 650 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0581 | 66.0 | 660 | 0.0580 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0503 | 67.0 | 670 | 0.0608 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0338 | 68.0 | 680 | 0.0653 | 0 | 0.2763 | 0 | 0 | 0.0691 |
| 0.0311 | 69.0 | 690 | 0.0727 | 0 | 0.3019 | 0 | 0 | 0.0755 |
| 0.0305 | 70.0 | 700 | 0.0683 | 0 | 0.3515 | 0 | 0 | 0.0879 |
| 0.0236 | 71.0 | 710 | 0.0757 | 0 | 0.2626 | 0 | 0 | 0.0656 |
| 0.0261 | 72.0 | 720 | 0.0597 | 0 | 0.4734 | 0 | 0 | 0.1183 |
| 0.0242 | 73.0 | 730 | 0.0621 | 0 | 0.4411 | 0 | 0 | 0.1103 |
| 0.0293 | 74.0 | 740 | 0.0722 | 0 | 0.3305 | 0 | 0 | 0.0826 |
| 0.0465 | 75.0 | 750 | 0.0600 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0598 | 76.0 | 760 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 77.0 | 770 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 78.0 | 780 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0602 | 79.0 | 790 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0598 | 80.0 | 800 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 81.0 | 810 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 82.0 | 820 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 83.0 | 830 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 84.0 | 840 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 85.0 | 850 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 86.0 | 860 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 87.0 | 870 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 88.0 | 880 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.06 | 89.0 | 890 | 0.0595 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 90.0 | 900 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 91.0 | 910 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 92.0 | 920 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 93.0 | 930 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 94.0 | 940 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 95.0 | 950 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 96.0 | 960 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 97.0 | 970 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 98.0 | 980 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 99.0 | 990 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 100.0 | 1000 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 101.0 | 1010 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 102.0 | 1020 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 103.0 | 1030 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 104.0 | 1040 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 105.0 | 1050 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 106.0 | 1060 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 107.0 | 1070 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0599 | 108.0 | 1080 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0598 | 109.0 | 1090 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 110.0 | 1100 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 111.0 | 1110 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 112.0 | 1120 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 113.0 | 1130 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 114.0 | 1140 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 115.0 | 1150 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 116.0 | 1160 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 117.0 | 1170 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 118.0 | 1180 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 119.0 | 1190 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 120.0 | 1200 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 121.0 | 1210 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 122.0 | 1220 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 123.0 | 1230 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 124.0 | 1240 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 125.0 | 1250 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 126.0 | 1260 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 127.0 | 1270 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 128.0 | 1280 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 129.0 | 1290 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 130.0 | 1300 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 131.0 | 1310 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 132.0 | 1320 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 133.0 | 1330 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 134.0 | 1340 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 135.0 | 1350 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 136.0 | 1360 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 137.0 | 1370 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 138.0 | 1380 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 139.0 | 1390 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 140.0 | 1400 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 141.0 | 1410 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 142.0 | 1420 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 143.0 | 1430 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 144.0 | 1440 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 145.0 | 1450 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 146.0 | 1460 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 147.0 | 1470 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 148.0 | 1480 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 149.0 | 1490 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 150.0 | 1500 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 151.0 | 1510 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 152.0 | 1520 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 153.0 | 1530 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 154.0 | 1540 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 155.0 | 1550 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 156.0 | 1560 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 157.0 | 1570 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 158.0 | 1580 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 159.0 | 1590 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 160.0 | 1600 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 161.0 | 1610 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 162.0 | 1620 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 163.0 | 1630 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 164.0 | 1640 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 165.0 | 1650 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 166.0 | 1660 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 167.0 | 1670 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 168.0 | 1680 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 169.0 | 1690 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 170.0 | 1700 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 171.0 | 1710 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 172.0 | 1720 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 173.0 | 1730 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 174.0 | 1740 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 175.0 | 1750 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0589 | 176.0 | 1760 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 177.0 | 1770 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 178.0 | 1780 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 179.0 | 1790 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 180.0 | 1800 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 181.0 | 1810 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 182.0 | 1820 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 183.0 | 1830 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 184.0 | 1840 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 185.0 | 1850 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 186.0 | 1860 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 187.0 | 1870 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0598 | 188.0 | 1880 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 189.0 | 1890 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 190.0 | 1900 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 191.0 | 1910 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0599 | 192.0 | 1920 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 193.0 | 1930 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 194.0 | 1940 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 195.0 | 1950 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 196.0 | 1960 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 197.0 | 1970 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0589 | 198.0 | 1980 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 199.0 | 1990 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0589 | 200.0 | 2000 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
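
The Macro F1 column appears to be the unweighted mean of the four per-tag F1 columns; for example, the epoch-11 row can be checked directly:

```python
# Consistency check for the Macro F1 column, using the epoch-11 row above.
per_tag_f1 = [0.0, 0.4549, 0.0, 0.0]  # B-cadj, I-cadj, B-peso, I-peso
macro_f1 = sum(per_tag_f1) / len(per_tag_f1)
print(round(macro_f1, 4))  # 0.1137, matching the reported Macro F1
```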

### Framework versions