---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: multiberts-seed_0_winobias_classifieronly
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# multiberts-seed_0_winobias_classifieronly

This model is a fine-tuned version of [google/multiberts-seed_0](https://huggingface.co/google/multiberts-seed_0) on a dataset not recorded in this card's metadata (the model name suggests WinoBias).
It achieves the following results on the evaluation set (final logged evaluation, step 1240):
- Loss: 0.6937
- Accuracy: 0.5088
- TP: 0.3011
- TN: 0.2077
- FP: 0.2923
- FN: 0.1989
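
A minimal usage sketch, assuming the checkpoint is hosted on the Hugging Face Hub and exposes a binary sequence-classification head (the TP/TN/FP/FN metrics below suggest two labels); the repo id is a placeholder:

```python
# Minimal inference sketch. REPO_ID is a placeholder -- substitute the actual
# Hub path of this checkpoint. A binary sequence-classification head is assumed.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

REPO_ID = "your-namespace/multiberts-seed_0_winobias_classifieronly"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
model = AutoModelForSequenceClassification.from_pretrained(REPO_ID)
model.eval()

text = "The developer argued with the designer because she did not like the design."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)
```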

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
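
The model name points to WinoBias. If the data came from the `wino_bias` dataset on the Hugging Face Hub, it can be loaded as below; the configuration name is an assumption, since the card does not record which WinoBias type or split was used:

```python
# Hypothetical data loading: the card does not say which WinoBias configuration
# was used; "type1_pro" is one of the dataset's configs.
from datasets import load_dataset

ds = load_dataset("wino_bias", "type1_pro")
print(ds)
```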

## Training procedure

### Training hyperparameters

The hyperparameter list used during training was not recorded in this card.
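
The `classifieronly` suffix suggests the MultiBERTs encoder was frozen and only the classification head was updated. A minimal sketch of that setup, with illustrative placeholder values where the run's settings are unknown (the 50 epochs and 20-step eval interval are read off the results table below):

```python
# Sketch of classifier-only fine-tuning: freeze the encoder, train only the head.
# Values below are placeholders/inferences, not the run's recorded settings.
from transformers import (AutoModelForSequenceClassification, Trainer,
                          TrainingArguments)

model = AutoModelForSequenceClassification.from_pretrained(
    "google/multiberts-seed_0", num_labels=2
)

# Freeze everything except the classification head (whether the pooler was also
# left trainable in the original run is not recorded).
for name, param in model.named_parameters():
    param.requires_grad = name.startswith("classifier")

args = TrainingArguments(
    output_dir="multiberts-seed_0_winobias_classifieronly",
    num_train_epochs=50,            # table logs evals up to epoch 49.6
    evaluation_strategy="steps",
    eval_steps=20,                  # table shows an evaluation every 20 steps
    logging_steps=20,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # data not recorded
# trainer.train()
```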

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | TP     | TN     | FP     | FN     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:------:|:------:|
| 0.6992        | 0.8   | 20   | 0.6945          | 0.4943   | 0.2285 | 0.2658 | 0.2342 | 0.2715 |
| 0.7005        | 1.6   | 40   | 0.6944          | 0.4937   | 0.2702 | 0.2235 | 0.2765 | 0.2298 |
| 0.7025        | 2.4   | 60   | 0.6958          | 0.5032   | 0.4457 | 0.0574 | 0.4426 | 0.0543 |
| 0.6937        | 3.2   | 80   | 0.6943          | 0.5044   | 0.3403 | 0.1641 | 0.3359 | 0.1597 |
| 0.6935        | 4.0   | 100  | 0.6942          | 0.4943   | 0.2847 | 0.2096 | 0.2904 | 0.2153 |
| 0.6995        | 4.8   | 120  | 0.6942          | 0.4962   | 0.2702 | 0.2260 | 0.2740 | 0.2298 |
| 0.7035        | 5.6   | 140  | 0.6955          | 0.5      | 0.4324 | 0.0676 | 0.4324 | 0.0676 |
| 0.7001        | 6.4   | 160  | 0.6941          | 0.5      | 0.2437 | 0.2563 | 0.2437 | 0.2563 |
| 0.6967        | 7.2   | 180  | 0.6941          | 0.5032   | 0.2418 | 0.2614 | 0.2386 | 0.2582 |
| 0.6934        | 8.0   | 200  | 0.6943          | 0.4987   | 0.3472 | 0.1515 | 0.3485 | 0.1528 |
| 0.6963        | 8.8   | 220  | 0.6941          | 0.4956   | 0.3112 | 0.1843 | 0.3157 | 0.1888 |
| 0.6962        | 9.6   | 240  | 0.6940          | 0.5006   | 0.3112 | 0.1894 | 0.3106 | 0.1888 |
| 0.6955        | 10.4  | 260  | 0.6944          | 0.4949   | 0.3580 | 0.1370 | 0.3630 | 0.1420 |
| 0.7032        | 11.2  | 280  | 0.6941          | 0.4912   | 0.2683 | 0.2229 | 0.2771 | 0.2317 |
| 0.6996        | 12.0  | 300  | 0.6943          | 0.4949   | 0.3580 | 0.1370 | 0.3630 | 0.1420 |
| 0.6995        | 12.8  | 320  | 0.6940          | 0.5032   | 0.3188 | 0.1843 | 0.3157 | 0.1812 |
| 0.6985        | 13.6  | 340  | 0.6939          | 0.5057   | 0.2330 | 0.2727 | 0.2273 | 0.2670 |
| 0.696         | 14.4  | 360  | 0.6947          | 0.5038   | 0.4211 | 0.0827 | 0.4173 | 0.0789 |
| 0.6988        | 15.2  | 380  | 0.6941          | 0.4937   | 0.2854 | 0.2083 | 0.2917 | 0.2146 |
| 0.6998        | 16.0  | 400  | 0.6942          | 0.5044   | 0.2191 | 0.2854 | 0.2146 | 0.2809 |
| 0.6971        | 16.8  | 420  | 0.6940          | 0.5063   | 0.2323 | 0.2740 | 0.2260 | 0.2677 |
| 0.704         | 17.6  | 440  | 0.6939          | 0.4937   | 0.2664 | 0.2273 | 0.2727 | 0.2336 |
| 0.695         | 18.4  | 460  | 0.6940          | 0.4956   | 0.2797 | 0.2159 | 0.2841 | 0.2203 |
| 0.6992        | 19.2  | 480  | 0.6940          | 0.4949   | 0.2872 | 0.2077 | 0.2923 | 0.2128 |
| 0.6869        | 20.0  | 500  | 0.6940          | 0.4880   | 0.3150 | 0.1730 | 0.3270 | 0.1850 |
| 0.7022        | 20.8  | 520  | 0.6939          | 0.4968   | 0.3239 | 0.1730 | 0.3270 | 0.1761 |
| 0.6961        | 21.6  | 540  | 0.6942          | 0.5019   | 0.3857 | 0.1162 | 0.3838 | 0.1143 |
| 0.7024        | 22.4  | 560  | 0.6940          | 0.4962   | 0.3453 | 0.1509 | 0.3491 | 0.1547 |
| 0.6995        | 23.2  | 580  | 0.6940          | 0.5019   | 0.3542 | 0.1477 | 0.3523 | 0.1458 |
| 0.7           | 24.0  | 600  | 0.6939          | 0.5019   | 0.3131 | 0.1888 | 0.3112 | 0.1869 |
| 0.7035        | 24.8  | 620  | 0.6939          | 0.5      | 0.3396 | 0.1604 | 0.3396 | 0.1604 |
| 0.6978        | 25.6  | 640  | 0.6938          | 0.5133   | 0.2064 | 0.3068 | 0.1932 | 0.2936 |
| 0.7032        | 26.4  | 660  | 0.6938          | 0.4981   | 0.2399 | 0.2582 | 0.2418 | 0.2601 |
| 0.6898        | 27.2  | 680  | 0.6938          | 0.4956   | 0.2449 | 0.2506 | 0.2494 | 0.2551 |
| 0.6923        | 28.0  | 700  | 0.6939          | 0.4924   | 0.2929 | 0.1995 | 0.3005 | 0.2071 |
| 0.7025        | 28.8  | 720  | 0.6938          | 0.4968   | 0.2809 | 0.2159 | 0.2841 | 0.2191 |
| 0.6999        | 29.6  | 740  | 0.6943          | 0.5025   | 0.3838 | 0.1187 | 0.3813 | 0.1162 |
| 0.6993        | 30.4  | 760  | 0.6940          | 0.5006   | 0.3573 | 0.1433 | 0.3567 | 0.1427 |
| 0.7013        | 31.2  | 780  | 0.6939          | 0.4968   | 0.3333 | 0.1635 | 0.3365 | 0.1667 |
| 0.6985        | 32.0  | 800  | 0.6939          | 0.4981   | 0.3460 | 0.1521 | 0.3479 | 0.1540 |
| 0.6957        | 32.8  | 820  | 0.6938          | 0.5076   | 0.3277 | 0.1799 | 0.3201 | 0.1723 |
| 0.6978        | 33.6  | 840  | 0.6937          | 0.5107   | 0.2475 | 0.2633 | 0.2367 | 0.2525 |
| 0.6953        | 34.4  | 860  | 0.6938          | 0.5133   | 0.3201 | 0.1932 | 0.3068 | 0.1799 |
| 0.6923        | 35.2  | 880  | 0.6939          | 0.5      | 0.3485 | 0.1515 | 0.3485 | 0.1515 |
| 0.6947        | 36.0  | 900  | 0.6939          | 0.4987   | 0.3491 | 0.1496 | 0.3504 | 0.1509 |
| 0.7018        | 36.8  | 920  | 0.6937          | 0.5095   | 0.3018 | 0.2077 | 0.2923 | 0.1982 |
| 0.6957        | 37.6  | 940  | 0.6939          | 0.4975   | 0.3403 | 0.1572 | 0.3428 | 0.1597 |
| 0.698         | 38.4  | 960  | 0.6939          | 0.4981   | 0.3415 | 0.1566 | 0.3434 | 0.1585 |
| 0.6902        | 39.2  | 980  | 0.6938          | 0.4968   | 0.3289 | 0.1679 | 0.3321 | 0.1711 |
| 0.6973        | 40.0  | 1000 | 0.6938          | 0.4956   | 0.3346 | 0.1610 | 0.3390 | 0.1654 |
| 0.6962        | 40.8  | 1020 | 0.6938          | 0.5025   | 0.3194 | 0.1831 | 0.3169 | 0.1806 |
| 0.699         | 41.6  | 1040 | 0.6937          | 0.5019   | 0.3093 | 0.1926 | 0.3074 | 0.1907 |
| 0.6965        | 42.4  | 1060 | 0.6937          | 0.5013   | 0.2816 | 0.2197 | 0.2803 | 0.2184 |
| 0.694         | 43.2  | 1080 | 0.6937          | 0.5013   | 0.2677 | 0.2336 | 0.2664 | 0.2323 |
| 0.699         | 44.0  | 1100 | 0.6937          | 0.4962   | 0.2664 | 0.2298 | 0.2702 | 0.2336 |
| 0.6932        | 44.8  | 1120 | 0.6937          | 0.4975   | 0.2645 | 0.2330 | 0.2670 | 0.2355 |
| 0.6994        | 45.6  | 1140 | 0.6937          | 0.5006   | 0.2696 | 0.2311 | 0.2689 | 0.2304 |
| 0.7008        | 46.4  | 1160 | 0.6937          | 0.5095   | 0.2992 | 0.2102 | 0.2898 | 0.2008 |
| 0.7006        | 47.2  | 1180 | 0.6937          | 0.5082   | 0.3049 | 0.2033 | 0.2967 | 0.1951 |
| 0.6954        | 48.0  | 1200 | 0.6937          | 0.5101   | 0.3030 | 0.2071 | 0.2929 | 0.1970 |
| 0.7044        | 48.8  | 1220 | 0.6937          | 0.5088   | 0.3030 | 0.2058 | 0.2942 | 0.1970 |
| 0.6993        | 49.6  | 1240 | 0.6937          | 0.5088   | 0.3011 | 0.2077 | 0.2923 | 0.1989 |
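
Note that TP/TN/FP/FN are reported as fractions of the evaluation set rather than raw counts, so each row's accuracy is simply TP + TN and the four rates sum to 1. A quick check against the final row:

```python
# TP/TN/FP/FN are fractions of the eval set, so accuracy = TP + TN.
tp, tn, fp, fn = 0.3011, 0.2077, 0.2923, 0.1989  # final row (step 1240)
print(round(tp + tn, 4))            # 0.5088 -- matches the reported accuracy
print(round(tp + tn + fp + fn, 4))  # 1.0 -- the rates partition the eval set
```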

### Framework versions