
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# BART_pretrained_on_billsum_finetuned_on_small_SCOTUS_extracted_dataset_3

This model is a fine-tuned version of [bheshaj/bart-large-billsum-epochs20](https://huggingface.co/bheshaj/bart-large-billsum-epochs20) on a small extracted SCOTUS dataset. Per-epoch results on the evaluation set are reported in the training results table below.
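A minimal usage sketch is shown below. The repository id is a placeholder (substitute the actual Hub path of this checkpoint), and the generation settings are assumptions: `max_length=20` simply mirrors the Gen Len values reported in the training results, not a confirmed configuration.

```python
# Minimal usage sketch. The repository id below is a placeholder, not a confirmed path,
# and the generation settings are assumptions (max_length=20 mirrors the Gen Len column).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "your-namespace/BART_pretrained_on_billsum_finetuned_on_small_SCOTUS_extracted_dataset_3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

opinion_text = "..."  # a long legal text, e.g. a SCOTUS opinion excerpt

inputs = tokenizer(opinion_text, truncation=True, max_length=1024, return_tensors="pt")
summary_ids = model.generate(**inputs, num_beams=4, max_length=20)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```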

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The hyperparameter values used for this run were not captured in this card; an illustrative (placeholder) configuration is sketched below.
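Purely for illustration, a placeholder `Seq2SeqTrainingArguments` configuration of the kind typically used for this sort of fine-tuning is sketched here. None of these values are confirmed from the actual run; only the roughly 30 training epochs are implied by the results table.

```python
# Illustrative placeholder configuration only; these are NOT the recorded hyperparameters.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="bart_billsum_scotus_finetune",  # placeholder
    learning_rate=2e-5,                         # placeholder
    per_device_train_batch_size=4,              # placeholder
    num_train_epochs=30,                        # the results table covers ~30 epochs
    evaluation_strategy="epoch",                # evaluation was logged once per epoch
    predict_with_generate=True,                 # ROUGE requires generated summaries
)
```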

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 6.7036        | 0.98  | 10   | 4.9345          | 0.1041 | 0.0255 | 0.0865 | 0.0864    | 20.0    |
| 4.9048        | 1.98  | 20   | 4.4400          | 0.1201 | 0.0283 | 0.0973 | 0.0974    | 19.9573 |
| 4.544         | 2.98  | 30   | 4.2661          | 0.1272 | 0.0298 | 0.1013 | 0.1007    | 19.9695 |
| 4.3585        | 3.98  | 40   | 4.1790          | 0.1257 | 0.0289 | 0.1007 | 0.1006    | 19.8293 |
| 4.226         | 4.98  | 50   | 4.1156          | 0.1229 | 0.0302 | 0.0994 | 0.0994    | 19.6098 |
| 4.1417        | 5.98  | 60   | 4.0503          | 0.1225 | 0.0293 | 0.0985 | 0.0987    | 19.6037 |
| 4.0406        | 6.98  | 70   | 3.9856          | 0.1208 | 0.0309 | 0.0975 | 0.0976    | 19.6159 |
| 3.9409        | 7.98  | 80   | 3.9266          | 0.1231 | 0.0299 | 0.0982 | 0.0984    | 19.5976 |
| 3.8492        | 8.98  | 90   | 3.8767          | 0.1212 | 0.0299 | 0.0953 | 0.0954    | 19.6037 |
| 3.7571        | 9.98  | 100  | 3.8241          | 0.1196 | 0.0313 | 0.097  | 0.0972    | 19.811  |
| 3.6975        | 10.98 | 110  | 3.7959          | 0.121  | 0.0303 | 0.0963 | 0.0963    | 19.6768 |
| 3.5923        | 11.98 | 120  | 3.7628          | 0.115  | 0.0315 | 0.0959 | 0.0962    | 19.7012 |
| 3.5505        | 12.98 | 130  | 3.7352          | 0.1166 | 0.034  | 0.0952 | 0.0957    | 19.6829 |
| 3.5027        | 13.98 | 140  | 3.7157          | 0.1222 | 0.0347 | 0.1004 | 0.1005    | 19.6341 |
| 3.456         | 14.98 | 150  | 3.6983          | 0.1198 | 0.032  | 0.0968 | 0.097     | 19.6524 |
| 3.4088        | 15.98 | 160  | 3.6644          | 0.1204 | 0.0321 | 0.0969 | 0.0969    | 19.4695 |
| 3.3511        | 16.98 | 170  | 3.6545          | 0.1224 | 0.035  | 0.1001 | 0.1004    | 19.8171 |
| 3.3167        | 17.98 | 180  | 3.6415          | 0.1223 | 0.0363 | 0.1006 | 0.1007    | 19.7683 |
| 3.2786        | 18.98 | 190  | 3.6286          | 0.1234 | 0.0345 | 0.1004 | 0.1005    | 19.9756 |
| 3.2437        | 19.98 | 200  | 3.6239          | 0.124  | 0.0368 | 0.1006 | 0.1009    | 19.6829 |
| 3.2114        | 20.98 | 210  | 3.6138          | 0.1256 | 0.0393 | 0.1035 | 0.104     | 19.7256 |
| 3.1935        | 21.98 | 220  | 3.6025          | 0.1241 | 0.0359 | 0.1016 | 0.1016    | 19.5122 |
| 3.175         | 22.98 | 230  | 3.5939          | 0.1213 | 0.0356 | 0.1008 | 0.1011    | 19.4024 |
| 3.1572        | 23.98 | 240  | 3.5979          | 0.124  | 0.0355 | 0.1008 | 0.1007    | 19.7256 |
| 3.1346        | 24.98 | 250  | 3.5909          | 0.1247 | 0.0356 | 0.1011 | 0.1014    | 19.6037 |
| 3.1202        | 25.98 | 260  | 3.5877          | 0.1299 | 0.0372 | 0.1042 | 0.1045    | 19.7866 |
| 3.1095        | 26.98 | 270  | 3.5876          | 0.13   | 0.0381 | 0.1056 | 0.1059    | 19.7866 |
| 3.0919        | 27.98 | 280  | 3.5851          | 0.1286 | 0.04   | 0.1037 | 0.1037    | 19.6951 |
| 3.1089        | 28.98 | 290  | 3.5830          | 0.1253 | 0.0376 | 0.1001 | 0.1002    | 19.4207 |
| 3.0915        | 29.98 | 300  | 3.5827          | 0.1251 | 0.037  | 0.1002 | 0.1003    | 19.4207 |
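The ROUGE scores above are fractions in the 0–1 range, and generated summaries averaged roughly 20 tokens. A minimal sketch of how such scores can be computed with the `evaluate` library follows; the prediction and reference strings are hypothetical examples, not taken from the evaluation set.

```python
# Minimal ROUGE scoring sketch using the `evaluate` library; the strings are hypothetical.
import evaluate

rouge = evaluate.load("rouge")

predictions = ["The court reversed the judgment of the court of appeals."]
references = ["The Supreme Court reversed the court of appeals and remanded the case."]

scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # dict with rouge1, rouge2, rougeL, rougeLsum F-measures in [0, 1]
```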

### Framework versions