Model Card of lmqg/mbart-large-cc25-dequad-qag
This model is a fine-tuned version of facebook/mbart-large-cc25 for the question & answer pair generation task, trained on lmqg/qag_dequad (dataset_name: default) via lmqg.
Overview
- Language model: facebook/mbart-large-cc25
- Language: de
- Training data: lmqg/qag_dequad (default)
- Online Demo: https://autoqg.net/
- Repository: https://github.com/asahi417/lm-question-generation
- Paper: https://arxiv.org/abs/2210.03992
Usage
- With `lmqg`

```python
from lmqg import TransformersQG

# initialize model
model = TransformersQG(language="de", model="lmqg/mbart-large-cc25-dequad-qag")

# model prediction
question_answer_pairs = model.generate_qa("das erste weltweit errichtete Hermann Brehmer 1855 im niederschlesischen ''Görbersdorf'' (heute Sokołowsko, Polen).")
```
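`generate_qa` returns the predicted pairs as a list of `(question, answer)` tuples, so they can be iterated directly; a minimal sketch:

```python
# each element of question_answer_pairs is a (question, answer) tuple
for question, answer in question_answer_pairs:
    print(f"Q: {question}\nA: {answer}")
```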
- With `transformers`

```python
from transformers import pipeline

pipe = pipeline("text2text-generation", "lmqg/mbart-large-cc25-dequad-qag")
output = pipe("Empfangs- und Sendeantenne sollen in ihrer Polarisation übereinstimmen, andernfalls wird die Signalübertragung stark gedämpft. ")
```
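The pipeline returns the pairs as a single serialized string. The parsing below assumes lmqg's usual serialization (`question: ..., answer: ...` items joined by `" | "`); treat it as a sketch rather than a guaranteed format:

```python
# output is a list with one dict per input; the serialization format
# assumed here ("question: ..., answer: ..." joined by " | ") follows
# the lmqg convention and may differ for other checkpoints
raw = output[0]["generated_text"]
pairs = []
for chunk in raw.split(" | "):
    question, _, answer = chunk.partition(", answer: ")
    pairs.append((question.removeprefix("question: "), answer))
print(pairs)
```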
Evaluation
- Metric (Question & Answer Generation): raw metric file
| Metric | Score | Type | Dataset |
|---|---|---|---|
| QAAlignedF1Score (BERTScore) | 69.25 | default | lmqg/qag_dequad |
| QAAlignedF1Score (MoverScore) | 50.71 | default | lmqg/qag_dequad |
| QAAlignedPrecision (BERTScore) | 70.69 | default | lmqg/qag_dequad |
| QAAlignedPrecision (MoverScore) | 51.81 | default | lmqg/qag_dequad |
| QAAlignedRecall (BERTScore) | 68.05 | default | lmqg/qag_dequad |
| QAAlignedRecall (MoverScore) | 49.78 | default | lmqg/qag_dequad |
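The QAAligned scores match each generated pair against its closest gold pair (precision) and each gold pair against its closest generated pair (recall), with F1 as the harmonic mean. Below is a minimal sketch of that idea using the `bert-score` package; the function and pair serialization are illustrative, not the official lmqg evaluation code:

```python
from bert_score import score

def qa_aligned_scores(generated, gold, lang="de"):
    """Illustrative QAAligned-style scores; not the official lmqg metric code."""
    # serialize each (question, answer) pair into a single string
    gen_texts = [f"question: {q}, answer: {a}" for q, a in generated]
    gold_texts = [f"question: {q}, answer: {a}" for q, a in gold]
    # precision: each generated pair scored against its best gold match
    # (bert-score supports multiple references per candidate)
    _, _, p = score(gen_texts, [gold_texts] * len(gen_texts), lang=lang)
    # recall: each gold pair scored against its best generated match
    _, _, r = score(gold_texts, [gen_texts] * len(gold_texts), lang=lang)
    precision, recall = p.mean().item(), r.mean().item()
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```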
Training hyperparameters
The following hyperparameters were used during fine-tuning:
- dataset_path: lmqg/qag_dequad
- dataset_name: default
- input_types: ['paragraph']
- output_types: ['questions_answers']
- prefix_types: None
- model: facebook/mbart-large-cc25
- max_length: 512
- max_length_output: 256
- epoch: 6
- batch: 2
- lr: 0.0001
- fp16: False
- random_seed: 1
- gradient_accumulation_steps: 32
- label_smoothing: 0.15
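For reference, here is a minimal sketch (not the lmqg trainer itself) of how these values map onto Hugging Face `Seq2SeqTrainingArguments`; the output directory is a placeholder, and `max_length` / `max_length_output` apply at tokenization and generation time rather than here:

```python
from transformers import Seq2SeqTrainingArguments

# hypothetical mapping of the hyperparameters above; output_dir is a placeholder
training_args = Seq2SeqTrainingArguments(
    output_dir="mbart-large-cc25-dequad-qag",
    num_train_epochs=6,
    per_device_train_batch_size=2,
    gradient_accumulation_steps=32,  # effective batch size 2 x 32 = 64
    learning_rate=1e-4,
    label_smoothing_factor=0.15,
    fp16=False,
    seed=1,
)
```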
The full configuration can be found in the fine-tuning config file.
Citation
```bibtex
@inproceedings{ushio-etal-2022-generative,
    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
    author = "Ushio, Asahi  and
      Alva-Manchego, Fernando  and
      Camacho-Collados, Jose",
    booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
    month = dec,
    year = "2022",
    address = "Abu Dhabi, U.A.E.",
    publisher = "Association for Computational Linguistics",
}
```