Topic -> Question
This model, based on Mistral-7B-v0.1, is made for writing intricate questions that require deep analysis to answer, starting from a few-word topic (think the title of a Wikipedia page). It could be useful for generating questions for synthetic Q&A datasets.
Examples:
- Cat -> Considering the rich social and cultural history of cats, how have our perceptions and interactions with these animals evolved over time, and what are some key cultural differences in how humans around the world view felines?
- Benjamin Franklin -> How did Benjamin Franklin's contributions in the fields of electricity, politics, and journalism challenge the prevailing philosophies of his time, and what impact did these contributions have on shaping American values and ideologies?
- Amazon River -> How does the unique blend of flora and fauna found in the Amazon Rainforest contribute to the overall biodiversity of our planet, and what are the potential long-term implications if this rich ecosystem were to be lost due to deforestation or other human activities?
- Mistral Wind -> Considering the impact of the Mistral wind on the Mediterranean climate and ecosystems, how do we balance the need for sustainable energy production with the potential negative effects on these critical systems, and what strategies can be employed to minimize any potential harm?
Prompting:
To prompt the model, use the following template, where {topic} is replaced with the topic you want to generate a question about:
Topic: {topic}\n
Question: '
The response will be in the following format:
Topic: {topic}\n
Question: '{generated_question}' (discard anything after this point)
(Note that anything after the closing quote of the generated question should be discarded. The model exhibits some odd behaviour around generating end-of-text tokens, so extra text may follow the question.)
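As a minimal sketch of the prompting and cleanup steps above, the helpers below build the prompt template and strip everything after the closing quote of the generated question. The helper names, the regex-based extraction, and the sample raw output are illustrative assumptions, not part of the model card; actual model loading and generation (e.g. via the transformers library) is omitted.

```python
import re

def build_prompt(topic: str) -> str:
    # Prompt format from the model card: the topic line, a newline,
    # then "Question: '" to cue the model to open the quoted question.
    return f"Topic: {topic}\nQuestion: '"

def extract_question(generated: str) -> str:
    # The model writes the question in single quotes; per the card,
    # anything after the closing quote (including stray end-of-text
    # tokens) should be discarded. Assumes the question itself
    # contains no apostrophes.
    match = re.search(r"Question: '(.*?)'", generated, re.DOTALL)
    return match.group(1).strip() if match else ""

# Hypothetical raw model output, used only to illustrate the cleanup:
raw = "Topic: Cat\nQuestion: 'How have cats shaped human culture?'</s> extra"
print(build_prompt("Cat"))
print(extract_question(raw))  # How have cats shaped human culture?
```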