UniNER-7B-type

Description: A UniNER-7B model trained from LLaMA-7B using the Pile-NER-type data, without any human-labeled data. The data was collected by prompting gpt-3.5-turbo-0301 to label entities in passages and provide entity tags. The data collection prompt is as follows:

<div style="background-color: #f6f8fa; padding: 20px; border-radius: 10px; border: 1px solid #e1e4e8; box-shadow: 0 2px 5px rgba(0,0,0,0.1);"> <strong>Instruction:</strong><br/> Given a passage, your task is to extract all entities and identify their entity types. The output should be in a list of tuples of the following format: [("entity 1", "type of entity 1"), ... ].</div>
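For illustration, here is a minimal sketch of how such a labeling call could look with the OpenAI chat API. This is not the authors' data-collection script: the passage formatting, sampling parameters, and message roles are assumptions, and the gpt-3.5-turbo-0301 snapshot has since been retired.

```python
# Hypothetical sketch of the Pile-NER-type-style labeling call (not the authors' exact script).
# Assumes the OpenAI Python client (>=1.0) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

INSTRUCTION = (
    "Given a passage, your task is to extract all entities and identify their "
    "entity types. The output should be in a list of tuples of the following "
    'format: [("entity 1", "type of entity 1"), ... ].'
)

def label_passage(passage: str) -> str:
    """Ask the model to return (entity, type) tuples for one passage."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo-0301",  # retired snapshot; substitute a current model
        messages=[
            {"role": "system", "content": INSTRUCTION},
            {"role": "user", "content": f"Passage: {passage}"},  # assumed passage framing
        ],
        temperature=0,  # deterministic-leaning labeling; the authors' settings may differ
    )
    return response.choices[0].message.content
```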

Check our paper for more information, and our repo for instructions on using the model.

Comparison with UniNER-7B-definition

The UniNER-7B-type model excels at handling entity types given as short tags and performs better on the Universal NER benchmark, which comprises 43 academic datasets across 9 domains. In contrast, UniNER-7B-definition is better at processing entity types defined in short sentences and is more robust to type paraphrasing.

Inference

The template for inference instances is as follows: <div style="background-color: #f6f8fa; padding: 20px; border-radius: 10px; border: 1px solid #e1e4e8; box-shadow: 0 2px 5px rgba(0,0,0,0.1);"> <strong>Prompting template:</strong><br/> A virtual assistant answers questions from a user based on the provided text.<br/> USER: Text: <span style="color: #d73a49;">{Fill the input text here}</span><br/> ASSISTANT: I’ve read this text.<br/> USER: What describes <span style="color: #d73a49;">{Fill the entity type here}</span> in the text?<br/> ASSISTANT: <span style="color: #0366d6;">(model's predictions in JSON format)</span><br/> </div>

Note: Inference is performed for one entity type at a time. To extract multiple entity types, create a separate instance for each type, as in the sketch below.
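Below is a minimal inference sketch using Hugging Face transformers. It assumes the checkpoint is published as `Universal-NER/UniNER-7B-type` on the Hub and fills the prompting template above verbatim; the exact turn separators and the reference inference setup in the official repo may differ.

```python
# Minimal inference sketch with Hugging Face transformers (assumed Hub ID below).
# Requires accelerate for device_map="auto"; see the official repo for the reference script.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Universal-NER/UniNER-7B-type"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.float16, device_map="auto"
)

def build_prompt(text: str, entity_type: str) -> str:
    """Fill the prompting template from this card for a single entity type."""
    return (
        "A virtual assistant answers questions from a user based on the provided text.\n"
        f"USER: Text: {text}\n"
        "ASSISTANT: I've read this text.\n"
        f"USER: What describes {entity_type} in the text?\n"
        "ASSISTANT:"
    )

def extract_entities(text: str, entity_type: str) -> str:
    prompt = build_prompt(text, entity_type)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256, do_sample=False)
    # Return only the newly generated tokens (the model's JSON-formatted answer).
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

# One entity type per query; loop over types for multi-type extraction.
text = "Barack Obama served as the 44th President of the United States."
for entity_type in ["person", "organization"]:
    print(entity_type, "->", extract_entities(text, entity_type))
```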

License

This model and its associated data are released under the CC BY-NC 4.0 license. They are intended primarily for research purposes.

Citation

@article{zhou2023universalner,
  title={UniversalNER: Targeted Distillation from Large Language Models for Open Named Entity Recognition},
  author={Wenxuan Zhou and Sheng Zhang and Yu Gu and Muhao Chen and Hoifung Poon},
  year={2023},
  eprint={2308.03279},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}