
<div style="width: 100%;"> <img src="http://x-pai.algolet.com/bot/img/logo_core.png" alt="TigerBot" style="width: 20%; display: block; margin: auto;"> </div> <p align="center"> <font face="黑体" size=5> A cutting-edge foundation for your very own LLM. </font> </p> <p align="center"> 🌐 <a href="https://tigerbot.com/" target="_blank">TigerBot</a> • 🤗 <a href="https://huggingface.co/TigerResearch" target="_blank">Hugging Face</a> </p>

This model was fine-tuned from tigerbot-7b-sft on medical-qa data using QLoRA. If you have already downloaded tigerbot-7b-sft, you can download only the qlora_ckpt_2400_adapter_model adapter and merge it with the base model:

```python
import transformers
from peft import PeftModel

# Load the base model, then attach the QLoRA adapter weights
model = transformers.AutoModelForCausalLM.from_pretrained("TigerResearch/tigerbot-7b-sft")
model = PeftModel.from_pretrained(model, './adapter_model', is_trainable=False)
# Merge the adapter into the base weights and drop the PEFT wrapper
model = model.merge_and_unload()
```

Alternatively, load the full merged model directly:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("TigerResearch/medical-bot-peft-from-tigerbot-7b-sft")
model = AutoModelForCausalLM.from_pretrained("TigerResearch/medical-bot-peft-from-tigerbot-7b-sft")
```
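Once the tokenizer and model are loaded, inference follows the usual instruct-style flow: wrap the question in the model's instruction template, tokenize, generate, and decode. The template below is an assumption modeled on TigerBot's published SFT prompt format (`### Instruction:` / `### Response:` markers), not taken from this model card; verify it against the official inference script before relying on it.

```python
# Hypothetical prompt helper; the template is an assumption based on
# TigerBot's SFT format, not confirmed by this model card.
TOK_INS = "\n\n### Instruction:\n"
TOK_RES = "\n\n### Response:\n"

def build_prompt(query: str) -> str:
    """Wrap a user question in the instruction template the SFT model expects."""
    return TOK_INS + query + TOK_RES

prompt = build_prompt("What should hypertension patients watch for in their daily diet?")

# With the tokenizer and model loaded as above, generation would look like:
# inputs = tokenizer(prompt, return_tensors="pt")
# output = model.generate(**inputs, max_new_tokens=512)
# print(tokenizer.decode(output[0], skip_special_tokens=True))
```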