羽人-百川7B

羽人-百川7B是基于baichuan-inc/baichuan-7B 进行多任务有监督微调的开源多模态大语言模型, 建立在 Pleisto 的以数据为中心(Data-centric AI)的工作上。羽人在多轮对话、开放域问答、角色扮演、文本生成、文本理解、图片理解等多个任务上均拥有优异的表现。

YuRen BaiChuan 7B is a multi-modal large language model based on baichuan-inc/baichuan-7B and trained with multi-task supervised fine-tuning. It is built on top of Pleisto's data-centric AI work. YuRen has excellent performance on multi-turn dialogue, open-domain question answering, role-playing, text generation, text understanding, image understanding and other tasks.

Why use yuren-baichuan-7B

Showcases

请访问 github.com/pleisto/yuren-baichuan-7b 以查看更完整的 Showcases 列表。

Please visit github.com/pleisto/yuren-baichuan-7b for a more complete list of showcases.

[User] 我要开一家火锅店,但是市场同质化太严重了,帮我找个差异化的点写一个品牌策划方案 markdown 格式

[羽人] 好的,让我来帮你构思一个品牌策划方案。

**品牌名称:**「火炉印象」(Fireplace Impression)

**品牌定位:**「打造独特火锅体验,让顾客感受到家的温馨」

**品牌口号:**「在家吃火锅,温暖你的心」

**品牌形象:**「火炉印象」的品牌形象将以温馨、舒适、家庭化的氛围为主。我们希望让顾客感受到回家的感觉,因此,我们的店铺装修风格将以木质元素为主,墙壁上可以挂上家庭照片或画作,让顾客感受到温馨的氛围。同时,我们也会提供舒适的沙发座椅,让顾客在享受美食的同时,也能感受到舒适的体验。

**产品特色:**我们的产品特色将以「家庭式火锅」为主,我们提供多种不同口味的火锅底料,让顾客可以根据自己的口味选择。同时,我们也会提供各种家庭式火锅配菜,如鱼丸、虾滑、蔬菜等,让顾客可以自己搭配出属于自己的火锅。此外,我们还会提供各种家庭式火锅小吃,如炸酱面、炒饭等,让顾客在享受火锅的同时,也能品尝到各种美味的小吃。

**营销策略:**我们将在社交媒体上进行品牌宣传,同时,我们也将在线上平台上推出优惠活动,吸引更多的顾客。我们还将与当地的社区合作,举办各种活动,如家庭聚会、生日派对等,让我们的品牌更加深入人心。

**品牌愿景:**我们的愿景是成为当地最受欢迎的家庭式火锅品牌,让顾客感受到家的温暖和温馨。

希望这个品牌策划方案对你有所帮助!如果你还有其他问题,欢迎随时提问。


[User] Use an appropriate format to structure a formal letter of recommendation for a student applying to a prestigious graduate program in computer science

[羽人] Dear [Recipient's Name],

I am writing to recommend [Student's Name] for admission to the prestigious graduate program in computer science at [Program Name]. I have had the pleasure of working with [Student's Name] for the past [Duration] and have been impressed by their academic achievements and potential for future success.

[Student's Name] has consistently demonstrated a strong aptitude for computer science, with a particular passion for [Specific Area of Interest]. Their dedication to their studies has been evident in their exceptional performance in [Specific Course/Project]. In addition, [Student's Name] has demonstrated excellent problem-solving skills and a collaborative spirit, which have contributed to their success in both academic and extracurricular activities.

[Student's Name] has also shown a strong commitment to community service and leadership, which have been valuable qualities in their academic and professional journey. Their involvement in [Organization Name] and their leadership role in [Project Name] have provided them with valuable experience in teamwork, communication, and problem-solving.

I am confident that [Student's Name] would make a valuable addition to the [Program Name] community. Their passion for computer science, dedication to excellence, and commitment to community service make them an ideal candidate for the program.

I highly recommend [Student's Name] for admission to the [Program Name] graduate program. Thank you for considering my recommendation.

Sincerely,

[Your Name]

[Your Position]


[System Prompt] 米家新风空调 立式 3 匹新 1 级能效 质保说明 整机机器本身的零件(不包括其他部件),保修时间 6 年 适用面积 30-46㎡ 柔风风感 分区送风 温湿双控 压缩机松下制造 快速制冷热 全屋智能互联可语音 APP 控制 带抗菌防霉滤网 请你参考上述信息,扮演客服解答客户的咨询问题。

[User] 你们的这款空调保修几年啊

[羽人] 我们的这款空调整机保修 6 年,包括机器本身的零部件。

[User] 质量靠谱吗?小米主业不是手机吗,造空调靠不靠谱啊

[羽人] 我们的米家新风空调是由小米生态链企业生产制造的,松下是我们的压缩机供应商,质量方面我们有严格的品质控制和测试流程,确保产品的质量可靠。同时,我们还提供完善的售后服务和保障,您可以放心购买和使用我们的产品。

[User] 为啥卖的比格力的贵

[羽人] 我们的米家新风空调在设计、品质、功能和体验等方面都具有较高的竞争力,同时也采用了先进的技术和材料,因此在价格方面可能比其他同类产品略高一些。然而,我们认为我们的产品性价比更高,因为它不仅提供了优秀的性能和舒适的体验,还具有智能互联、温湿双控等功能,可以更好地满足消费者的需求。同时,我们的售后服务和保障也更为完善,让消费者在使用过程中更加安心。

[User] 还有啥功能,展开说说呗

[羽人] 我们的米家新风空调具有以下功能:

  1. 分区送风:可以根据房间的布局和需求,实现不同区域的独立送风,让每个房间都能感受到舒适的温度和湿度。
  2. 温湿双控:可以根据室内的温度和湿度,自动调节空调的运行模式,实现舒适的室内环境。
  3. 抗菌防霉滤网:可以有效过滤空气中的灰尘、细菌等物质,保持室内空气的清洁。
  4. 智能互联:可以通过手机 APP 或语音助手进行远程控制和监测,实现智能化管理和舒适体验。
  5. 快速制冷热:采用高效的压缩机和空调技术,可以快速制冷或制热,满足用户的不同需求。

总的来说,我们的米家新风空调不仅提供了舒适的室内环境,还具有智能化的管理和控制功能,可以更好地满足消费者的需求。


[System Prompt] 你是一个双语翻译系统,你会将用户输入的信息翻译为中文,并以 {"input": 用户输入的信息,"output": 信息的中文翻译} 的 json 格式返回给用户。

[User] Proposes PRO, which outperforms existing alignment algorithms, achieving comparable results to ChatGPT and human responses through automatic-based, reward-based, GPT-4, and human evaluations.

[羽人] {"input": "Proposes PRO, which outperforms existing alignment algorithms, achieving comparable results to ChatGPT and human responses through automatic-based, reward-based, GPT-4, and human evaluations.", "output": "提案 PRO,它优于现有的对齐算法,通过自动基础、奖励基础、GPT-4 和人类评估,实现了与 ChatGPT 和人类相当的响应。"}

[User] BREAKING: Elon Musk reportedly desires to buy earth and limit how many breaths you can take daily

[羽人] {"input": "BREAKING: Elon Musk reportedly desires to buy earth and limit how many breaths you can take daily", "output": "突发:据报道,埃隆·马斯克想买下地球,并限制每天呼吸的次数。"}


[Image] an orange ergonomic office chair

[User] 写个微博广告

[羽人] 【#办公室好物# 】今天给大家推荐一款时尚实用的办公椅,它就是这款橙色的人体工学椅!它采用了优质的网布,坐起来非常舒适,并且可以有效地保护您的脊椎。它的可调节高度和角度,可以适应您的不同坐姿需求。此外,它还具有多功能的收纳功能,可以帮助您更好地整理办公桌。无论是长时间办公还是午休,这款办公椅都能为您提供舒适的体验。快来选购吧!

How to Get Started with the Model

Text-only

羽人-百川7B在训练前已经将 baichuan-7B 的模型权重文件转为 LLaMA 兼容格式,因此在进行纯文字模态的推理部署时,可以非常方便地直接使用 transformers 的 LlamaForCausalLM 和 LlamaTokenizer 进行加载。

The baichuan-7B weights were converted into a LLaMA-compatible format before yuren-baichuan-7b was trained, so for text-only inference the model can be loaded directly with transformers' LlamaForCausalLM and LlamaTokenizer.

from transformers import LlamaTokenizer, LlamaForCausalLM
import torch

device = torch.device("cuda")
query = "一个传奇的开端,一个不灭的神话,这不仅仅是一部电影,而是作为一个走进新时代的标签,永远彪炳史册。\nWould you rate the previous review as positive, neutral or negative?\nReturn in json object"
model = LlamaForCausalLM.from_pretrained(
    "pleisto/yuren-baichuan-7b", torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = LlamaTokenizer.from_pretrained("pleisto/yuren-baichuan-7b", use_fast=False)
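# Prompt template: each turn is wrapped as <|im_start|>{role}\n{content}<|im_end|>\n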
system_prompt = "<|im_start|>system\nYou are a helpful AI assistant.<|im_end|>\n"
inputs = f"{system_prompt}<|im_start|>user\n{query}<|im_end|>\n<|im_start|>assistant\n"
input_ids = tokenizer(inputs, return_tensors="pt").input_ids.to(device)
generate_ids = model.generate(
    input_ids,
    max_new_tokens=4096,
    do_sample=True,
    top_p=1.0,
    temperature=0.42,
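    # stop generation at token id 64002, the end-of-turn token (<|im_end|>) in this model's tokenizer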
    eos_token_id=64002,
)
output = tokenizer.batch_decode(generate_ids)[0]
print(output)
"""
<|im_start|> system
You are a helpful AI assistant. <|im_end|>
<|im_start|> user
一个传奇的开端,一个不灭的神话,这不仅仅是一部电影,而是作为一个走进新时代的标签,永远彪炳史册。
Would you rate the previous review as positive, neutral or negative?
Return in json object <|im_end|>
<|im_start|> assistant
{
"rating": "positive"
} <|im_end|>
"""

Multimodal

git clone https://github.com/pleisto/yuren-baichuan-7b.git
curl -sSf https://rye-up.com/get | bash
source "$HOME/.rye/env"
rye sync
rye run webui "pleisto/yuren-baichuan-7b" # --load_8bit True --server_name "0.0.0.0" --share True
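
Once the web UI started by rye run webui is running, it can also be queried programmatically. The sketch below is only an illustration under assumptions not stated in this card: it presumes the UI is a Gradio app served at Gradio's default local address, and it lists the exposed endpoints with gradio_client instead of hard-coding endpoint names that are not documented here.

# Hypothetical client-side sketch: assumes the multimodal web UI launched above
# is a Gradio app listening at the default local address.
from gradio_client import Client

client = Client("http://127.0.0.1:7860")
client.view_api()  # print the endpoints and parameters the UI actually exposes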

Bias, Risks, and Limitations

License

- The inference code is released under the Apache-2.0 license, and the copyright belongs to Pleisto.