Llama3-8B-Chinese-Chat Chatbot

Published: 2024-06-19

1. Create a virtual environment

 conda create -n llama3-chinese python=3.11 -y
 conda activate llama3-chinese

2. Install PyTorch

pip install torch==2.2.2 --index-url https://pypi.tuna.tsinghua.edu.cn/simple
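
Optionally, verify the install from a Python shell (a quick sanity check; on a CPU-only machine the second line simply prints False):

import torch
print(torch.__version__)          # expected: 2.2.2
print(torch.cuda.is_available())  # True if a CUDA GPU and driver are detected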

3. Install transformers and Gradio

pip install transformers gradio -i https://pypi.tuna.tsinghua.edu.cn/simple
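
You can likewise confirm that both libraries import cleanly (optional):

import transformers, gradio
print(transformers.__version__, gradio.__version__)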

4. Write the UI code

Save the following as web-ui.py:

import gradio as gr
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and the model in half precision; device_map="auto"
# places the weights on the available GPU(s) automatically.
model_id = "shenzhi-wang/Llama3-8B-Chinese-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

def chatbot(message):
    # Wrap the user input in the format expected by the model's chat template.
    messages = [{"role": "user", "content": message}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    # Sample a reply with moderate temperature and nucleus sampling.
    outputs = model.generate(
        input_ids,
        max_new_tokens=8192,
        do_sample=True,
        temperature=0.6,
        top_p=0.9,
    )
    # Strip the prompt tokens so only the newly generated reply is decoded.
    response = outputs[0][input_ids.shape[-1]:]
    return tokenizer.decode(response, skip_special_tokens=True)

# A simple single-turn Gradio interface wrapping the chatbot function.
iface = gr.Interface(
    fn=chatbot,
    inputs=gr.Textbox(lines=2, placeholder="Type your message..."),
    outputs="text",
    title="Chinese Chatbot",
    description="Enter a message to chat with the bot."
)

iface.launch()
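
One optional tweak: Llama-3 chat models end an assistant turn with the special <|eot_id|> token. If replies seem to run on for too long, you can pass both stop tokens to generate, roughly as follows (a sketch based on the Llama-3 chat template; adjust if your tokenizer differs):

# Stop generation at either the regular EOS token or <|eot_id|>.
terminators = [
    tokenizer.eos_token_id,
    tokenizer.convert_tokens_to_ids("<|eot_id|>"),
]

outputs = model.generate(
    input_ids,
    max_new_tokens=8192,
    eos_token_id=terminators,
    do_sample=True,
    temperature=0.6,
    top_p=0.9,
)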

5. Run the UI code

python web-ui.py

6. Open the web UI

Open http://127.0.0.1:7860 in your browser.
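
By default Gradio listens only on localhost. If you want to reach the UI from another machine, change the last line of web-ui.py to use Gradio's standard launch parameters:

iface.launch(server_name="0.0.0.0", server_port=7860)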

Done!