Building a Local ChatGPT with LM Studio + LLaMA3

Published: 2024-04-30

On April 19, Meta (Facebook's parent company) released Llama 3. Even though frequent AI model iterations from the big players have become routine, Llama 3 still stands out: it is the most powerful open-source AI model to date. LLaMA models use a Transformer architecture similar to GPT (developed by OpenAI), which is well suited to processing large amounts of natural-language data and learning deep linguistic structure and context. Combined with LM Studio, we can deploy LLaMA3 on a local server and apply it broadly to customer service, RAG (retrieval-augmented generation), and other areas. Below is a detailed hands-on walkthrough for reference.

For downloading and installing LM Studio, see:

"Use LM Studio: deploy a large language model locally for free in 2 minutes, as a ChatGPT alternative" - CSDN blog

I. Download LLaMA3 in LM Studio

When LM Studio starts, it may prompt you to upgrade. After the upgrade completes, it looks like this:

On the home page you can see the newly supported LLaMA3 models; click the download button to download directly. (Note: downloading the model may require a proxy/VPN.)

The downloaded model appears under My Models:

II. Start the Local Server

Select the LLama3 model and click the Start Server button.

After the Local Server starts, it looks like this:
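The Local Server speaks the OpenAI chat-completions wire format (by default on port 1234, as used throughout this article). As a sketch of what a client sends to `/v1/chat/completions` (the helper name and field values here are illustrative, not from LM Studio itself):

```python
import json

def build_chat_request(system_message: str, user_input: str,
                       temperature: float = 0.7) -> dict:
    """Build the JSON body POSTed to the /v1/chat/completions endpoint."""
    return {
        "model": "local-model",  # LM Studio serves whichever model is loaded
        "messages": [
            {"role": "system", "content": system_message},
            {"role": "user", "content": user_input},
        ],
        "temperature": temperature,
    }

payload = build_chat_request("You are a helpful assistant.", "Hello!")
print(json.dumps(payload, indent=2))
```

This is the same message structure the `openai` client builds for you in the scripts below; seeing the raw shape helps when debugging with the server logs.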

III. Access and Test the Local Server from a Client

1. A simple access test in Python

Install Python 3.11 and the openai Python package:

pip install -r requirements.txt

requirements.txt contains:

openai==0.28.0

The version pin matters: the examples below use the pre-1.0 `openai` interface (`openai.ChatCompletion`), which was removed in `openai>=1.0`.

 

import openai

# Set the base URL and API key for the OpenAI client
openai.api_base = "http://localhost:1234/v1"
openai.api_key = "not-needed"

# Create a chat completion
completion = openai.ChatCompletion.create(
    model="local-model",  # this field is currently unused
    messages=[
        {"role": "system", "content": "Provide detailed technical explanations."},
        {"role": "user", "content": "Introduce yourself."}
    ],
    temperature=0.7,
)

# Print the chatbot's response
print(completion.choices[0].message.content)

The response looks like this:

 

2. Set a system prompt and test an interactive chat

import openai

# Configuration for OpenAI API
openai.api_base = "http://localhost:1234/v1"
openai.api_key = "not-needed"

# Function to create a chat completion with a dynamic user prompt
def create_chat_completion(user_input, system_message):
    return openai.ChatCompletion.create(
        model="local-model",
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": user_input}
        ],
        temperature=0.7,
    )

def main():
    # Predefined system prompt (kept in Chinese, since the test below checks
    # Chinese input/output). Roughly: "You are an experienced Xiaohongshu (RED)
    # content operator covering consumer electronics; write post copy with short
    # sentences, little repetition, and a light, humorous tone. Reply in Chinese."
    system_message = (
        "你是一位资深的小红书运营人员,你目前负责的内容方向是电子数码,"
        "你的任务是生成小红书的内容文案,要求分解长句,减少重复,"
        "语气轻松幽默,具有真实可读性。请用中文和用户对话"
    )


    # Chat loop
    while True:
        user_input = input("User: ")
        if user_input.lower() in ['exit', 'bye', 'end']:
            print("Exiting the chat.")
            break

        completion = create_chat_completion(user_input, system_message)
        print("Model Response: ", completion.choices[0].message.content)

if __name__ == "__main__":
    main()
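Note that `create_chat_completion` above sends only the latest user message, so the model has no memory of earlier turns. A minimal sketch of carrying history across turns (the helper names are mine, not from the original script):

```python
def make_history(system_message: str) -> list:
    """Start a conversation holding just the system prompt."""
    return [{"role": "system", "content": system_message}]

def add_turn(history: list, user_input: str, assistant_reply: str) -> list:
    """Append one user/assistant exchange so later calls see the full context."""
    history.append({"role": "user", "content": user_input})
    history.append({"role": "assistant", "content": assistant_reply})
    return history

# Pass the accumulated list as `messages` instead of rebuilding it each turn:
history = make_history("You are a helpful assistant.")
add_turn(history, "My name is Sam.", "Nice to meet you, Sam!")
print(len(history))  # system + user + assistant = 3 messages
```

In the chat loop you would then call `openai.ChatCompletion.create(model="local-model", messages=history)` and append each new exchange with `add_turn`.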

Running the Python code produces the result below. Although Llama3 understands Chinese input, its output is still in English. You can try one of the Llama3 derivatives fine-tuned specifically for Chinese.
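To check mechanically how "Chinese" a reply actually is, a rough utility like the following (my own helper, not part of the original article) scores the share of CJK characters among alphabetic characters:

```python
def cjk_ratio(text: str) -> float:
    """Fraction of alphabetic characters in the CJK Unified Ideographs block."""
    cjk = sum(1 for ch in text if "\u4e00" <= ch <= "\u9fff")
    letters = sum(1 for ch in text if ch.isalpha())
    return cjk / letters if letters else 0.0

print(cjk_ratio("你好,世界"))   # 1.0 - fully Chinese
print(cjk_ratio("Hello world"))  # 0.0 - fully English
```

Running this over the model's replies makes it easy to compare the base Llama3 against a Chinese fine-tune.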

Python output:

The Local Server log on the back end:

3. A more structured version of the Python code, for reference

import openai

# OpenAI API configuration
class OpenAIConfig:
    def __init__(self):
        self.base_url = "http://localhost:1234/v1"
        self.api_type = "open_ai"
        self.api_key = "not-needed"

# Load the system prompt from a text file
def read_file_content(file_path):
    try:
        with open(file_path, "r", encoding="utf-8") as file:  # UTF-8 so Chinese prompts load correctly
            return file.read().strip()
    except FileNotFoundError:
        print(f"File not found: {file_path}")
        return None

# Send one user turn to the local model with the given system prompt.
def initiate_conversation(input_text, system_message):
    response = openai.ChatCompletion.create(
        model="local-model",
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": input_text}
        ],
        temperature=0.7,
    )
    return response.choices[0].message.content.strip()

def main():
    # Instantiate configuration
    config = OpenAIConfig()
    openai.api_base = config.base_url
    openai.api_key = config.api_key

    # Read system message from file
    system_message = read_file_content("my_prompt.txt")
    if system_message is None:
        return

    # Conversation loop
    while True:
        user_input = input("User: ")
        if user_input.lower() in ['exit', 'bye', 'end']:
            print("Exiting the conversation.")
            break

        model_response = initiate_conversation(user_input, system_message)
        print("Model Response: ", model_response)

if __name__ == "__main__":
    main()
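The script above expects a `my_prompt.txt` next to it; its contents are up to you. A purely illustrative example (not from the original article) might be:

```text
You are a senior technical support engineer for a consumer-electronics brand.
Answer customer questions concisely, ask for the product model when it is
missing, and reply in the same language the customer uses.
```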

That is the full process of deploying the LLaMA3 model locally with LM Studio to provide ChatGPT-style functionality. Give it a try.

 

