Integrating Ollama with Spring Boot to Run a Local LLM

Published: 2025-06-18

1. Install and start Ollama

My earlier post, "ollama部署开源大模型" on CSDN, covers the installation in detail, so it is not repeated here.

2. Create a Spring Boot project

Follow the settings shown in the screenshot. The JDK must be >= 17, because we are using Spring Boot 3.x.x, which requires JDK 17 or later.

1. Import the pom dependencies

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.4.6</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>org.example</groupId>
    <artifactId>springboot-ai</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>springboot-ai</name>
    <description>springboot-ai</description>
    <properties>
        <java.version>17</java.version>
    </properties>


    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <!-- Spring Boot DevTools (Optional for auto-reloading during development) -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
            <scope>runtime</scope>
        </dependency>

        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <version>1.18.34</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-ollama</artifactId>
            <version>1.0.0</version>
        </dependency>

    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>
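If more Spring AI modules are added later, importing the Spring AI BOM in the pom's `<dependencyManagement>` section keeps their versions aligned, so individual Spring AI dependencies no longer need explicit `<version>` tags. A possible fragment, with coordinates assumed from the Spring AI 1.0.0 release:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>1.0.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```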

2. Create application.yml under resources and edit it

Delete the default application.properties file; the YAML configuration is shown below.

spring:
  ai:
    ollama:
      base-url: http://localhost:11434
      chat:
        options:
          model: gemma3:4b
          temperature: 0.7   # sampling randomness (0~1; higher = more random)
          num-predict: 512   # maximum number of tokens to generate
server:
  port: 8080
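Note that the plain spring-ai-ollama module added in the pom does not ship Spring Boot auto-configuration, so the `spring.ai.ollama.*` properties above only take effect if the Ollama starter is also on the classpath; with the starter, the manual bean wiring in step 3 becomes unnecessary. A possible dependency, with the artifact name assumed from the Spring AI 1.0.0 release:

```xml
<!-- Optional: auto-configures an OllamaChatModel from the YAML above -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-model-ollama</artifactId>
    <version>1.0.0</version>
</dependency>
```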

3. Create the AiConfig configuration class

package com.example.demo.config;

import io.micrometer.observation.ObservationRegistry;
import org.springframework.ai.model.tool.ToolCallingManager;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaOptions;
import org.springframework.ai.ollama.management.ModelManagementOptions;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AiConfig {

    @Bean
    public OllamaChatModel ollamaChatModel() {
        // 1. Create the OllamaApi client (communicates with the Ollama service)
        OllamaApi ollamaApi = OllamaApi.builder()
                .baseUrl("http://localhost:11434")
                .build();
        // 2. Configure the default generation options
        OllamaOptions defaultOptions = OllamaOptions.builder()
                .model("gemma3:4b")
                .temperature(0.7)
                .build();

        // 3. Build the ToolCallingManager (enables tool calling)
        ToolCallingManager toolCallingManager = ToolCallingManager.builder().build();

        // 4. Observation registry for monitoring (no-op here)
        ObservationRegistry observationRegistry = ObservationRegistry.NOOP;

        // 5. Model management options
        ModelManagementOptions modelManagementOptions = ModelManagementOptions.defaults();

        // 6. Finally create the OllamaChatModel
        return new OllamaChatModel(
                ollamaApi,
                defaultOptions,
                toolCallingManager,
                observationRegistry,
                modelManagementOptions
        );
    }
}

4. Create the AiController

package com.example.demo.controller;

import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class AiController {

    // Inject the OllamaChatModel used to call the Ollama model
    @Autowired
    private OllamaChatModel ollamaChatModel;

    /**
     * Simple text chat endpoint (passes the message as a plain string).
     * @param msg the user's message
     * @return the text content returned by the model
     */
    @GetMapping("/ollama/chat/msg")
    public String ollamaChat(@RequestParam String msg) {
        return ollamaChatModel.call(msg);
    }

    /**
     * Chat endpoint based on a Prompt object.
     * @param msg the user's message
     * @return the model's result (an object that carries the full response details)
     */
    @GetMapping("/ollama/chat/prompt")
    public Object ollamaChatV2(@RequestParam String msg) {
        Prompt prompt = new Prompt(msg);
        return ollamaChatModel.call(prompt);
    }


}

5. Edit the startup class (optional)

package com.example.demo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SpringAi01Application {

    public static void main(String[] args) {
        SpringApplication.run(SpringAi01Application.class, args);
        System.out.println("http://localhost:8080/ollama/chat/msg?msg=你好");
    }

}

6. Start the project and test it

1. First, make sure your local Ollama service is running.

2. Then start the Spring Boot project.

3. Finally, open the following URL in a browser:

http://localhost:8080/ollama/chat/msg?msg=<your question here>

For example: http://localhost:8080/ollama/chat/msg?msg=你好
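Besides the browser, the endpoint can also be called from plain Java. A minimal sketch using only the JDK's built-in HttpClient; the base URL and port match the configuration above, and `buildChatUrl` is a small helper introduced here to percent-encode the query so non-ASCII questions survive the URL:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class ChatClient {

    // Build the request URL, percent-encoding the message parameter
    static String buildChatUrl(String baseUrl, String msg) {
        return baseUrl + "/ollama/chat/msg?msg="
                + URLEncoder.encode(msg, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        String url = buildChatUrl("http://localhost:8080", "你好");
        System.out.println(url);

        // Uncomment to call the running application:
        // HttpClient client = HttpClient.newHttpClient();
        // HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
        // HttpResponse<String> response =
        //         client.send(request, HttpResponse.BodyHandlers.ofString());
        // System.out.println(response.body());
    }
}
```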

And that is the simplest way to integrate a local Ollama LLM with Spring Boot.

 

