Ollama Java Library

Ollama Quick Start: Ollama Java Library

The spring-ai-ollama module provides the simplest way to integrate Ollama into a Java project.

Add the Project Dependency

Add the spring-ai-ollama dependency to your project's Maven pom.xml file:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-ollama</artifactId>
</dependency>

Alternatively, add the following to your Gradle build file build.gradle:

dependencies {
    implementation 'org.springframework.ai:spring-ai-ollama'
}
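
Note that the snippets above do not pin a version. A minimal sketch of importing the Spring AI BOM in pom.xml so the spring-ai-ollama version is managed centrally (the ${spring-ai.version} property is a placeholder; set it to the release or milestone you actually use):

<dependencyManagement>
    <dependencies>
        <!-- Placeholder version: replace ${spring-ai.version} with the release you use -->
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>${spring-ai.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>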

Usage

Next, create an OllamaChatClient instance and use it for text generation requests:

var ollamaApi = new OllamaApi();

// Configure the client with default options (model and sampling temperature)
var chatClient = new OllamaChatClient(ollamaApi)
        .withDefaultOptions(OllamaOptions.create()
                .withModel(OllamaOptions.DEFAULT_MODEL)
                .withTemperature(0.9f));

ChatResponse response = chatClient.call(
        new Prompt("Generate the names of 5 famous pirates."));
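
The response object wraps the generated text. A minimal sketch of reading it back, assuming the Generation/AssistantMessage accessors of the same Spring AI milestone as above:

// Extract the generated text from the ChatResponse
// (accessor names assume the Spring AI milestone used above).
String generatedText = response.getResult().getOutput().getContent();
System.out.println(generatedText);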

Streaming Responses

Response streaming is enabled by building the request with withStream(true) and calling the streaming variant of the chat API:

var ollamaApi = new OllamaApi();

// Streaming request
var request = ChatRequest.builder("orca-mini")
    .withStream(true) // streaming
    .withMessages(List.of(Message.builder(Role.USER)
        .withContent("What is the capital of Bulgaria and what is the size? " + "What it the national anthem?")
        .build()))
    .withOptions(OllamaOptions.create().withTemperature(0.9f).toMap())
    .build();

Flux<ChatResponse> streamingResponse = ollamaApi.streamingChat(request);
