
Using the Ollama API in Java

This article describes how to use the Ollama API in Java. It is designed to help developers get up to speed quickly and take full advantage of Ollama's capabilities. You can call the Ollama API directly in your application, or call Ollama through Spring AI components. By studying this document, you can easily integrate Ollama into your projects.

 

I. Environment Preparation

To use the Ollama API in Java, make sure you have the following environment and tools ready:

  • Java Development Kit (JDK): Install JDK version 1.8 or later.
  • Build tool: Maven or Gradle, for project dependency management.
  • HTTP client library: choose a suitable HTTP client library, such as Apache HttpClient or OkHttp.
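Before reaching for a third-party wrapper, it helps to see that Ollama exposes a plain REST API. The sketch below calls Ollama's `POST /api/generate` endpoint with the JDK's built-in HTTP client (which requires JDK 11+, newer than the 1.8 minimum above). It assumes an Ollama server listening on the default `localhost:11434`; if no server is running, the program simply reports that it could not connect.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal sketch of calling Ollama's REST API directly, with no dependencies.
public class OllamaDirect {

    // Builds the JSON body for Ollama's POST /api/generate endpoint.
    // "stream": false asks for a single complete response instead of a stream.
    static String buildGenerateRequest(String model, String prompt) {
        return "{\"model\": \"" + model + "\", \"prompt\": \"" + prompt
                + "\", \"stream\": false}";
    }

    public static void main(String[] args) {
        String body = buildGenerateRequest("llama2:latest", "Why is the sky blue?");
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/generate"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        try {
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body());
        } catch (Exception e) {
            // No Ollama server reachable; the request itself is still well-formed.
            System.out.println("Could not reach Ollama: " + e.getMessage());
        }
    }
}
```

In a real project you would build the JSON with a library such as Jackson rather than string concatenation; the hand-built body here is only to keep the example dependency-free.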

 

II. Direct use of Ollama

There are many third-party components on GitHub that make it easy to integrate Ollama into your application. Taking Asedem's OllamaJavaAPI as an example, integration takes the following three steps (Maven is used here for project management):

  1. Add the Ollama dependency in pom.xml

<repositories>
    <repository>
        <id>jitpack.io</id>
        <url>https://jitpack.io</url>
    </repository>
</repositories>
<dependencies>
    <dependency>
        <groupId>com.github.Asedem</groupId>
        <artifactId>OllamaJavaAPI</artifactId>
        <version>master-SNAPSHOT</version>
    </dependency>
</dependencies>
  2. Initialize Ollama
// By default, it will connect to localhost:11434
Ollama ollama = Ollama.initDefault();
// For custom values
Ollama ollama = Ollama.init("http://localhost", 11434);
  3. Use Ollama
  • Generating a response
String model = "llama2:latest"; // Specify the model
String prompt = "Why is the sky blue?"; // Provide a prompt
GenerationResponse response = ollama.generate(new GenerationRequest(model, prompt));
System.out.println(response.response()); // Print the generated response
  • Listing local models
List<Model> models = ollama.listModels(); // Returns a list of Model objects
  • Displaying model information
ModelInfo modelInfo = ollama.showInfo("llama2:latest"); // Returns a ModelInfo object
  • Copying a model
boolean success = ollama.copy("llama2:latest", "llama2-backup"); // true if copying succeeded
  • Deleting a model
boolean deleted = ollama.delete("llama2-backup"); // true if deletion succeeded
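The library methods above are thin wrappers over Ollama's REST endpoints (listing models maps to `GET /api/tags`, copying to `POST /api/copy`, deletion to `DELETE /api/delete`). As a rough sketch of what a `listModels()` call does under the hood, the snippet below extracts model names from a hardcoded sample of the `/api/tags` JSON response. A real client would use a JSON library; a regex is used here only to keep the example dependency-free.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: parse model names out of an Ollama GET /api/tags response.
public class ListModelsSketch {

    // Pulls every "name" field value out of the tags JSON.
    static List<String> extractModelNames(String tagsJson) {
        List<String> names = new ArrayList<>();
        Matcher m = Pattern.compile("\"name\"\\s*:\\s*\"([^\"]+)\"").matcher(tagsJson);
        while (m.find()) {
            names.add(m.group(1));
        }
        return names;
    }

    public static void main(String[] args) {
        // Sample of the response shape returned by http://localhost:11434/api/tags
        String sample = "{\"models\": ["
                + "{\"name\": \"llama2:latest\", \"size\": 3825819519},"
                + "{\"name\": \"llama3.1:latest\", \"size\": 4661224676}]}";
        System.out.println(extractModelNames(sample)); // [llama2:latest, llama3.1:latest]
    }
}
```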

 

III. Calling Ollama with Spring AI

Introduction to Spring AI

Spring AI is an application framework designed for artificial intelligence engineering. The core features are listed below:

  • Cross-provider API support: Spring AI provides a portable set of APIs that supports interaction with chat, text-to-image, and embedding models from multiple AI service providers.
  • Synchronous and streaming API options: the framework supports both synchronous and streaming APIs, giving developers flexible ways to interact with models.
  • Model-specific feature access: developers can access model-specific features via configuration parameters, for more granular control.

Using Spring AI

  1. Add the Spring AI dependency in pom.xml

<dependencies>
    <dependency>
        <groupId>io.springboot.ai</groupId>
        <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
        <version>1.0.3</version>
    </dependency>
</dependencies>

 


Note: When creating the project in IDEA, you can specify the dependencies directly and the IDE will complete the pom.xml file automatically, with no manual editing needed, as shown in the following figure:

Using the Ollama API in Java-1

 

  2. Add the Spring AI and Ollama configuration to your Spring Boot application's configuration file. Example:

spring:
  ai:
    ollama:
      base-url: http://localhost:11434
      chat:
        options:
          model: llama3.1:latest
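If you prefer `application.properties` over YAML, the equivalent configuration (assuming the starter uses the standard Spring AI property names) would be:

```properties
spring.ai.ollama.base-url=http://localhost:11434
spring.ai.ollama.chat.options.model=llama3.1:latest
```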
  3. Use Ollama for text generation or dialog:

First create a Spring Boot controller to call the Ollama API:

import jakarta.annotation.Resource;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaOptions;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class OllamaController {

    @Resource
    private OllamaChatModel ollamaChatModel;

    @RequestMapping(value = "/ai/ollama")
    public Object ollama(@RequestParam(value = "msg") String msg) {
        ChatResponse chatResponse = ollamaChatModel.call(new Prompt(msg, OllamaOptions.create()
                .withModel("llama3.1:latest") // specify which model to use
                .withTemperature(0.5F)));
        System.out.println(chatResponse.getResult().getOutput().getContent());
        return chatResponse.getResult().getOutput().getContent();
    }
}

Then run the application and open http://localhost:8080/ai/ollama?msg=<your prompt> in your browser. The result is shown below:

Using the Ollama API in Java-2

 

Source: Chief AI Sharing Circle, "Using the Ollama API in Java". May not be reproduced without permission.
