
Using the Ollama API in Golang

This article describes how to use the Ollama API in Golang and is designed to help developers get up to speed and take full advantage of Ollama's capabilities. Ollama itself is written in Golang; the Golang client code lives in the official repository at https://github.com/ollama/ollama/tree/main/api, and the official API documentation is available at https://pkg.go.dev/github.com/ollama/ollama/api. By studying this document, you can easily integrate Ollama into your own projects.

The official repository provides some example code at https://github.com/ollama/ollama/tree/main/api/examples; the code below references and modifies these examples. All examples are also available in notebook/C4/Golang_API_example.

 

Environment Preparation

Before you start, make sure your development environment meets the following conditions:

  1. A Golang development environment. Check your Golang version with go version; this article uses go1.23.6.

     See https://golang.google.cn/doc/install for installation instructions.

  2. Create the project directory and initialize it:

mkdir ollama-demo && cd ollama-demo
go mod init ollama-demo

  3. Install the dependency (a quick connectivity check is sketched right after this list):

go get github.com/ollama/ollama/api
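
Before moving on, it can help to verify that a local Ollama server is actually reachable from Go. The following is a minimal sketch, assuming Ollama is running on the default address (or whatever the OLLAMA_HOST environment variable points to, which api.ClientFromEnvironment reads); it uses the client's Heartbeat method for the check.

package main

import (
    "context"
    "fmt"
    "log"

    "github.com/ollama/ollama/api"
)

func main() {
    // ClientFromEnvironment builds a client from OLLAMA_HOST
    // (it falls back to the local default address when unset).
    client, err := api.ClientFromEnvironment()
    if err != nil {
        log.Fatal(err)
    }

    // Heartbeat returns an error if the server cannot be reached.
    if err := client.Heartbeat(context.Background()); err != nil {
        log.Fatal("Ollama server is not reachable: ", err)
    }
    fmt.Println("Ollama server is up")
}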

 

Usage

Create a directory chat and create a file main.go in it with the following content (the example uses the deepseek-r1:7b model):

package main

import (
    "context"
    "fmt"
    "log"

    "github.com/ollama/ollama/api"
)

func main() {
    client, err := api.ClientFromEnvironment()
    if err != nil {
        log.Fatal(err)
    }

    messages := []api.Message{
        api.Message{
            Role:    "user",
            Content: "Why is the sky blue?",
        },
    }

    ctx := context.Background()
    req := &api.ChatRequest{
        Model:    "deepseek-r1:7b",
        Messages: messages,
    }

    // respFunc is called for each streamed chunk of the response.
    respFunc := func(resp api.ChatResponse) error {
        fmt.Print(resp.Message.Content)
        return nil
    }

    err = client.Chat(ctx, req, respFunc)
    if err != nil {
        log.Fatal(err)
    }
}

Run: go run chat/main.go
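
The Chat API is conversational: each api.Message carries a Role ("system", "user", or "assistant"), so a multi-turn exchange is just a longer Messages slice that includes the earlier replies. The following is a minimal sketch of a follow-up turn; the assistant reply shown in the slice is made up for illustration, not real model output.

package main

import (
    "context"
    "fmt"
    "log"

    "github.com/ollama/ollama/api"
)

func main() {
    client, err := api.ClientFromEnvironment()
    if err != nil {
        log.Fatal(err)
    }

    // Earlier turns are passed back in Messages so the model keeps the
    // conversational context for the new question.
    messages := []api.Message{
        {Role: "user", Content: "Why is the sky blue?"},
        {Role: "assistant", Content: "Because of Rayleigh scattering of sunlight in the atmosphere."},
        {Role: "user", Content: "Explain that in one sentence for a child."},
    }

    req := &api.ChatRequest{
        Model:    "deepseek-r1:7b",
        Messages: messages,
    }

    respFunc := func(resp api.ChatResponse) error {
        fmt.Print(resp.Message.Content)
        return nil
    }

    if err := client.Chat(context.Background(), req, respFunc); err != nil {
        log.Fatal(err)
    }
    fmt.Println()
}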

 

Streaming Output

Create a directory generate-streaming and create a file main.go in it with the following content:

package main

import (
    "context"
    "fmt"
    "log"

    "github.com/ollama/ollama/api"
)

func main() {
    client, err := api.ClientFromEnvironment()
    if err != nil {
        log.Fatal(err)
    }

    // By default, GenerateRequest is streaming.
    req := &api.GenerateRequest{
        Model:  "deepseek-r1:7b",
        Prompt: "Why is the sky blue?",
    }

    ctx := context.Background()
    respFunc := func(resp api.GenerateResponse) error {
        // Only print the response here; GenerateResponse has a number of other
        // interesting fields you may want to examine.
        //
        // In streaming mode, responses are partial so we call fmt.Print (and not
        // Println) in order to avoid spurious newlines being introduced. The
        // model will insert its own newlines if it wants.
        fmt.Print(resp.Response)
        return nil
    }

    err = client.Generate(ctx, req, respFunc)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println()
}

Run: go run generate-streaming/main.go
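
If you want one complete response instead of incremental chunks, the Stream field of GenerateRequest (a *bool) can be pointed at false; the callback is then invoked once with the full text. The following is a minimal sketch of that variant, using the same deepseek-r1:7b model; the same Stream trick is used with ChatRequest in the structured output example below.

package main

import (
    "context"
    "fmt"
    "log"

    "github.com/ollama/ollama/api"
)

func main() {
    client, err := api.ClientFromEnvironment()
    if err != nil {
        log.Fatal(err)
    }

    stream := false
    req := &api.GenerateRequest{
        Model:  "deepseek-r1:7b",
        Prompt: "Why is the sky blue?",
        Stream: &stream, // disable streaming: one callback with the whole response
    }

    respFunc := func(resp api.GenerateResponse) error {
        // With streaming disabled this runs once, so Println is fine here.
        fmt.Println(resp.Response)
        return nil
    }

    if err := client.Generate(context.Background(), req, respFunc); err != nil {
        log.Fatal(err)
    }
}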


 

Structured Output

Create a directory structured_output and create a file main.go in it with the following content:

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "log"
    "strings"

    "github.com/ollama/ollama/api"
)

type CountryInfo struct {
    Capital    string  `json:"capital"`
    Population float64 `json:"population"`
    Area       float64 `json:"area"`
}

func main() {
    client, err := api.ClientFromEnvironment()
    if err != nil {
        log.Fatal(err)
    }

    messages := []api.Message{
        api.Message{
            Role:    "user",
            Content: "Please present information about the capital, population, and land area of the United States and return it in JSON format.",
        },
    }

    ctx := context.Background()
    req := &api.ChatRequest{
        Model:    "deepseek-r1:7b",
        Messages: messages,
        Stream:   new(bool),        // *bool pointing at false: disable streaming
        Format:   []byte(`"json"`), // ask the model to reply in JSON
        Options: map[string]interface{}{
            "temperature": 0.0, // deterministic output
        },
    }

    respFunc := func(resp api.ChatResponse) error {
        fmt.Printf("%s\n", strings.TrimSpace(resp.Message.Content))
        var info CountryInfo
        err := json.Unmarshal([]byte(resp.Message.Content), &info)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Printf("Capital: %s, Population: %f, Area: %f", info.Capital, info.Population, info.Area)
        return nil
    }

    err = client.Chat(ctx, req, respFunc)
    if err != nil {
        log.Fatal(err)
    }
}

By setting the Format field of the ChatRequest to []byte(`"json"`), the model is asked to return structured data, which can then be parsed with json.Unmarshal.

Run: go run structured_output/main.go. The following output is obtained:

{"capital": "Washington, D.C.", "population": 3.672e6, "area": 7.058e6}
Capital: Washington, D.C., Population: 3672000.000000, Area: 7058000.000000
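
Newer Ollama releases can also accept a full JSON schema in the Format field (which is a json.RawMessage), constraining the output more tightly than the plain "json" value. Whether this works depends on the Ollama version you have installed, so treat the following as a sketch rather than guaranteed behavior; the schema fields simply mirror the CountryInfo struct above.

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "log"

    "github.com/ollama/ollama/api"
)

func main() {
    client, err := api.ClientFromEnvironment()
    if err != nil {
        log.Fatal(err)
    }

    // A JSON schema describing the expected object.
    schema := json.RawMessage(`{
        "type": "object",
        "properties": {
            "capital":    {"type": "string"},
            "population": {"type": "number"},
            "area":       {"type": "number"}
        },
        "required": ["capital", "population", "area"]
    }`)

    req := &api.ChatRequest{
        Model: "deepseek-r1:7b",
        Messages: []api.Message{
            {Role: "user", Content: "Describe the capital, population, and land area of the United States."},
        },
        Stream: new(bool), // disable streaming
        Format: schema,    // constrain the output to the schema
    }

    respFunc := func(resp api.ChatResponse) error {
        fmt.Println(resp.Message.Content)
        return nil
    }

    if err := client.Chat(context.Background(), req, respFunc); err != nil {
        log.Fatal(err)
    }
}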