ollama

package
v0.1.1
Published: Mar 12, 2026 License: MIT Imports: 10 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Client

type Client struct {
	// contains filtered or unexported fields
}

Client is an Ollama LLM client.

Ollama allows you to run large language models locally on your machine. This client connects to a local Ollama server (default: localhost:11434) and provides the same interface as cloud-based providers.

No API key is required. The client uses a longer default timeout (60s) since local models may take more time to generate responses.

func New

func New(config Config) (*Client, error)

New creates a new Ollama client.

func (*Client) Chat

func (c *Client) Chat(ctx context.Context, messages []core.Message, opts ...core.Option) (*core.Response, error)

Chat performs a non-streaming chat completion.
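A minimal sketch of a non-streaming call. The import paths below are placeholders for this module's actual layout, and the Role/Content field names on core.Message are assumptions — the core package's fields are not shown in these docs.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"example.com/llm/base"   // hypothetical import paths; substitute
	"example.com/llm/core"   // this module's real package paths
	"example.com/llm/ollama"
)

func main() {
	// No API key is needed; the client talks to a local Ollama server.
	client, err := ollama.New(ollama.Config{
		Config: base.Config{
			Model:   "llama2",
			BaseURL: "http://localhost:11434",
		},
	})
	if err != nil {
		log.Fatal(err)
	}

	// Role and Content are assumed field names on core.Message.
	messages := []core.Message{
		{Role: "user", Content: "Why is the sky blue?"},
	}

	resp, err := client.Chat(context.Background(), messages)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%+v\n", resp)
}
```

Because local models can be slow, the 60s default timeout applies unless the context passed to Chat expires first.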

func (*Client) ChatStream

func (c *Client) ChatStream(ctx context.Context, messages []core.Message, opts ...core.Option) (*core.Stream, error)

ChatStream performs a streaming chat completion.
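A sketch of streaming consumption. The docs do not show how core.Stream is consumed, so the Recv-style loop below is an assumption modeled on common Go streaming APIs; check the core package for the actual method.

```go
// streamReply prints chunks from a streaming completion as they arrive.
// client is an *ollama.Client; the stream.Recv() call and the chunk's
// Content field are hypothetical — verify against core.Stream's API.
func streamReply(ctx context.Context, client *ollama.Client, messages []core.Message) error {
	stream, err := client.ChatStream(ctx, messages)
	if err != nil {
		return err
	}

	for {
		chunk, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			return nil // stream finished normally
		}
		if err != nil {
			return err
		}
		fmt.Print(chunk.Content)
	}
}
```

Cancelling ctx is the usual way to abandon a stream early; any in-flight request to the local server is then aborted.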

type Config

type Config struct {
	base.Config
}

Config defines configuration for the Ollama client.

Ollama is for running LLMs locally. It doesn't require an API key.

Example:

config := ollama.Config{
    Config: base.Config{
        Model:   "llama2",
        BaseURL: "http://localhost:11434",
    },
}
client, err := ollama.New(config)
