ollama

package
v1.2.0 Latest
Published: Feb 15, 2026 License: Apache-2.0 Imports: 10 Imported by: 0

Documentation

Overview

Package ollama provides Ollama LLM provider integration for local development.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Provider

type Provider struct {
	providers.BaseProvider
	// contains filtered or unexported fields
}

Provider implements the Provider interface for Ollama.

func NewProvider

func NewProvider(
	id, model, baseURL string,
	defaults providers.ProviderDefaults,
	includeRawOutput bool,
	additionalConfig map[string]any,
) *Provider

NewProvider creates a new Ollama provider.

func (*Provider) CalculateCost

func (p *Provider) CalculateCost(tokensIn, tokensOut, cachedTokens int) types.CostInfo

CalculateCost calculates the cost breakdown: Ollama is free (local inference).
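Since Ollama performs local inference, the documented behavior is a zeroed cost breakdown regardless of token counts. A minimal runnable sketch of that behavior, using a local stand-in for types.CostInfo (the field names here are assumptions, not the package's actual API):

```go
package main

import "fmt"

// CostInfo is a local stand-in for types.CostInfo; field names are assumed.
type CostInfo struct {
	InputCost  float64
	OutputCost float64
	TotalCost  float64
}

// calculateCost mirrors the documented behavior: local inference is free,
// so every component of the breakdown is zero no matter the token counts.
func calculateCost(tokensIn, tokensOut, cachedTokens int) CostInfo {
	return CostInfo{} // zero value: $0 input, $0 output, $0 total
}

func main() {
	cost := calculateCost(1500, 300, 0)
	fmt.Printf("in=%.2f out=%.2f total=%.2f\n",
		cost.InputCost, cost.OutputCost, cost.TotalCost)
	// prints: in=0.00 out=0.00 total=0.00
}
```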

func (*Provider) GetMultimodalCapabilities

func (p *Provider) GetMultimodalCapabilities() providers.MultimodalCapabilities

GetMultimodalCapabilities returns Ollama's multimodal capabilities.

func (*Provider) Model added in v1.1.8

func (p *Provider) Model() string

Model returns the model name/identifier used by this provider.

func (*Provider) Predict

Predict sends a predict request to Ollama.

func (*Provider) PredictMultimodal

PredictMultimodal performs a predict request with multimodal content.

func (*Provider) PredictMultimodalStream

func (p *Provider) PredictMultimodalStream(
	ctx context.Context,
	req providers.PredictionRequest,
) (<-chan providers.StreamChunk, error)

PredictMultimodalStream performs a streaming predict request with multimodal content.

func (*Provider) PredictStream

func (p *Provider) PredictStream(
	ctx context.Context,
	req providers.PredictionRequest,
) (<-chan providers.StreamChunk, error)

PredictStream streams a predict response from Ollama.
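Both streaming methods return a receive-only channel of providers.StreamChunk, so a caller typically ranges over the channel until the provider closes it. A runnable sketch of that consumption pattern, with a local stand-in chunk type (the Content and Err fields are assumptions for illustration):

```go
package main

import "fmt"

// StreamChunk is a local stand-in for providers.StreamChunk;
// the Content and Err fields are assumptions, not the real API.
type StreamChunk struct {
	Content string
	Err     error
}

// fakeStream simulates PredictStream: it emits chunks on a channel
// from a goroutine and closes the channel when the response is done.
func fakeStream(parts []string) <-chan StreamChunk {
	out := make(chan StreamChunk)
	go func() {
		defer close(out)
		for _, p := range parts {
			out <- StreamChunk{Content: p}
		}
	}()
	return out
}

func main() {
	var full string
	// Range exits when the producer closes the channel.
	for chunk := range fakeStream([]string{"Hello", ", ", "world"}) {
		if chunk.Err != nil {
			fmt.Println("stream error:", chunk.Err)
			return
		}
		full += chunk.Content
	}
	fmt.Println(full) // prints: Hello, world
}
```

Closing the channel from the producer side is what lets the `range` loop terminate cleanly; a real caller would also watch `ctx.Done()` for cancellation.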

type ToolProvider

type ToolProvider struct {
	*Provider
}

ToolProvider extends Provider with tool support.

func NewToolProvider

func NewToolProvider(
	id, model, baseURL string,
	defaults providers.ProviderDefaults,
	includeRawOutput bool,
	additionalConfig map[string]any,
) *ToolProvider

NewToolProvider creates a new Ollama provider with tool support.

func (*ToolProvider) BuildTooling

func (p *ToolProvider) BuildTooling(descriptors []*providers.ToolDescriptor) (any, error)

BuildTooling converts tool descriptors to Ollama format.
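Ollama's chat API accepts OpenAI-style function tools, so a conversion like BuildTooling plausibly wraps each descriptor in a `{"type":"function","function":{...}}` object. A hedged sketch with a local stand-in descriptor type (both the ToolDescriptor fields and the exact wire format the package emits are assumptions):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// ToolDescriptor is a local stand-in for providers.ToolDescriptor;
// its fields are assumptions for illustration.
type ToolDescriptor struct {
	Name        string
	Description string
	Parameters  map[string]any // JSON-Schema-style parameter spec
}

// buildTooling sketches the descriptor-to-Ollama conversion by wrapping
// each descriptor in an OpenAI-style function-tool object.
func buildTooling(descriptors []*ToolDescriptor) []map[string]any {
	tools := make([]map[string]any, 0, len(descriptors))
	for _, d := range descriptors {
		tools = append(tools, map[string]any{
			"type": "function",
			"function": map[string]any{
				"name":        d.Name,
				"description": d.Description,
				"parameters":  d.Parameters,
			},
		})
	}
	return tools
}

func main() {
	tools := buildTooling([]*ToolDescriptor{{
		Name:        "get_weather",
		Description: "Look up current weather",
		Parameters: map[string]any{
			"type": "object",
			"properties": map[string]any{
				"city": map[string]any{"type": "string"},
			},
		},
	}})
	out, _ := json.Marshal(tools)
	fmt.Println(string(out))
}
```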

func (*ToolProvider) PredictMultimodalWithTools

func (p *ToolProvider) PredictMultimodalWithTools(
	ctx context.Context,
	req providers.PredictionRequest,
	tools any,
	toolChoice string,
) (providers.PredictionResponse, []types.MessageToolCall, error)

PredictMultimodalWithTools implements the providers.MultimodalToolSupport interface for ToolProvider. This allows combining multimodal content (images) with tool calls in a single request.

func (*ToolProvider) PredictStreamWithTools

func (p *ToolProvider) PredictStreamWithTools(
	ctx context.Context,
	req providers.PredictionRequest,
	tools any,
	toolChoice string,
) (<-chan providers.StreamChunk, error)

PredictStreamWithTools performs a streaming predict request with tool support.

func (*ToolProvider) PredictWithTools

func (p *ToolProvider) PredictWithTools(
	ctx context.Context,
	req providers.PredictionRequest,
	tools any,
	toolChoice string,
) (providers.PredictionResponse, []types.MessageToolCall, error)

PredictWithTools performs a prediction request with tool support.
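PredictWithTools returns both a response and the tool calls the model requested; a caller usually executes each call and feeds the result back in a follow-up request. A runnable sketch of that dispatch step, with a local stand-in for types.MessageToolCall (the Name and Arguments fields are assumptions):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// MessageToolCall is a local stand-in for types.MessageToolCall;
// the Name and Arguments fields are assumptions for illustration.
type MessageToolCall struct {
	Name      string
	Arguments string // JSON-encoded argument object
}

// dispatch routes a returned tool call to a registered Go function,
// decoding the JSON arguments before invoking it.
func dispatch(call MessageToolCall, tools map[string]func(map[string]any) string) (string, error) {
	fn, ok := tools[call.Name]
	if !ok {
		return "", fmt.Errorf("unknown tool %q", call.Name)
	}
	var args map[string]any
	if err := json.Unmarshal([]byte(call.Arguments), &args); err != nil {
		return "", fmt.Errorf("decoding arguments: %w", err)
	}
	return fn(args), nil
}

func main() {
	tools := map[string]func(map[string]any) string{
		"get_weather": func(args map[string]any) string {
			return fmt.Sprintf("sunny in %v", args["city"])
		},
	}
	result, err := dispatch(MessageToolCall{
		Name:      "get_weather",
		Arguments: `{"city":"Oslo"}`,
	}, tools)
	if err != nil {
		panic(err)
	}
	fmt.Println(result) // prints: sunny in Oslo
}
```

In a real loop, the dispatch result would be appended to the conversation as a tool message and sent back through PredictWithTools until the model stops requesting calls.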
