ollama

package
v1.3.30 Latest
Warning

This package is not in the latest version of its module.

Published: Apr 4, 2026 License: Apache-2.0 Imports: 11 Imported by: 1

Documentation

Overview

Package ollama provides Ollama LLM provider integration for local development.

Index

Constants

const (
	// DefaultEmbeddingModel is the recommended model for general-purpose embeddings.
	DefaultEmbeddingModel = "nomic-embed-text"

	// EmbeddingModelNomicText is the default model (768 dimensions, ~270MB).
	EmbeddingModelNomicText = "nomic-embed-text"

	// EmbeddingModelMxbaiLarge is a larger model for higher quality (1024 dimensions).
	EmbeddingModelMxbaiLarge = "mxbai-embed-large"

	// EmbeddingModelAllMiniLM is a small, fast model (384 dimensions).
	EmbeddingModelAllMiniLM = "all-minilm"
)

Embedding model constants.

const DefaultOllamaURL = "http://localhost:11434"

DefaultOllamaURL is the default Ollama server URL.

Variables

This section is empty.

Functions

This section is empty.

Types

type EmbeddingOption added in v1.3.30

type EmbeddingOption func(*EmbeddingProvider)

EmbeddingOption configures the EmbeddingProvider.

func WithEmbeddingBaseURL added in v1.3.30

func WithEmbeddingBaseURL(url string) EmbeddingOption

WithEmbeddingBaseURL sets a custom Ollama server URL.

func WithEmbeddingDimensions added in v1.3.30

func WithEmbeddingDimensions(dims int) EmbeddingOption

WithEmbeddingDimensions overrides the default dimensions for the model. Use this for custom or fine-tuned models with non-standard dimensions.

func WithEmbeddingHTTPClient added in v1.3.30

func WithEmbeddingHTTPClient(client *http.Client) EmbeddingOption

WithEmbeddingHTTPClient sets a custom HTTP client.

func WithEmbeddingModel added in v1.3.30

func WithEmbeddingModel(model string) EmbeddingOption

WithEmbeddingModel sets the embedding model.

type EmbeddingProvider added in v1.3.30

type EmbeddingProvider struct {
	*providers.BaseEmbeddingProvider
}

EmbeddingProvider implements embedding generation via the Ollama API. No API key is needed; Ollama runs locally or on a private network.

func NewEmbeddingProvider added in v1.3.30

func NewEmbeddingProvider(opts ...EmbeddingOption) *EmbeddingProvider

NewEmbeddingProvider creates an Ollama embedding provider. Ollama runs locally, so no API key is required.

func (*EmbeddingProvider) Embed added in v1.3.30

Embed generates embeddings for the given texts.

type Provider

type Provider struct {
	providers.BaseProvider
	// contains filtered or unexported fields
}

Provider implements the Provider interface for Ollama.

func NewProvider

func NewProvider(
	id, model, baseURL string,
	defaults providers.ProviderDefaults,
	includeRawOutput bool,
	additionalConfig map[string]any,
) *Provider

NewProvider creates a new Ollama provider.

func (*Provider) CalculateCost

func (p *Provider) CalculateCost(tokensIn, tokensOut, cachedTokens int) types.CostInfo

CalculateCost calculates the cost breakdown. Ollama performs local inference, so it is free.

func (*Provider) GetMultimodalCapabilities

func (p *Provider) GetMultimodalCapabilities() providers.MultimodalCapabilities

GetMultimodalCapabilities returns Ollama's multimodal capabilities.

func (*Provider) Model added in v1.1.8

func (p *Provider) Model() string

Model returns the model name/identifier used by this provider.

func (*Provider) Predict

Predict sends a prediction request to Ollama.

func (*Provider) PredictStream

func (p *Provider) PredictStream(
	ctx context.Context,
	req providers.PredictionRequest,
) (<-chan providers.StreamChunk, error)

PredictStream streams a prediction response from Ollama.

type ToolProvider

type ToolProvider struct {
	*Provider
}

ToolProvider extends Provider with tool support.

func NewToolProvider

func NewToolProvider(
	id, model, baseURL string,
	defaults providers.ProviderDefaults,
	includeRawOutput bool,
	additionalConfig map[string]any,
) *ToolProvider

NewToolProvider creates a new Ollama provider with tool support.

func (*ToolProvider) BuildTooling

func (p *ToolProvider) BuildTooling(descriptors []*providers.ToolDescriptor) (any, error)

BuildTooling converts tool descriptors to Ollama's tool format.

func (*ToolProvider) PredictStreamWithTools

func (p *ToolProvider) PredictStreamWithTools(
	ctx context.Context,
	req providers.PredictionRequest,
	tools any,
	toolChoice string,
) (<-chan providers.StreamChunk, error)

PredictStreamWithTools performs a streaming prediction request with tool support.

func (*ToolProvider) PredictWithTools

func (p *ToolProvider) PredictWithTools(
	ctx context.Context,
	req providers.PredictionRequest,
	tools any,
	toolChoice string,
) (providers.PredictionResponse, []types.MessageToolCall, error)

PredictWithTools performs a prediction request with tool support.
