provider

package
v0.0.7 Latest
Published: Mar 18, 2026 License: MIT Imports: 23 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

var OllamaModelPrefixes = []string{"qwen", "minimax", "deepseek", "llama", "mistral", "phi", "codellama", "gemma"}

OllamaModelPrefixes are common Ollama model name prefixes.

Functions

func CheckOllama added in v0.0.4

func CheckOllama(baseURL string) error

CheckOllama verifies that the Ollama server at baseURL is reachable. It first checks TCP connectivity on the port, then issues a GET to the root endpoint (Ollama returns "Ollama is running").

func NewAnthropic

func NewAnthropic(_ context.Context, modelName, apiKey, baseURL, thinkingLevel string) (model.LLM, error)

NewAnthropic creates an Anthropic model.LLM. If baseURL is non-empty, it overrides the default API endpoint. When baseURL is set, the API key is optional (for Ollama compatibility). thinkingLevel controls extended thinking: "none", "low", "medium", "high".

func NewGemini

func NewGemini(ctx context.Context, modelName, baseURL string) (model.LLM, error)

NewGemini creates a Gemini model.LLM using ADK Go's native Gemini support. It reads the API key from GOOGLE_API_KEY or GEMINI_API_KEY env vars. If neither is set, it falls back to Application Default Credentials. If baseURL is non-empty, it overrides the default API endpoint.

func NewLLM

func NewLLM(ctx context.Context, info Info, apiKey, baseURL, thinkingLevel string) (model.LLM, error)

NewLLM creates a model.LLM for the given provider info, API key, optional base URL, and thinking level.

func NewOllama added in v0.0.7

func NewOllama(_ context.Context, modelName, baseURL, thinkingLevel string) (model.LLM, error)

NewOllama creates an Ollama model.LLM using the native Ollama Go client. baseURL defaults to http://localhost:11434 if empty. thinkingLevel controls extended thinking: "none", "low", "medium", "high".
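A minimal sketch of the documented defaulting and level handling. The helper is hypothetical, and whether the real constructor rejects an unknown thinkingLevel (rather than, say, ignoring it) is an assumption:

```go
package main

import "fmt"

// ollamaDefaults applies the documented default base URL and validates
// thinkingLevel against the four documented values. Hypothetical helper;
// error-on-invalid-level is an assumption.
func ollamaDefaults(baseURL, thinkingLevel string) (string, error) {
	if baseURL == "" {
		baseURL = "http://localhost:11434"
	}
	switch thinkingLevel {
	case "none", "low", "medium", "high":
		return baseURL, nil
	default:
		return "", fmt.Errorf("invalid thinking level %q", thinkingLevel)
	}
}

func main() {
	u, _ := ollamaDefaults("", "medium")
	fmt.Println(u) // http://localhost:11434
}
```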

func NewOpenAI

func NewOpenAI(_ context.Context, modelName, apiKey, baseURL string) (model.LLM, error)

NewOpenAI creates an OpenAI model.LLM. If baseURL is non-empty, it overrides the default API endpoint.

func OllamaListModels added in v0.0.7

func OllamaListModels(ctx context.Context, baseURL string) ([]string, error)

OllamaListModels lists available models from the Ollama server.

Types

type Info

type Info struct {
	Provider string
	Model    string
	Ollama   bool // true when model is served by Ollama
}

Info describes a provider and the model to use.

func Resolve

func Resolve(modelName string) (Info, error)

Resolve determines the provider from a model name. Ollama models are routed to the native "ollama" provider.
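The Ollama routing above can be sketched with the package's published prefix list. The non-Ollama mappings shown here (claude→anthropic, gpt→openai, gemini→gemini) are illustrative assumptions, not the package's actual table, and `resolveProvider` is a hypothetical stand-in for Resolve:

```go
package main

import (
	"fmt"
	"strings"
)

// Mirrors the exported OllamaModelPrefixes variable documented above.
var ollamaModelPrefixes = []string{"qwen", "minimax", "deepseek", "llama", "mistral", "phi", "codellama", "gemma"}

// resolveProvider routes model names with a known Ollama prefix to the
// native "ollama" provider; the remaining cases are assumed mappings.
func resolveProvider(modelName string) string {
	name := strings.ToLower(modelName)
	for _, p := range ollamaModelPrefixes {
		if strings.HasPrefix(name, p) {
			return "ollama"
		}
	}
	switch {
	case strings.HasPrefix(name, "claude"):
		return "anthropic"
	case strings.HasPrefix(name, "gpt"):
		return "openai"
	case strings.HasPrefix(name, "gemini"):
		return "gemini"
	}
	return "unknown"
}

func main() {
	fmt.Println(resolveProvider("qwen2:7b"))        // ollama
	fmt.Println(resolveProvider("claude-sonnet-4")) // anthropic
}
```

Note the order matters: Ollama prefixes are checked first, so "gemma" models route to Ollama while "gemini" models do not (the prefixes don't overlap).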
