Documentation ¶
Index ¶
- Variables
- func BuildHTTPClient(opts *LLMOptions, timeout time.Duration) *http.Client
- func BuildTransport(opts *LLMOptions) http.RoundTripper
- func CheckOllama(baseURL string) error
- func ContextWindowSize(modelName string) int64
- func NewAnthropic(_ context.Context, modelName, apiKey, baseURL, thinkingLevel string, ...) (model.LLM, error)
- func NewAzureOpenAI(_ context.Context, deploymentName, apiKey, endpoint, apiVersion string, ...) (model.LLM, error)
- func NewGemini(ctx context.Context, modelName, baseURL string, opts *LLMOptions) (model.LLM, error)
- func NewLLM(ctx context.Context, info Info, apiKey, baseURL, thinkingLevel string, ...) (model.LLM, error)
- func NewMistral(_ context.Context, modelName, apiKey, baseURL string, llmOpts *LLMOptions) (model.LLM, error)
- func NewOllama(_ context.Context, modelName, baseURL, thinkingLevel string, opts *LLMOptions) (model.LLM, error)
- func NewOpenAI(_ context.Context, modelName, apiKey, baseURL string, llmOpts *LLMOptions) (model.LLM, error)
- func OllamaListModels(ctx context.Context, baseURL string) ([]string, error)
- func ValidateModel(info Info) error
- type Info
- type LLMOptions
Constants ¶
This section is empty.
Variables ¶
var KnownModels = map[string][]string{
"anthropic": {
"claude-3-opus-20240229",
"claude-3-opus-latest",
"claude-3-sonnet-20240229",
"claude-3-sonnet-latest",
"claude-3-haiku-20240307",
"claude-3-haiku-latest",
"claude-3-5-sonnet-20240620",
"claude-3-5-sonnet-20241022",
"claude-3-5-sonnet-latest",
"claude-3-5-haiku-latest",
"claude-3-7-sonnet-20250219",
"claude-3-7-sonnet-latest",
"claude-opus-4-0",
"claude-sonnet-4-0",
"claude-opus-4-1-20250805",
"claude-sonnet-4-5",
"claude-haiku-4-5-20251001",
"claude-haiku-4-5",
"claude-opus-4-5-20251101",
"claude-opus-4-6",
"claude-sonnet-4-6",
"claude-opus-4-7",
},
"openai": {
"gpt-5.5",
"gpt-5.4-pro",
"gpt-5.4",
"gpt-5.4-mini",
"gpt-5.4-nano",
"gpt-5.3-codex",
"gpt-5.2-codex",
"gpt-5.1-codex",
"gpt-5.1-codex-mini",
"gpt-5.1-codex-max",
},
"gemini": {
"gemini-3.1-pro-preview",
"gemini-3.1-pro-preview-customtools",
"gemini-3-flash-preview",
"gemini-3.1-flash-lite-preview",
"gemini-2.5-pro",
"gemini-2.5-flash",
"gemini-2.5-flash-lite",
},
"mistral": {
"mistral-large-2512",
"mistral-large-latest",
"mistral-medium-2508",
"mistral-medium-latest",
"mistral-small-2603",
"mistral-small-latest",
"codestral",
"pixtral",
"ministral",
},
}
KnownModels lists recognized model names per provider. The check is prefix-based: a model is valid if it starts with any entry. Ollama models are not validated here (they are dynamic).
April 2026 source snapshot and update path:
- OpenAI: https://developers.openai.com/api/docs/models
- Anthropic: https://platform.claude.com/docs/en/about-claude/models/overview
- Gemini: https://ai.google.dev/gemini-api/docs/models
- Mistral: https://docs.mistral.ai/getting-started/models
To update: re-check those official model pages, keep the current top-tier model IDs per provider (max/large/medium/small or fast), and update contextWindowSizes in the same change.
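The prefix rule described above can be sketched as a standalone check. This is an illustration of the documented behavior, not the package's actual ValidateModel implementation; the abbreviated map below stands in for KnownModels:

```go
package main

import (
	"fmt"
	"strings"
)

// knownModels mirrors the shape of KnownModels above (abbreviated).
var knownModels = map[string][]string{
	"anthropic": {"claude-3-5-sonnet", "claude-sonnet-4-5"},
	"openai":    {"gpt-5.1-codex"},
}

// isKnown reports whether modelName starts with any recognized entry
// for the provider. Ollama models are skipped (always considered
// valid), matching the documented contract.
func isKnown(provider, modelName string) bool {
	if provider == "ollama" {
		return true
	}
	for _, prefix := range knownModels[provider] {
		if strings.HasPrefix(modelName, prefix) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(isKnown("openai", "gpt-5.1-codex-mini")) // prefix match → true
	fmt.Println(isKnown("anthropic", "gpt-4"))           // unknown for provider → false
	fmt.Println(isKnown("ollama", "qwen3:8b"))           // Ollama: always valid → true
}
```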
var OllamaModelPrefixes = []string{"qwen", "minimax", "deepseek", "llama", "phi", "codellama", "gemma"}
OllamaModelPrefixes are common Ollama model name prefixes.
Functions ¶
func BuildHTTPClient ¶ added in v0.0.9
func BuildHTTPClient(opts *LLMOptions, timeout time.Duration) *http.Client
BuildHTTPClient creates an *http.Client with optional TLS skip, extra headers, and timeout. Returns a default client if no customization is needed.
func BuildTransport ¶ added in v0.0.9
func BuildTransport(opts *LLMOptions) http.RoundTripper
BuildTransport creates an http.Transport with optional TLS skip and extra headers. Returns nil if no customization is needed.
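A minimal sketch of what such a transport could look like, assuming a wrapper RoundTripper for the extra headers (type and field names here are illustrative, not the package's internals):

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
)

// headerTransport wraps a base RoundTripper and adds extra headers to
// every outgoing request.
type headerTransport struct {
	base    http.RoundTripper
	headers map[string]string
}

func (t *headerTransport) RoundTrip(req *http.Request) (*http.Response, error) {
	// Clone before mutating, per the RoundTripper contract.
	req = req.Clone(req.Context())
	for k, v := range t.headers {
		req.Header.Set(k, v)
	}
	return t.base.RoundTrip(req)
}

// buildTransport returns nil when no customization is needed, matching
// the documented contract of BuildTransport.
func buildTransport(extraHeaders map[string]string, insecureSkipTLS bool) http.RoundTripper {
	if len(extraHeaders) == 0 && !insecureSkipTLS {
		return nil
	}
	base := http.DefaultTransport
	if insecureSkipTLS {
		base = &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		}
	}
	if len(extraHeaders) == 0 {
		return base
	}
	return &headerTransport{base: base, headers: extraHeaders}
}

func main() {
	fmt.Println(buildTransport(nil, false) == nil) // no customization → true
}
```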
func CheckOllama ¶ added in v0.0.4
func CheckOllama(baseURL string) error
CheckOllama verifies that the Ollama server at baseURL is reachable. It first checks TCP connectivity on the port, then issues a GET to the root endpoint (Ollama returns "Ollama is running").
func ContextWindowSize ¶ added in v0.0.18
func ContextWindowSize(modelName string) int64
ContextWindowSize returns the context window size for a model (in tokens). Returns 0 if the model is unknown. For Ollama models, returns 0 (unknown).
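The "0 for unknown" contract maps naturally onto Go's zero-value lookup; a sketch (the map below is illustrative, with a hypothetical value, not the package's actual contextWindowSizes table):

```go
package main

import "fmt"

// contextWindows is an illustrative stand-in for contextWindowSizes;
// the value below is hypothetical, for demonstration only.
var contextWindows = map[string]int64{
	"example-model": 200000,
}

// contextWindowSize returns 0 for unknown models (including Ollama
// models, which are never in the table), matching the documented contract.
func contextWindowSize(modelName string) int64 {
	return contextWindows[modelName] // a missing key yields the zero value, 0
}

func main() {
	fmt.Println(contextWindowSize("example-model"))  // → 200000
	fmt.Println(contextWindowSize("unknown-model"))  // → 0
}
```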
func NewAnthropic ¶
func NewAnthropic(_ context.Context, modelName, apiKey, baseURL, thinkingLevel string, llmOpts *LLMOptions) (model.LLM, error)
NewAnthropic creates an Anthropic model.LLM. If baseURL is non-empty, it overrides the default API endpoint. When baseURL is set, the API key is optional (for Ollama compatibility). thinkingLevel controls extended thinking: "none", "low", "medium", "high". llmOpts.AdvisorModel enables the advisor tool (beta). See specs/features/LLM/001-claude-models/advisor-tool.md
func NewAzureOpenAI ¶ added in v0.0.18
func NewAzureOpenAI(_ context.Context, deploymentName, apiKey, endpoint, apiVersion string, llmOpts *LLMOptions) (model.LLM, error)
NewAzureOpenAI creates an Azure OpenAI model.LLM. It uses AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, and OPENAI_API_VERSION (defaults to 2025-04-01-preview). The deploymentName is the Azure deployment name (not the model ID).
func NewGemini ¶
func NewGemini(ctx context.Context, modelName, baseURL string, opts *LLMOptions) (model.LLM, error)
NewGemini creates a Gemini model.LLM using ADK Go's native Gemini support. It reads the API key from GEMINI_API_KEY or GOOGLE_API_KEY env vars. If neither is set, it falls back to Application Default Credentials. If baseURL is non-empty, it overrides the default API endpoint.
func NewLLM ¶
func NewLLM(ctx context.Context, info Info, apiKey, baseURL, thinkingLevel string, opts *LLMOptions) (model.LLM, error)
NewLLM creates a model.LLM for the given provider info, API key, optional base URL, thinking level, and options.
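NewLLM presumably dispatches on the Info's provider to one of the constructors above; a simplified sketch of that shape (the "azure" key, the Ollama short-circuit, and the returned names are assumptions, and the real function calls the constructors directly with apiKey, baseURL, and options):

```go
package main

import "fmt"

type info struct {
	Provider string
	Model    string
	Ollama   bool
}

// pickConstructor sketches provider dispatch: it returns the name of
// the constructor that would handle this info, or an error for an
// unrecognized provider.
func pickConstructor(i info) (string, error) {
	if i.Ollama {
		return "NewOllama", nil
	}
	switch i.Provider {
	case "anthropic":
		return "NewAnthropic", nil
	case "openai":
		return "NewOpenAI", nil
	case "azure":
		return "NewAzureOpenAI", nil
	case "gemini":
		return "NewGemini", nil
	case "mistral":
		return "NewMistral", nil
	default:
		return "", fmt.Errorf("unknown provider %q", i.Provider)
	}
}

func main() {
	name, _ := pickConstructor(info{Provider: "gemini", Model: "gemini-2.5-pro"})
	fmt.Println(name) // → NewGemini
}
```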
func NewMistral ¶ added in v0.0.18
func NewMistral(_ context.Context, modelName, apiKey, baseURL string, llmOpts *LLMOptions) (model.LLM, error)
NewMistral creates a Mistral model.LLM. If baseURL is empty, the default Mistral API endpoint is used.
func NewOllama ¶ added in v0.0.7
func NewOllama(_ context.Context, modelName, baseURL, thinkingLevel string, opts *LLMOptions) (model.LLM, error)
NewOllama creates an Ollama model.LLM using the native Ollama Go client. baseURL defaults to http://localhost:11434 if empty. thinkingLevel controls extended thinking: "none", "low", "medium", "high".
func NewOpenAI ¶
func NewOpenAI(_ context.Context, modelName, apiKey, baseURL string, llmOpts *LLMOptions) (model.LLM, error)
NewOpenAI creates an OpenAI model.LLM. If baseURL is non-empty, it overrides the default API endpoint. When apiKey is recognized as a codex ChatGPT OAuth token and no explicit baseURL is provided, the client is pointed at the ChatGPT backend (/codex/responses) and the `chatgpt-account-id` and `originator` headers that pi-mono sends are injected. This is required because the platform /v1/responses endpoint rejects codex tokens with 401 "Missing scopes: api.responses.write".
func OllamaListModels ¶ added in v0.0.7
func OllamaListModels(ctx context.Context, baseURL string) ([]string, error)
OllamaListModels lists available models from the Ollama server.
func ValidateModel ¶ added in v0.0.18
func ValidateModel(info Info) error
ValidateModel checks whether the model name is recognized for its provider. Returns an error with suggestions if the model is unknown. Ollama models are always considered valid (they are pulled dynamically).
Types ¶
type Info ¶
type Info struct {
Provider string
Model string
Ollama bool // true when model is served by Ollama
}
Info describes a provider and the model to use.
type LLMOptions ¶ added in v0.0.9
type LLMOptions struct {
ExtraHeaders map[string]string
InsecureSkipTLS bool
AdvisorModel string // Advisor model (e.g., "claude-opus-4-7")
AdvisorMaxUses int // Max advisor calls per request (0 = unlimited)
AdvisorCaching bool // Enable ephemeral prompt caching for advisor
}
LLMOptions holds optional configuration for LLM provider creation.