Documentation ¶
Index ¶
- Variables
- func BuildHTTPClient(opts *LLMOptions, timeout time.Duration) *http.Client
- func BuildTransport(opts *LLMOptions) http.RoundTripper
- func CheckOllama(baseURL string) error
- func NewAnthropic(_ context.Context, modelName, apiKey, baseURL, thinkingLevel string, ...) (model.LLM, error)
- func NewGemini(ctx context.Context, modelName, baseURL string, opts *LLMOptions) (model.LLM, error)
- func NewLLM(ctx context.Context, info Info, apiKey, baseURL, thinkingLevel string, ...) (model.LLM, error)
- func NewOllama(_ context.Context, modelName, baseURL, thinkingLevel string, opts *LLMOptions) (model.LLM, error)
- func NewOpenAI(_ context.Context, modelName, apiKey, baseURL string, llmOpts *LLMOptions) (model.LLM, error)
- func OllamaListModels(ctx context.Context, baseURL string) ([]string, error)
- type Info
- type LLMOptions
Constants ¶
This section is empty.
Variables ¶
var OllamaModelPrefixes = []string{"qwen", "minimax", "deepseek", "llama", "mistral", "phi", "codellama", "gemma"}
OllamaModelPrefixes are common Ollama model name prefixes.
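A plausible use of this variable is prefix matching to decide whether a model name is served by Ollama. The helper below is a hypothetical sketch, not part of the package; the case-insensitive comparison is an assumption.

```go
package main

import (
	"fmt"
	"strings"
)

// ollamaModelPrefixes mirrors the package's OllamaModelPrefixes variable.
var ollamaModelPrefixes = []string{"qwen", "minimax", "deepseek", "llama", "mistral", "phi", "codellama", "gemma"}

// looksLikeOllamaModel reports whether name starts with a known Ollama prefix.
func looksLikeOllamaModel(name string) bool {
	lower := strings.ToLower(name)
	for _, p := range ollamaModelPrefixes {
		if strings.HasPrefix(lower, p) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(looksLikeOllamaModel("llama3.2")) // true
	fmt.Println(looksLikeOllamaModel("gpt-4o"))   // false
}
```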
Functions ¶
func BuildHTTPClient ¶ added in v0.0.9
func BuildHTTPClient(opts *LLMOptions, timeout time.Duration) *http.Client
BuildHTTPClient creates an *http.Client with optional TLS skip, extra headers, and timeout. Returns a default client if no customization is needed.
func BuildTransport ¶ added in v0.0.9
func BuildTransport(opts *LLMOptions) http.RoundTripper
BuildTransport creates an http.Transport with optional TLS skip and extra headers. Returns nil if no customization is needed.
func CheckOllama ¶ added in v0.0.4
func CheckOllama(baseURL string) error
CheckOllama verifies that the Ollama server at baseURL is reachable. It first checks TCP connectivity on the port, then issues a GET to the root endpoint (Ollama returns "Ollama is running").
func NewAnthropic ¶
func NewAnthropic(_ context.Context, modelName, apiKey, baseURL, thinkingLevel string, llmOpts *LLMOptions) (model.LLM, error)
NewAnthropic creates an Anthropic model.LLM. If baseURL is non-empty, it overrides the default API endpoint. When baseURL is set, the API key is optional (for Ollama compatibility). thinkingLevel controls extended thinking: "none", "low", "medium", "high".
func NewGemini ¶
func NewGemini(ctx context.Context, modelName, baseURL string, opts *LLMOptions) (model.LLM, error)
NewGemini creates a Gemini model.LLM using ADK Go's native Gemini support. It reads the API key from GOOGLE_API_KEY or GEMINI_API_KEY env vars. If neither is set, it falls back to Application Default Credentials. If baseURL is non-empty, it overrides the default API endpoint.
func NewLLM ¶
func NewLLM(ctx context.Context, info Info, apiKey, baseURL, thinkingLevel string, opts *LLMOptions) (model.LLM, error)
NewLLM creates a model.LLM for the given provider info, API key, optional base URL, thinking level, and options.
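Given the constructors listed above, NewLLM presumably routes on Info: the Ollama flag selects NewOllama, otherwise the Provider string selects a constructor. The sketch below only returns the constructor name; the provider string values are assumptions, and the real function forwards ctx, keys, and options.

```go
package main

import "fmt"

// info mirrors the package's Info type.
type info struct {
	Provider string
	Model    string
	Ollama   bool
}

// constructorFor sketches the dispatch NewLLM likely performs.
func constructorFor(i info) (string, error) {
	if i.Ollama {
		return "NewOllama", nil
	}
	switch i.Provider {
	case "anthropic":
		return "NewAnthropic", nil
	case "gemini":
		return "NewGemini", nil
	case "openai":
		return "NewOpenAI", nil
	}
	return "", fmt.Errorf("unknown provider %q", i.Provider)
}

func main() {
	fmt.Println(constructorFor(info{Provider: "anthropic", Model: "claude"}))
}
```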
func NewOllama ¶ added in v0.0.7
func NewOllama(_ context.Context, modelName, baseURL, thinkingLevel string, opts *LLMOptions) (model.LLM, error)
NewOllama creates an Ollama model.LLM using the native Ollama Go client. baseURL defaults to http://localhost:11434 if empty. thinkingLevel controls extended thinking: "none", "low", "medium", "high".
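The documented default can be sketched as a one-line guard; this is an illustration of the stated behavior, not the package's source.

```go
package main

import "fmt"

// ollamaBaseURL applies the documented default when baseURL is empty.
func ollamaBaseURL(baseURL string) string {
	if baseURL == "" {
		return "http://localhost:11434"
	}
	return baseURL
}

func main() {
	fmt.Println(ollamaBaseURL("")) // http://localhost:11434
}
```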
Types ¶
type Info ¶
type Info struct {
Provider string
Model string
Ollama bool // true when model is served by Ollama
}
Info describes a provider and the model to use.
type LLMOptions ¶ added in v0.0.9
LLMOptions holds optional configuration for LLM provider creation.