Documentation ¶
Index ¶
- Variables
- func CheckOllama(baseURL string) error
- func NewAnthropic(_ context.Context, modelName, apiKey, baseURL, thinkingLevel string, ...) (model.LLM, error)
- func NewGemini(ctx context.Context, modelName, baseURL string, extraHeaders map[string]string) (model.LLM, error)
- func NewLLM(ctx context.Context, info Info, apiKey, baseURL, thinkingLevel string, ...) (model.LLM, error)
- func NewOllama(_ context.Context, modelName, baseURL, thinkingLevel string, ...) (model.LLM, error)
- func NewOpenAI(_ context.Context, modelName, apiKey, baseURL string, ...) (model.LLM, error)
- func OllamaListModels(ctx context.Context, baseURL string) ([]string, error)
- type Info
Constants ¶
This section is empty.
Variables ¶
var OllamaModelPrefixes = []string{"qwen", "minimax", "deepseek", "llama", "mistral", "phi", "codellama", "gemma"}
OllamaModelPrefixes are common Ollama model name prefixes.
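A minimal sketch of the kind of detection this prefix list enables. The helper name `looksLikeOllamaModel` is hypothetical; the package does not export it:

```go
package main

import (
	"fmt"
	"strings"
)

// ollamaModelPrefixes mirrors the package's OllamaModelPrefixes variable.
var ollamaModelPrefixes = []string{"qwen", "minimax", "deepseek", "llama", "mistral", "phi", "codellama", "gemma"}

// looksLikeOllamaModel reports whether a model name starts with one of the
// common Ollama prefixes (case-insensitive). Hypothetical helper.
func looksLikeOllamaModel(name string) bool {
	lower := strings.ToLower(name)
	for _, p := range ollamaModelPrefixes {
		if strings.HasPrefix(lower, p) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(looksLikeOllamaModel("qwen2.5-coder:7b")) // true
	fmt.Println(looksLikeOllamaModel("gpt-4o"))           // false
}
```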
Functions ¶
func CheckOllama ¶ added in v0.0.4
func CheckOllama(baseURL string) error
CheckOllama verifies that the Ollama server at baseURL is reachable. It first checks TCP connectivity on the port, then issues a GET to the root endpoint (Ollama returns "Ollama is running").
func NewAnthropic ¶
func NewAnthropic(_ context.Context, modelName, apiKey, baseURL, thinkingLevel string, extraHeaders map[string]string) (model.LLM, error)
NewAnthropic creates an Anthropic model.LLM. If baseURL is non-empty, it overrides the default API endpoint. When baseURL is set, the API key is optional (for Ollama compatibility). thinkingLevel controls extended thinking: "none", "low", "medium", "high".
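The documented rules (key required unless baseURL is set; four valid thinking levels) could be pre-checked as below. `validateAnthropicConfig` is a hypothetical helper, not part of the package:

```go
package main

import (
	"errors"
	"fmt"
)

// validateAnthropicConfig sketches the documented rules: an API key is
// required unless a custom baseURL is set (the Ollama-compatibility case),
// and thinkingLevel must be one of the four documented values.
func validateAnthropicConfig(apiKey, baseURL, thinkingLevel string) error {
	if apiKey == "" && baseURL == "" {
		return errors.New("api key required when using the default endpoint")
	}
	switch thinkingLevel {
	case "none", "low", "medium", "high":
		return nil
	default:
		return fmt.Errorf("unknown thinking level %q", thinkingLevel)
	}
}

func main() {
	// Custom baseURL, no key: allowed for Ollama compatibility.
	fmt.Println(validateAnthropicConfig("", "http://localhost:11434", "none")) // <nil>
	// Default endpoint with no key: rejected.
	fmt.Println(validateAnthropicConfig("", "", "high"))
}
```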
func NewGemini ¶
func NewGemini(ctx context.Context, modelName, baseURL string, extraHeaders map[string]string) (model.LLM, error)
NewGemini creates a Gemini model.LLM using ADK Go's native Gemini support. It reads the API key from GOOGLE_API_KEY or GEMINI_API_KEY env vars. If neither is set, it falls back to Application Default Credentials. If baseURL is non-empty, it overrides the default API endpoint.
func NewLLM ¶
func NewLLM(ctx context.Context, info Info, apiKey, baseURL, thinkingLevel string, extraHeaders map[string]string) (model.LLM, error)
NewLLM creates a model.LLM for the given provider info, API key, optional base URL, thinking level, and extra headers.
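NewLLM implies a factory that routes on provider info to the constructors above. The sketch below illustrates that dispatch pattern only; the provider strings and the string return type stand in for the package's actual Info type and model.LLM interface:

```go
package main

import (
	"fmt"
	"strings"
)

// dispatch routes a provider name to a provider-specific constructor.
// Illustrative only: the real NewLLM takes an Info value and returns a
// model.LLM.
func dispatch(provider string) (string, error) {
	switch strings.ToLower(provider) {
	case "anthropic":
		return "anthropic client", nil
	case "gemini":
		return "gemini client", nil
	case "ollama":
		return "ollama client", nil
	case "openai":
		return "openai client", nil
	default:
		return "", fmt.Errorf("unsupported provider %q", provider)
	}
}

func main() {
	c, err := dispatch("Ollama")
	fmt.Println(c, err) // ollama client <nil>
}
```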
func NewOllama ¶ added in v0.0.7
func NewOllama(_ context.Context, modelName, baseURL, thinkingLevel string, extraHeaders map[string]string) (model.LLM, error)
NewOllama creates an Ollama model.LLM using the native Ollama Go client. baseURL defaults to http://localhost:11434 if empty. thinkingLevel controls extended thinking: "none", "low", "medium", "high".
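The baseURL defaulting described above amounts to the following (`resolveOllamaBaseURL` is a hypothetical helper, shown only to make the default explicit):

```go
package main

import "fmt"

// resolveOllamaBaseURL sketches the documented default: an empty baseURL
// resolves to the local Ollama endpoint.
func resolveOllamaBaseURL(baseURL string) string {
	if baseURL == "" {
		return "http://localhost:11434"
	}
	return baseURL
}

func main() {
	fmt.Println(resolveOllamaBaseURL(""))                     // http://localhost:11434
	fmt.Println(resolveOllamaBaseURL("http://gpu-box:11434")) // http://gpu-box:11434
}
```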