Documentation ¶
Overview ¶
Package llm provides a shared layer for calling LLM backends (OpenAI, Ollama, Gemini). Use it from Explore AI, the AI response module, or any new module that needs to call an LLM.
Example (new module or one-off call):
	cfg := llm.Config{
		Provider:  llm.ProviderOpenAI,
		MaxTokens: 150,
	}
	key, err := llm.GetAPIKey(cfg.Provider)
	if err != nil {
		return err // or report "skipped: " + err.Error()
	}
	content, err := llm.Call(ctx, cfg, key, []llm.Message{{Role: "user", Content: prompt}})
	if err != nil {
		return err
	}
	// use content ...
Index ¶
Constants ¶
const (
	DefaultMaxTokensOpenAI = 500
	DefaultMaxTokensOllama = 500
	DefaultMaxTokensGemini = 500
)
Default max tokens per provider when Config.MaxTokens <= 0.
const DefaultTimeout = 20 * time.Second
DefaultTimeout is the default timeout for API calls.
Variables ¶
var ErrMissingAPIKey = errors.New("llm: API key required but not set")
ErrMissingAPIKey is returned by GetAPIKey when the provider requires an API key but none is set.
Functions ¶
func Call ¶
func Call(ctx context.Context, cfg Config, apiKey string, messages []Message) (string, error)
Call sends the messages to the configured provider and returns the assistant's reply content. apiKey must be set for OpenAI and Gemini; for Ollama it may be empty.
func GetAPIKey ¶
func GetAPIKey(provider Provider) (string, error)
GetAPIKey returns the API key for the given provider from the environment: for OpenAI, OPENAI_API_KEY; for Gemini, GEMINI_API_KEY or GOOGLE_API_KEY; for Ollama, OLLAMA_API_KEY (optional). It returns ErrMissingAPIKey when the provider requires a key and none is set.
func IsMissingKeyError ¶
func IsMissingKeyError(err error) bool
IsMissingKeyError reports whether err indicates that the API key was not set.
Types ¶
type Config ¶
type Config struct {
	Provider  Provider
	Endpoint  string        // optional; e.g. http://localhost:11434 for Ollama
	Model     string        // optional; e.g. gpt-4o-mini, llama3.1, gemini-1.5-flash
	MaxTokens int           // max tokens to generate; if <= 0, the provider default is used
	Timeout   time.Duration // per-call timeout; see DefaultTimeout
}
Config configures an LLM call. Endpoint and Model can be empty to use built-in defaults.
type Provider ¶
type Provider string
Provider identifies the LLM backend.
func NormalizeProviderFromString ¶
func NormalizeProviderFromString(s string) Provider
NormalizeProviderFromString normalizes a provider string (trims whitespace, lowercases; the empty string maps to openai) and returns the corresponding Provider. Use it when reading the provider from configuration (e.g. in modules) to avoid duplicating normalization logic.