Documentation
Index
Constants
const (
	DefaultBaseURL      = "http://localhost:11434/v1"
	DefaultModel        = "qwen3"
	DefaultCacheTTLDays = 30
)
Variables
This section is empty.
Functions
func ExampleTOML
func ExampleTOML() string
ExampleTOML returns a commented config file suitable for writing as a starter config. It is not written automatically; it is offered to the user on demand.
Types
type Config
Config is the top-level application configuration, loaded from a TOML file.
func Load
Load reads the TOML config file from the default path if it exists, falls back to defaults for any unset fields, and applies environment variable overrides last.
func LoadFromPath
LoadFromPath reads the TOML config file at the given path if it exists, falls back to defaults for any unset fields, and applies environment variable overrides last.
type Documents (added in v1.27.0)
type Documents struct {
	// MaxFileSize is the largest file (in bytes) that can be imported as a
	// document attachment. Default: 50 MiB.
	MaxFileSize int64 `toml:"max_file_size"`

	// CacheTTLDays is the number of days an extracted document cache entry
	// is kept before being evicted on the next startup. Set to 0 to disable
	// eviction. Default: 30.
	CacheTTLDays int `toml:"cache_ttl_days"`
}
Documents holds settings for document attachments.
type LLM
type LLM struct {
	// BaseURL is the root of an OpenAI-compatible API.
	// The client appends /chat/completions, /models, etc.
	// Default: http://localhost:11434/v1 (Ollama)
	BaseURL string `toml:"base_url"`

	// Model is the model identifier passed in chat requests.
	// Default: qwen3
	Model string `toml:"model"`

	// ExtraContext is custom text appended to all system prompts.
	// Useful for domain-specific details: house style, currency, location, etc.
	// Optional; defaults to empty.
	ExtraContext string `toml:"extra_context"`
}
LLM holds settings for the local LLM inference backend.
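The BaseURL comment says the client appends paths such as /chat/completions; a minimal sketch of that joining, tolerant of a trailing slash on the base (the endpoint helper is an illustration, not this package's API):

```go
package main

import (
	"fmt"
	"strings"
)

// endpoint joins an API path onto the configured base URL, trimming any
// trailing slash so the result never contains "//" at the join point.
func endpoint(baseURL, path string) string {
	return strings.TrimRight(baseURL, "/") + path
}

func main() {
	base := "http://localhost:11434/v1" // the package default (Ollama)
	fmt.Println(endpoint(base, "/chat/completions")) // http://localhost:11434/v1/chat/completions
	fmt.Println(endpoint(base+"/", "/models"))       // http://localhost:11434/v1/models
}
```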