package config

v1.29.4
Published: Feb 17, 2026 License: Apache-2.0 Imports: 8 Imported by: 0

Documentation

Constants

const (
	DefaultBaseURL      = "http://localhost:11434/v1"
	DefaultModel        = "qwen3"
	DefaultCacheTTLDays = 30
)

Variables

This section is empty.

Functions

func ExampleTOML

func ExampleTOML() string

ExampleTOML returns a commented config file suitable for writing as a starter config. The file is not written automatically; it is offered to the user on demand.
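
One way to offer it on demand is simply to print it for the user to redirect; a minimal sketch, assuming the placeholder import path example.com/micasa/config (not the package's real module path):

// Sketch: print the starter config so the user can redirect it to a file.
// "example.com/micasa/config" is a placeholder import path.
package main

import (
	"fmt"

	"example.com/micasa/config"
)

func main() {
	fmt.Print(config.ExampleTOML())
}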

func Path

func Path() string

Path returns the expected config file path (XDG_CONFIG_HOME/micasa/config.toml).
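
A hedged sketch of combining Path and ExampleTOML to lay down a starter config when none exists yet; the writeStarterConfig helper, the file permissions, and the import path are illustrative assumptions:

// Sketch: write the starter config to the default path if none exists.
// "example.com/micasa/config" is a placeholder import path.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"

	"example.com/micasa/config"
)

func writeStarterConfig() error {
	p := config.Path() // XDG_CONFIG_HOME/micasa/config.toml
	if _, err := os.Stat(p); err == nil {
		return fmt.Errorf("config already exists at %s", p)
	}
	if err := os.MkdirAll(filepath.Dir(p), 0o755); err != nil {
		return err
	}
	return os.WriteFile(p, []byte(config.ExampleTOML()), 0o644)
}

func main() {
	if err := writeStarterConfig(); err != nil {
		log.Fatal(err)
	}
}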

Types

type Config

type Config struct {
	LLM       LLM       `toml:"llm"`
	Documents Documents `toml:"documents"`
}

Config is the top-level application configuration, loaded from a TOML file.

func Load

func Load() (Config, error)

Load reads the TOML config file from the default path if it exists, falls back to defaults for any unset fields, and applies environment variable overrides last.
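
A minimal usage sketch, again with a placeholder import path; per the description above, a missing file is not an error, so unset fields simply keep their defaults:

// Sketch: load configuration at startup.
// "example.com/micasa/config" is a placeholder import path.
package main

import (
	"fmt"
	"log"

	"example.com/micasa/config"
)

func main() {
	cfg, err := config.Load()
	if err != nil {
		log.Fatalf("load config: %v", err)
	}
	// Anything the file and environment left unset keeps its default.
	fmt.Println("LLM base URL:", cfg.LLM.BaseURL) // default http://localhost:11434/v1
	fmt.Println("LLM model:", cfg.LLM.Model)      // default qwen3
}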

func LoadFromPath

func LoadFromPath(path string) (Config, error)

LoadFromPath reads the TOML config file at the given path if it exists, falls back to defaults for any unset fields, and applies environment variable overrides last.
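
This is useful when the caller accepts an explicit config path, for example via a hypothetical -config flag that falls back to Path:

// Sketch: honor a -config flag, defaulting to Path().
// "example.com/micasa/config" is a placeholder import path.
package main

import (
	"flag"
	"fmt"
	"log"

	"example.com/micasa/config"
)

func main() {
	path := flag.String("config", config.Path(), "path to config.toml")
	flag.Parse()

	cfg, err := config.LoadFromPath(*path)
	if err != nil {
		log.Fatalf("load config from %s: %v", *path, err)
	}
	fmt.Println("using model:", cfg.LLM.Model)
}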

type Documents added in v1.27.0

type Documents struct {
	// MaxFileSize is the largest file (in bytes) that can be imported as a
	// document attachment. Default: 50 MiB.
	MaxFileSize int64 `toml:"max_file_size"`

	// CacheTTLDays is the number of days an extracted document cache entry
	// is kept before being evicted on the next startup. Set to 0 to disable
	// eviction. Default: 30.
	CacheTTLDays int `toml:"cache_ttl_days"`
}

Documents holds settings for document attachments.
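
A sketch of how an importer might enforce MaxFileSize before accepting an attachment; the file name, the check itself, and the import path are illustrative, not part of this package:

// Sketch: reject attachments larger than the configured limit.
// "example.com/micasa/config" is a placeholder import path.
package main

import (
	"fmt"
	"log"
	"os"

	"example.com/micasa/config"
)

func main() {
	cfg, err := config.Load()
	if err != nil {
		log.Fatal(err)
	}
	info, err := os.Stat("manual.pdf") // hypothetical attachment
	if err != nil {
		log.Fatal(err)
	}
	if info.Size() > cfg.Documents.MaxFileSize {
		fmt.Println("file too large to import as a document attachment")
	}
}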

type LLM

type LLM struct {
	// BaseURL is the root of an OpenAI-compatible API.
	// The client appends /chat/completions, /models, etc.
	// Default: http://localhost:11434/v1 (Ollama)
	BaseURL string `toml:"base_url"`

	// Model is the model identifier passed in chat requests.
	// Default: qwen3
	Model string `toml:"model"`

	// ExtraContext is custom text appended to all system prompts.
	// Useful for domain-specific details: house style, currency, location, etc.
	// Optional; defaults to empty.
	ExtraContext string `toml:"extra_context"`
}

LLM holds settings for the local LLM inference backend.
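
The doc comment above says the client appends /chat/completions, /models, and so on to BaseURL; the sketch below derives those endpoints the same way, though the package's actual client code may differ:

// Sketch: derive OpenAI-compatible endpoints from the configured base URL.
// The URL joining here is an illustration; "example.com/micasa/config" is a
// placeholder import path.
package main

import (
	"fmt"
	"log"
	"strings"

	"example.com/micasa/config"
)

func main() {
	cfg, err := config.Load()
	if err != nil {
		log.Fatal(err)
	}
	base := strings.TrimRight(cfg.LLM.BaseURL, "/")
	fmt.Println("chat endpoint:", base+"/chat/completions")
	fmt.Println("models endpoint:", base+"/models")
	fmt.Println("model:", cfg.LLM.Model)
}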
