Documentation
Overview
Package cache implements a configurable response cache for agents.
The cache maps a normalized user question to a previously produced agent response, so asking the same question again (as judged by the configured normalization rules) returns the stored answer without invoking the model.
Two storage backends are supported:
- an in-memory map (the default), which keeps entries for the lifetime of the process;
- a JSON-file backed store, which persists entries to disk so they survive restarts.
File-backed caches are concurrent-write-safe across processes. Every Cache.Store takes an exclusive advisory lock on a sibling `<path>.lock` file (POSIX flock / Windows LockFileEx), reloads the current on-disk state under the lock, merges its entry, and writes back atomically via a temp file + rename. Two processes simultaneously caching different keys both see their writes preserved; the lock serializes the read-modify-write window so neither can clobber the other. Cache.Lookup reloads the in-memory map when the file's mtime has advanced since its last load, so cross-process writes become visible without a restart.
Two normalization options are exposed:
- case sensitivity: when disabled (the default), questions are compared case-insensitively;
- blank trimming: when enabled, leading and trailing whitespace is stripped before comparison.
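The two options compose into a single key-normalization step. A minimal sketch of how they could combine (the function name normalizeKey and its signature are illustrative, not the package's actual internals):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeKey sketches how the two options could produce the cache
// key for a question: optional whitespace trimming, then optional
// case folding. A hypothetical helper, not the package's real API.
func normalizeKey(q string, caseSensitive, trimSpaces bool) string {
	if trimSpaces {
		q = strings.TrimSpace(q)
	}
	if !caseSensitive {
		q = strings.ToLower(q)
	}
	return q
}

func main() {
	// Defaults: case-insensitive comparison, whitespace preserved.
	fmt.Println(normalizeKey("Hello", false, false))   // hello
	fmt.Println(normalizeKey("  Hello  ", false, true)) // hello
}
```

With both options at their defaults, "  Hello  " and "hello" remain distinct keys because the surrounding whitespace is preserved.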
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type Cache
type Cache struct {
// contains filtered or unexported fields
}
Cache stores agent responses keyed on the user's question. It is safe for concurrent use by multiple goroutines, and — when configured with a Path — by multiple processes (see the package doc).
func New
New builds a Cache from the given Config. It returns (nil, nil) when caching is disabled, allowing callers to short-circuit with a simple nil check.
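The (nil, nil) return lets callers skip caching with a single check. A self-contained sketch of the contract (Config here is reduced to the Enabled field, and the Cache internals are stubbed for illustration):

```go
package main

import "fmt"

// Config and Cache mirror the package's exported shapes; New is a
// stub showing only the disabled-cache short-circuit described above.
type Config struct {
	Enabled bool
}

type Cache struct{ entries map[string]string }

func New(cfg Config) (*Cache, error) {
	if !cfg.Enabled {
		return nil, nil // caching disabled: no cache, no error
	}
	return &Cache{entries: make(map[string]string)}, nil
}

func main() {
	c, err := New(Config{Enabled: false})
	if err != nil {
		panic(err)
	}
	if c == nil {
		fmt.Println("caching disabled; invoking the model directly")
	}
}
```

Because a nil *Cache signals "disabled" rather than an error, the caller's hot path needs no separate enabled flag.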
func (*Cache) Lookup
Lookup returns the stored response for the given question and a boolean indicating whether the question was found.
When the cache is file-backed and the file has been modified since our last load (typically by another process), the in-memory state is reloaded before the lookup so cross-process writes are visible.
func (*Cache) Store
Store records the response for the given question, replacing any existing entry with the same normalized key. Storing the same (question, response) pair twice is a no-op and skips the file rewrite — useful when an agent's stop hook re-fires with the same content (e.g. a cache-replay turn).
When the cache is file-backed, the operation is serialized across processes via an advisory lock on a sibling .lock file: we take the lock, reload the on-disk state, merge our entry, write atomically (temp file + fsync + rename), and release. This guarantees that two processes simultaneously storing different keys both see their writes preserved on disk; in-process callers serialize via c.mu.
type Config
type Config struct {
// Enabled toggles the cache on or off. When false, New returns nil.
Enabled bool `json:"enabled,omitempty" yaml:"enabled,omitempty"`
// CaseSensitive controls whether the question matching is
// case-sensitive. The default (false) means "Hello" and "hello"
// are treated as the same question.
CaseSensitive bool `json:"case_sensitive,omitempty" yaml:"case_sensitive,omitempty"`
// TrimSpaces controls whether leading and trailing whitespace is
// removed from questions before they are compared. The default
// (false) preserves whitespace.
TrimSpaces bool `json:"trim_spaces,omitempty" yaml:"trim_spaces,omitempty"`
// Path, when non-empty, selects a JSON-file backed cache stored at
// the given path. When empty, the cache lives only in memory.
Path string `json:"path,omitempty" yaml:"path,omitempty"`
}
Config describes how a Cache should normalize keys and where it should store entries.
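For reference, a YAML fragment enabling a trimmed, case-insensitive, file-backed cache might look like the following (the path and any enclosing key depend on the embedding application):

```yaml
enabled: true
case_sensitive: false            # default; "Hello" matches "hello"
trim_spaces: true                # strip leading/trailing whitespace
path: /var/lib/agent/cache.json  # file-backed; omit for in-memory only
```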