Documentation ¶
Overview ¶
Package llm provides configuration types and public API for the thv llm command group, which bridges AI coding tools to OIDC-protected LLM gateways.
Two authentication modes are planned:
- Proxy mode: a localhost reverse proxy that injects fresh OIDC tokens for tools that only accept static API keys (e.g. Cursor).
- Token helper mode: thv llm token prints a fresh JWT to stdout, suitable for use as apiKeyHelper or auth.command in OIDC-capable tools (e.g. Claude Code).
Both modes are under active development; the corresponding CLI commands currently return not-implemented errors.
Configuration is persisted in ToolHive's config.yaml under the llm: key via the existing UpdateConfig() mechanism.
Index ¶
- Constants
- Variables
- func DeleteCachedTokens(ctx context.Context, provider pkgsecrets.Provider) error
- func DeriveSecretKey(gatewayURL, issuer string) string
- func PurgeTokens(ctx context.Context, errOut io.Writer, provider pkgsecrets.Provider)
- func SanitizeTokenError(err error) string
- func Setup(ctx context.Context, out, errOut io.Writer, gm GatewayManager, ...) error
- func Teardown(ctx context.Context, out, errOut io.Writer, gm GatewayManager, ...) error
- type Config
- type ConfigUpdater
- type GatewayManager
- type LoginFunc
- type OIDCConfig
- type ProxyConfig
- type SetOptions
- type TokenRefUpdater
- type TokenSource
- type ToolApplyConfig
- type ToolConfig
Constants ¶
const (
	// DefaultProxyListenPort is the default port the localhost proxy listens on.
	DefaultProxyListenPort = 14000
)
Variables ¶
var ErrTokenRequired = errors.New(
	"LLM gateway authentication required: no cached credentials found; " +
		"complete an interactive login first (\"thv llm setup\" — coming soon)",
)
ErrTokenRequired is returned when a fresh token is needed but no cached or refreshable token exists and the caller is non-interactive (browser flow disabled). The user must first complete an interactive login so that a refresh token is persisted for subsequent non-interactive calls.
Functions ¶
func DeleteCachedTokens ¶
func DeleteCachedTokens(ctx context.Context, provider pkgsecrets.Provider) error
DeleteCachedTokens removes all cached OIDC tokens stored under the LLM scope via the provided secrets provider. It is a no-op if the provider does not support listing or deletion (e.g. the environment provider), since such providers cannot hold cached tokens.
func DeriveSecretKey ¶
func DeriveSecretKey(gatewayURL, issuer string) string
DeriveSecretKey computes the secrets-provider key for an LLM gateway refresh token. The key has the form LLM_OAUTH_<8 hex chars>, where the hex digits are the first four bytes of sha256(gatewayURL + "\x00" + issuer), hex-encoded.
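The formula above can be sketched as follows. This is a minimal, standalone illustration of the documented derivation; the lowercase helper name and the example URLs are placeholders, not part of the package API. The NUL separator keeps concatenations of different input pairs from colliding.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// deriveSecretKey mirrors the documented formula: "LLM_OAUTH_" followed
// by the first four bytes of sha256(gatewayURL + "\x00" + issuer),
// hex-encoded (8 hex characters).
func deriveSecretKey(gatewayURL, issuer string) string {
	sum := sha256.Sum256([]byte(gatewayURL + "\x00" + issuer))
	return "LLM_OAUTH_" + hex.EncodeToString(sum[:4])
}

func main() {
	// Example inputs are illustrative only.
	fmt.Println(deriveSecretKey("https://gw.example.com", "https://issuer.example.com"))
}
```

Because the key is a pure function of gateway URL and issuer, the same gateway/issuer pair always maps to the same secret slot, while changing either input yields a different key.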
func PurgeTokens ¶ added in v0.26.0
func PurgeTokens(ctx context.Context, errOut io.Writer, provider pkgsecrets.Provider)
PurgeTokens deletes all cached OIDC tokens from the provided secrets provider. Errors are logged as warnings rather than returned.
func SanitizeTokenError ¶
func SanitizeTokenError(err error) string
SanitizeTokenError returns a log-safe string for a token-source error. If err wraps *oauth2.RetrieveError, only the error code and description are included — never the raw response body, which may contain bearer material echoed back by the IdP.
func Setup ¶ added in v0.26.0
func Setup(
	ctx context.Context,
	out, errOut io.Writer,
	gm GatewayManager,
	provider ConfigUpdater,
	login LoginFunc,
	inlineOpts SetOptions,
) error
Setup configures all detected AI tools to use the LLM gateway.
It applies inlineOpts in-memory before login so a failed login leaves no persisted state. Tool config files are patched only after login succeeds; on any persistence failure the patches are rolled back.
func Teardown ¶ added in v0.26.0
func Teardown(
	ctx context.Context,
	out, errOut io.Writer,
	gm GatewayManager,
	targetTool string,
	purgeTokens bool,
	provider ConfigUpdater,
	secretsProvider pkgsecrets.Provider,
) error
Teardown removes LLM gateway configuration from all (or one) configured tools.
targetTool selects which tool to revert; pass an empty string to revert all configured tools. An error is returned when targetTool is non-empty but not found in the configured tool list.
If secretsProvider is non-nil and purgeTokens is true, cached OIDC tokens are deleted after the config update succeeds.
Types ¶
type Config ¶
type Config struct {
	GatewayURL      string       `yaml:"gateway_url,omitempty" json:"gateway_url,omitempty"`
	OIDC            OIDCConfig   `yaml:"oidc,omitempty" json:"oidc,omitempty"`
	Proxy           ProxyConfig  `yaml:"proxy,omitempty" json:"proxy,omitempty"`
	ConfiguredTools []ToolConfig `yaml:"configured_tools,omitempty" json:"configured_tools,omitempty"`
}
Config holds all LLM gateway settings persisted under the llm: key in ToolHive's config.yaml.
func (*Config) EffectiveProxyPort ¶
EffectiveProxyPort returns the configured proxy listen port, or DefaultProxyListenPort if none is set.
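The fallback described above amounts to treating a zero port as "unset". A minimal sketch, using trimmed local copies of the types (the real method lives on this package's Config):

```go
package main

import "fmt"

const DefaultProxyListenPort = 14000

// Trimmed stand-ins for the package's types, for illustration only.
type ProxyConfig struct {
	ListenPort int
}

type Config struct {
	Proxy ProxyConfig
}

// EffectiveProxyPort sketches the documented fallback: a zero
// ListenPort means "not configured", so the default applies.
func (c *Config) EffectiveProxyPort() int {
	if c.Proxy.ListenPort != 0 {
		return c.Proxy.ListenPort
	}
	return DefaultProxyListenPort
}

func main() {
	var c Config
	fmt.Println(c.EffectiveProxyPort()) // falls back to the default, 14000
	c.Proxy.ListenPort = 15500
	fmt.Println(c.EffectiveProxyPort()) // prints 15500
}
```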
func (*Config) IsConfigured ¶
IsConfigured reports whether the minimum required fields are present for the LLM gateway to be usable: gateway URL, OIDC issuer, and OIDC client ID.
func (*Config) SetFields ¶
func (c *Config) SetFields(opts SetOptions) error
SetFields applies the non-zero fields from the provided options to the config and validates the result. If the mandatory trio (gateway_url, oidc.issuer, oidc.client_id) is present after the update, full validation runs; otherwise only format/range validation runs to catch bad values early while still allowing incremental configuration.
func (*Config) Show ¶
Show writes a human-readable representation of the config to w. If the config is not yet configured, it prints a hint to run "config set".
func (*Config) Validate ¶
Validate performs full validation of the LLM config, including HTTPS enforcement, port range checks, and OIDC field requirements.
func (*Config) ValidatePartial ¶
ValidatePartial validates any fields that are explicitly set, without requiring the mandatory trio (gateway_url, oidc.issuer, oidc.client_id). Use this to catch URL format or port range errors during incremental configuration, before all required fields have been provided.
type ConfigUpdater ¶ added in v0.26.0
type ConfigUpdater interface {
	// GetLLMConfig returns the current LLM section of the config.
	GetLLMConfig() Config
	// UpdateLLMConfig atomically reads, applies fn, and persists the LLM config.
	UpdateLLMConfig(fn func(*Config) error) error
}
ConfigUpdater is the subset of config.Provider used by Setup and Teardown. Defined here so pkg/llm does not import pkg/config.
type GatewayManager ¶ added in v0.26.0
type GatewayManager interface {
	// DetectedLLMGatewayClients returns tool names for all installed LLM-gateway-capable tools.
	DetectedLLMGatewayClients() []string
	// ConfigureLLMGateway patches the tool's config file and returns the config path.
	ConfigureLLMGateway(clientType string, cfg ToolApplyConfig) (string, error)
	// LLMGatewayModeFor returns "direct", "proxy", or "" for the given client.
	LLMGatewayModeFor(clientType string) string
	// RevertLLMGateway removes the LLM gateway settings from the tool's config file.
	RevertLLMGateway(clientType, configPath string) error
}
GatewayManager is the subset of client.ClientManager used by Setup and Teardown. Defined here so pkg/llm does not import pkg/client.
type LoginFunc ¶ added in v0.26.0
LoginFunc performs the interactive OIDC login during setup. It is a parameter so that tests can inject a no-op without touching the keyring.
type OIDCConfig ¶
type OIDCConfig = pkgoidc.ClientConfig
OIDCConfig is a type alias for oidc.ClientConfig, holding OIDC provider settings and cached token state for the LLM gateway. Using a type alias ensures this type stays in sync with pkg/config.RegistryOAuthConfig, which is also an alias for the same underlying type.
type ProxyConfig ¶
type ProxyConfig struct {
	ListenPort int `yaml:"listen_port,omitempty" json:"listen_port,omitempty"`
}
ProxyConfig holds configuration for the localhost reverse proxy.
type SetOptions ¶
type SetOptions struct {
	GatewayURL   string
	Issuer       string
	ClientID     string
	Audience     string
	ProxyPort    int
	CallbackPort int
}
SetOptions carries the flag values for the "config set" command. Zero values are treated as "not provided" and leave the existing config field unchanged.
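The "zero value means not provided" merge can be sketched as below. The helper name applyNonZero is illustrative (the package does this inside SetFields), and the structs are trimmed to two fields:

```go
package main

import "fmt"

// Trimmed stand-ins for illustration only.
type SetOptions struct {
	GatewayURL string
	ProxyPort  int
}

type Config struct {
	GatewayURL string
	ProxyPort  int
}

// applyNonZero sketches the documented merge semantics: a zero value
// means the corresponding flag was not provided, so the existing
// config field is left unchanged.
func applyNonZero(c *Config, opts SetOptions) {
	if opts.GatewayURL != "" {
		c.GatewayURL = opts.GatewayURL
	}
	if opts.ProxyPort != 0 {
		c.ProxyPort = opts.ProxyPort
	}
}

func main() {
	c := Config{GatewayURL: "https://old.example.com", ProxyPort: 14000}
	// Only the port flag was passed; the gateway URL is untouched.
	applyNonZero(&c, SetOptions{ProxyPort: 15500})
	fmt.Println(c.GatewayURL, c.ProxyPort) // prints https://old.example.com 15500
}
```

One consequence of this convention is that a field cannot be reset to its zero value through "config set"; clearing a value requires a different mechanism.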
type TokenRefUpdater ¶
type TokenRefUpdater = tokensource.ConfigPersister
TokenRefUpdater is a callback invoked when the refresh token changes — either after a successful browser flow (initial login) or when the OIDC provider rotates the refresh token during a refresh. It persists the secret key and the new token expiry into the application config so future CLI invocations can restore the session. It is NOT called on routine access-token refreshes where the refresh token is unchanged. Callers typically wire this to config.UpdateConfig.
type TokenSource ¶
type TokenSource = tokensource.OAuthTokenSource
TokenSource provides fresh LLM gateway access tokens.
func NewTokenSource ¶
func NewTokenSource(
	cfg *Config,
	secretsProvider secrets.Provider,
	interactive bool,
	tokenRefUpdater TokenRefUpdater,
) *TokenSource
NewTokenSource creates a TokenSource for the LLM gateway. secretsProvider may be nil if the secrets store is unavailable. tokenRefUpdater is called after login/refresh to persist the token reference into config — pass nil to skip config persistence (useful in tests). Set interactive to false for non-interactive callers such as thv llm token.
type ToolApplyConfig ¶ added in v0.26.0
type ToolApplyConfig struct {
	GatewayURL         string // direct-mode: URL of the upstream LLM gateway
	ProxyBaseURL       string // proxy-mode: URL of the localhost reverse proxy
	TokenHelperCommand string // direct-mode: shell command that prints a fresh token
}
ToolApplyConfig carries the values needed to configure a single tool's LLM gateway settings. Using a struct prevents positional-argument mistakes when the caller has multiple similar string values in scope.
type ToolConfig ¶
type ToolConfig struct {
	// Tool is the canonical tool identifier (e.g. "claude-code", "cursor").
	Tool string `yaml:"tool" json:"tool"`
	// Mode is the authentication mode: "direct" or "proxy".
	Mode string `yaml:"mode" json:"mode"`
	// ConfigPath is the absolute path to the tool's config file that was patched.
	ConfigPath string `yaml:"config_path" json:"config_path"`
}
ToolConfig records a tool that setup has configured, so teardown knows exactly what to reverse.