package model

Documentation

Overview

Package model provides LLM provider configuration and client implementations.

Configuration Reference

Model configuration can be specified via CLI flags, environment variables, the config file, or the native OS keystore (API keys only).

Resolution Order

For each field, the first source with a value wins (a minimal sketch follows this list):

  1. CLI flags (highest priority)
  2. Environment variables
  3. Config file (~/.config/devlore/config.yaml)
  4. Native keystore (API key only, lowest priority)
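
A minimal sketch of the first-source-wins rule. The firstNonEmpty helper and the flags and fileCfg variables are hypothetical stand-ins, not part of this package:

// Illustrative only: firstNonEmpty, flags, and fileCfg are hypothetical.
func firstNonEmpty(vals ...string) string {
	for _, v := range vals {
		if v != "" {
			return v
		}
	}
	return ""
}

// Resolving the model field: the first source with a value wins.
model := firstNonEmpty(
	flags.Model,         // 1. CLI flag (--model)
	os.Getenv(EnvModel), // 2. environment variable (DEVLORE_MODEL)
	fileCfg.Model,       // 3. config file (model.model)
)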

Configuration Fields

| Field    | CLI Flag         | Environment Variable   | Config Key     | Keystore         |
|----------|------------------|------------------------|----------------|------------------|
| model    | --model          | DEVLORE_MODEL          | model.model    | —                |
| api-key  | --model-api-key  | DEVLORE_MODEL_API_KEY  | model.api-key  | Account=provider |
| endpoint | --model-endpoint | DEVLORE_MODEL_ENDPOINT | model.endpoint | —                |
| provider | --model-provider | DEVLORE_MODEL_PROVIDER | model.provider | —                |

Provider Field

The provider field determines:

  • API protocol: Anthropic API vs OpenAI-compatible API (different request/response formats)
  • Authentication method: "Authorization: Bearer" header vs "api-key" header (Azure)
  • Default endpoint: Each provider has a different default API URL
  • Keystore account: API keys are stored under Service=com.noblefactor.DevLore, Account=<provider>
  • Client implementation: Which Provider implementation to instantiate

Supported providers: anthropic, gemini, github, groq, ollama, openai

Native Keystore

API keys are stored in the native OS keystore:

  • macOS: Keychain (security command)
  • Linux: libsecret (secret-tool command)
  • Windows: Credential Manager (PowerShell)

Keystore entry format:

Service: com.noblefactor.DevLore
Account: <provider>  (e.g., "anthropic", "openai")
Key:     <api-key>
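
For illustration only, a key can be written to the macOS Keychain by shelling out to the security command. This sketch is not an API of this package; it assumes "log" and "os/exec" are imported and apiKey holds the secret:

// Sketch: store an Anthropic API key under the DevLore service entry.
// -U updates the entry in place if it already exists.
cmd := exec.Command("security", "add-generic-password",
	"-s", "com.noblefactor.DevLore", // Service
	"-a", "anthropic",               // Account = provider name
	"-w", apiKey,                    // the key itself
	"-U")
if err := cmd.Run(); err != nil {
	log.Fatalf("keychain write failed: %v", err)
}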

Config File Example

# ~/.config/devlore/config.yaml
model:
  model: claude-sonnet-4-20250514
  provider: anthropic

API keys should be stored in the native keystore, not the config file.

CLI Usage Examples

# Using environment variables
DEVLORE_MODEL_PROVIDER=github DEVLORE_MODEL_API_KEY=$(gh auth token) writ migrate --dry-run ~/dotfiles

# Using CLI flags
writ --model-provider=github --model-api-key=$(gh auth token) migrate --dry-run ~/dotfiles

Provider Interface

The Provider interface is the single abstraction for all AI/LLM backends. Callers interact only through this interface—they don't need to know whether the underlying provider is Anthropic, OpenAI, Ollama, etc.

type Provider interface {
    Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)
    Name() string       // Provider name (also keystore account)
    Model() string      // Model identifier
    Endpoint() string   // API endpoint (empty = default)
    Available(ctx context.Context) bool
}

The interface exposes queryable attributes (provider, model, endpoint) but never exposes the API key for security.
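
A minimal sketch of such a caller (runQuery is an illustrative name, not part of this package):

// runQuery depends only on the Provider interface, never on a concrete type.
func runQuery(ctx context.Context, p Provider, prompt string) (string, error) {
	if !p.Available(ctx) {
		return "", fmt.Errorf("provider %q is not available", p.Name())
	}
	resp, err := p.Chat(ctx, ChatRequest{
		Messages: []Message{{Role: RoleUser, Content: prompt}},
	})
	if err != nil {
		return "", err
	}
	return resp.Content, nil
}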

Supported Providers

| Provider  | Default Endpoint                                 | Auth Header           |
|-----------|--------------------------------------------------|-----------------------|
| anthropic | https://api.anthropic.com/v1/messages            | x-api-key             |
| gemini    | https://generativelanguage.googleapis.com/v1beta | ?key= query param     |
| github    | https://models.inference.ai.azure.com            | Authorization: Bearer |
| groq      | https://api.groq.com/openai/v1                   | Authorization: Bearer |
| ollama    | http://localhost:11434                           | (none)                |
| openai    | https://api.openai.com/v1                        | Authorization: Bearer |

See config.go for full configuration reference including CLI flags, environment variables, config file format, and resolution order.

Index

Constants

View Source
const (
	EnvModel         = "DEVLORE_MODEL"
	EnvModelAPIKey   = "DEVLORE_MODEL_API_KEY"
	EnvModelEndpoint = "DEVLORE_MODEL_ENDPOINT"
	EnvModelProvider = "DEVLORE_MODEL_PROVIDER"
)

Environment variables for model configuration (sorted alphabetically).

Variables

This section is empty.

Functions

func ConfigPath

func ConfigPath() (string, error)

ConfigPath returns the path to the devlore config file.

func SaveConfig

func SaveConfig(cfg *Config) error

SaveConfig saves model configuration to the config file. API keys are stored in the native keystore, not the config file.
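
A sketch of the intended flow (the provider and model values are examples; "log" is assumed imported):

// Persist a provider/model choice; the API key is handled by the
// keystore, not written to disk.
cfg := DefaultConfig()
cfg.Provider = "anthropic"
cfg.Model = "claude-sonnet-4-20250514"
if err := SaveConfig(&cfg); err != nil {
	log.Fatal(err)
}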

Types

type AnthropicProvider

type AnthropicProvider struct {
	// contains filtered or unexported fields
}

AnthropicProvider implements Provider for Anthropic Claude.

func NewAnthropicProvider

func NewAnthropicProvider(apiKey, model string) *AnthropicProvider

NewAnthropicProvider creates an Anthropic provider.

func (*AnthropicProvider) Available

func (a *AnthropicProvider) Available(ctx context.Context) bool

Available checks if the API key is valid.

func (*AnthropicProvider) Chat

func (a *AnthropicProvider) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

Chat sends a chat completion request to Anthropic.

func (*AnthropicProvider) Endpoint

func (a *AnthropicProvider) Endpoint() string

Endpoint returns the empty string (Anthropic uses a fixed endpoint).

func (*AnthropicProvider) Model

func (a *AnthropicProvider) Model() string

Model returns the model identifier (e.g., "claude-sonnet-4-20250514").

func (*AnthropicProvider) Name

func (a *AnthropicProvider) Name() string

Name returns "anthropic".

type CLIFlags

type CLIFlags struct {
	APIKey   string // --model-api-key
	Endpoint string // --model-endpoint
	Model    string // --model
	Provider string // --model-provider
}

CLIFlags holds model configuration from command-line flags. Fields are sorted alphabetically to match documentation.

func (CLIFlags) ApplyTo

func (f CLIFlags) ApplyTo(cfg *Config)

ApplyTo applies CLI flags to a config. CLI flags take highest priority.

type ChatRequest

type ChatRequest struct {
	SystemPrompt string    // System prompt (instructions)
	Messages     []Message // Conversation messages
	Temperature  float64   // 0.0 for deterministic, higher for creativity
	MaxTokens    int       // Maximum response tokens (0 = provider default)
	JSONMode     bool      // Request JSON output if supported
}

ChatRequest represents a chat completion request.
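
For example, a deterministic JSON-mode request might look like this (all values are illustrative):

req := ChatRequest{
	SystemPrompt: "You are a terse assistant. Reply in JSON.",
	Messages: []Message{
		{Role: RoleUser, Content: "List three Go keywords."},
	},
	Temperature: 0,    // deterministic output
	MaxTokens:   256,  // cap the response length
	JSONMode:    true, // ask for JSON if the provider supports it
}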

type ChatResponse

type ChatResponse struct {
	Content      string // Response text
	FinishReason string // "stop", "length", etc.
	TokensUsed   int    // Total tokens consumed
}

ChatResponse contains the AI response.

type Config

type Config struct {
	Provider string `yaml:"provider"` // anthropic, gemini, github, groq, ollama, openai
	Model    string `yaml:"model"`    // Model name (e.g., "llama3.1:8b", "claude-sonnet-4-20250514")
	Endpoint string `yaml:"endpoint"` // API endpoint (optional, for custom/azure)
	APIKey   string `yaml:"api_key"`  // API key (for cloud providers)
}

Config holds AI provider configuration per ADR-017.

func DefaultConfig

func DefaultConfig() Config

DefaultConfig returns the default configuration (Ollama).

func LoadConfig

func LoadConfig() (*Config, error)

LoadConfig loads model configuration with consistent fallback:

CLI (handled by caller) → Environment → Config → Keystore

The first source with a value wins. If the config file provides an API key, the keystore is not checked.
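
A sketch of the typical load-then-construct flow (error handling abbreviated; "fmt" and "log" assumed imported):

cfg, err := LoadConfig()
if err != nil {
	log.Fatal(err)
}
p, err := NewProvider(*cfg) // see NewProvider below
if err != nil {
	log.Fatal(err)
}
fmt.Println("using", p.Name(), p.Model())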

type GeminiProvider

type GeminiProvider struct {
	// contains filtered or unexported fields
}

GeminiProvider implements Provider for Google's Gemini API.

func NewGeminiProvider

func NewGeminiProvider(apiKey, model string) *GeminiProvider

NewGeminiProvider creates a Gemini provider.

func (*GeminiProvider) Available

func (g *GeminiProvider) Available(ctx context.Context) bool

Available checks if the API key is set.

func (*GeminiProvider) Chat

func (g *GeminiProvider) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

Chat sends a chat completion request to Gemini.

func (*GeminiProvider) Endpoint

func (g *GeminiProvider) Endpoint() string

Endpoint returns the API endpoint URL.

func (*GeminiProvider) Model

func (g *GeminiProvider) Model() string

Model returns the model identifier.

func (*GeminiProvider) Name

func (g *GeminiProvider) Name() string

Name returns "gemini" for keystore lookup.

type GroqProvider

type GroqProvider struct {
	*OpenAIProvider
	// contains filtered or unexported fields
}

GroqProvider implements Provider for Groq's OpenAI-compatible API. Groq provides extremely fast inference using custom LPU hardware.

func NewGroqProvider

func NewGroqProvider(apiKey, model string) *GroqProvider

NewGroqProvider creates a Groq provider. Groq exposes an OpenAI-compatible API at api.groq.com.

func (*GroqProvider) Available

func (g *GroqProvider) Available(ctx context.Context) bool

Available checks if the API key is set.

func (*GroqProvider) Name

func (g *GroqProvider) Name() string

Name returns "groq" for keystore lookup.

type Message

type Message struct {
	Role    Role   // user, assistant
	Content string // Message content
}

Message represents a conversation turn.

type OllamaProvider

type OllamaProvider struct {
	// contains filtered or unexported fields
}

OllamaProvider implements Provider for local Ollama.

func NewOllamaProvider

func NewOllamaProvider(endpoint, model string) *OllamaProvider

NewOllamaProvider creates an Ollama provider.
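
A sketch using the default local endpoint from the table above (the model name is an example; ctx and "log" are assumed):

p := NewOllamaProvider("http://localhost:11434", "llama3.1:8b")
if !p.Available(ctx) {
	log.Fatal("ollama is not running or the model has not been pulled")
}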

func (*OllamaProvider) Available

func (o *OllamaProvider) Available(ctx context.Context) bool

Available checks if Ollama is running and the model is available.

func (*OllamaProvider) Chat

func (o *OllamaProvider) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

Chat sends a chat completion request to Ollama.

func (*OllamaProvider) Endpoint

func (o *OllamaProvider) Endpoint() string

Endpoint returns the Ollama API endpoint URL.

func (*OllamaProvider) Model

func (o *OllamaProvider) Model() string

Model returns the model identifier (e.g., "llama3.1:8b").

func (*OllamaProvider) Name

func (o *OllamaProvider) Name() string

Name returns "ollama".

type OpenAIProvider

type OpenAIProvider struct {
	// contains filtered or unexported fields
}

OpenAIProvider implements Provider for OpenAI and compatible APIs.

func NewOpenAIProvider

func NewOpenAIProvider(apiKey, model, endpoint string) *OpenAIProvider

NewOpenAIProvider creates an OpenAI provider. endpoint can be empty for api.openai.com, or set for Azure/compatible APIs.
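
A sketch of both modes (the Azure resource and deployment names below are made up, and apiKey is assumed):

// Default endpoint: api.openai.com.
openai := NewOpenAIProvider(apiKey, "gpt-4-turbo", "")

// Azure-style endpoint (hypothetical resource and deployment).
azure := NewOpenAIProvider(apiKey, "gpt-4-turbo",
	"https://my-resource.openai.azure.com/openai/deployments/gpt-4-turbo")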

func (*OpenAIProvider) Available

func (o *OpenAIProvider) Available(ctx context.Context) bool

Available checks if the API key is set.

func (*OpenAIProvider) Chat

func (o *OpenAIProvider) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

Chat sends a chat completion request to OpenAI.

func (*OpenAIProvider) Endpoint

func (o *OpenAIProvider) Endpoint() string

Endpoint returns the API endpoint URL.

func (*OpenAIProvider) Model

func (o *OpenAIProvider) Model() string

Model returns the model identifier (e.g., "gpt-4-turbo").

func (*OpenAIProvider) Name

func (o *OpenAIProvider) Name() string

Name returns "openai".

type Provider

type Provider interface {
	// Chat sends messages and returns a response.
	Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

	// Name returns the provider name (e.g., "ollama", "anthropic", "openai").
	// This is the same value used for keystore account lookup.
	Name() string

	// Model returns the model identifier (e.g., "llama3.1:8b", "claude-sonnet-4-20250514").
	Model() string

	// Endpoint returns the API endpoint URL.
	// Returns empty string if using provider's default endpoint.
	Endpoint() string

	// Available returns true if the provider is ready to use.
	// For cloud providers, this typically means the API key is configured.
	// For local providers (Ollama), this checks if the service is running.
	Available(ctx context.Context) bool
}

Provider is the opaque interface for AI/LLM backends.

Callers interact only through this interface—they don't need to know whether the underlying provider is Anthropic, OpenAI, Ollama, etc. The interface exposes queryable attributes (provider, model, endpoint) but never exposes the API key for security.

func EnsureProvider

func EnsureProvider(ctx context.Context, interactive bool, cliFlags CLIFlags) (Provider, error)

EnsureProvider returns a configured AI provider, prompting the user if needed. If interactive is false, it fails immediately when no provider is configured.

Lookup priority (a usage sketch follows this list):

  1. CLI flags (passed via cliFlags parameter)
  2. Environment: DEVLORE_MODEL_PROVIDER, DEVLORE_MODEL_API_KEY, etc.
  3. Config file: ~/.config/devlore/config.yaml
  4. API key from native keystore (for configured provider)
  5. Auto-detect from common API key env vars (GROQ_API_KEY, GEMINI_API_KEY, etc.)
  6. Auto-detect from native keystore entries
  7. Fallback to Ollama if available locally
  8. Interactive prompt (only if interactive=true)
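
A usage sketch for a non-interactive context such as CI (flag values and model name are illustrative; ctx, "fmt", and "log" are assumed):

flags := CLIFlags{Provider: "groq", Model: "llama-3.1-70b-versatile"}
p, err := EnsureProvider(ctx, false, flags) // interactive=false: never prompt
if err != nil {
	log.Fatal(err) // nothing configured and prompting is disabled
}
fmt.Println("using", p.Name(), p.Model())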

func NewProvider

func NewProvider(cfg Config) (Provider, error)

NewProvider creates a provider from configuration.

type Role

type Role string

Role identifies the message sender.

const (
	RoleUser      Role = "user"
	RoleAssistant Role = "assistant"
)
