Documentation

Overview
Package llm provides a unified HTTP client for LLM APIs. It supports Ollama (local) and OpenAI-compatible APIs (OpenRouter, Groq, OpenAI).

Configuration priority: gitconfig > environment variable > default value.
Example gitconfig (~/.gitconfig):

    [gitflow]
        llm-api-key = sk-or-v1-xxxxx
        llm-model = mistralai/devstral-2512:free
        llm-temperature = 0.3
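The priority order above can be sketched as a small resolver. The helper name `resolve` and the environment variable `GITFLOW_LLM_MODEL` are illustrative assumptions, not part of the package's actual implementation:

```go
package main

import (
	"fmt"
	"os"
)

// resolve mirrors the gitconfig > environment variable > default
// priority order. gitconfigVal stands in for the output of a call
// like `git config --get gitflow.llm-model`.
func resolve(gitconfigVal, envKey, defaultVal string) string {
	if gitconfigVal != "" {
		return gitconfigVal
	}
	if v := os.Getenv(envKey); v != "" {
		return v
	}
	return defaultVal
}

func main() {
	// No gitconfig value: fall through to the env var, then the default.
	fmt.Println(resolve("", "GITFLOW_LLM_MODEL", "mistralai/devstral-2512:free"))
	// A gitconfig value takes priority over everything else.
	fmt.Println(resolve("llama3", "GITFLOW_LLM_MODEL", "mistralai/devstral-2512:free"))
}
```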
Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func GetConcurrency
func GetConcurrency() int
GetConcurrency returns the configured concurrency limit for parallel file analysis.
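A common way to apply such a limit is a buffered channel used as a semaphore. This sketch hard-codes the limit where real code would pass in the value from GetConcurrency; the `analyzeAll` helper and its placeholder analysis are hypothetical:

```go
package main

import (
	"fmt"
	"sync"
)

// analyzeAll processes files in parallel, but never more than
// `limit` at a time, using a buffered channel as a semaphore.
func analyzeAll(files []string, limit int) []string {
	sem := make(chan struct{}, limit)
	results := make([]string, len(files))
	var wg sync.WaitGroup
	for i, f := range files {
		wg.Add(1)
		go func(i int, f string) {
			defer wg.Done()
			sem <- struct{}{}        // acquire a slot
			defer func() { <-sem }() // release it on return
			results[i] = "analyzed " + f // placeholder for a per-file LLM call
		}(i, f)
	}
	wg.Wait()
	return results
}

func main() {
	fmt.Println(analyzeAll([]string{"a.go", "b.go", "c.go"}, 2))
}
```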
func GetDiffContext
func GetDiffContext() int
GetDiffContext returns the configured diff context lines.
Types

type Client
type Client struct {
// contains filtered or unexported fields
}
Client is an LLM API client supporting multiple providers.
func NewClient
func NewClient() *Client
NewClient creates a new LLM client from gitconfig.
Provider selection:
- If API key is set, uses OpenAI-compatible API (OpenRouter by default)
- Otherwise, uses local Ollama
API path resolution:
- User-defined llm-api-path takes highest priority
- Auto-detect from host for known providers
- Fall back to OpenAI-compatible path (/v1/chat/completions)
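The resolution order above can be sketched as a switch on the host. Only the OpenAI-compatible fallback path comes from the documentation; the host matching and the Ollama-style endpoint shown here are illustrative assumptions:

```go
package main

import (
	"fmt"
	"strings"
)

// apiPath applies the resolution order described above: the
// user-configured path wins, then host-based detection, then the
// OpenAI-compatible default.
func apiPath(userPath, host string) string {
	if userPath != "" {
		return userPath // llm-api-path from gitconfig has highest priority
	}
	switch {
	case strings.Contains(host, "localhost") || strings.Contains(host, "ollama"):
		return "/api/generate" // assumed Ollama-style endpoint
	default:
		return "/v1/chat/completions" // OpenAI-compatible fallback
	}
}

func main() {
	fmt.Println(apiPath("", "openrouter.ai"))
	fmt.Println(apiPath("/custom/v2/generate", "openrouter.ai"))
}
```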
func (*Client) Generate
func (c *Client) Generate(ctx context.Context, model, prompt string, opts ...GenerateOptions) (string, error)
Generate calls the LLM API to generate text. It returns the generated text, or an error once all retries are exhausted.
func (*Client) GetCommitPrompt

GetCommitPrompt returns the custom commit-generation prompt for the specified language, or an empty string to use the default prompt.
func (*Client) GetFilePrompt

GetFilePrompt returns the custom file-analysis prompt, or an empty string to use the default.
type GenerateOptions
GenerateOptions configures a generation request.