llm

package
v3.0.2
Published: Jan 4, 2026 License: MIT Imports: 10 Imported by: 0

Documentation

Overview

Package llm provides a unified HTTP client for LLM APIs. It supports Ollama (local) and OpenAI-compatible APIs (OpenRouter, Groq, OpenAI).

Configuration priority: gitconfig > environment variable > default value

Example gitconfig (~/.gitconfig):

[gitflow]
    llm-api-key = sk-or-v1-xxxxx
    llm-model = mistralai/devstral-2512:free
    llm-temperature = 0.3
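
A minimal end-to-end sketch of using the package, assuming the gitconfig above. The import path and the prompt text are illustrative and not defined by the package:

package main

import (
	"context"
	"fmt"
	"log"

	"example.com/gitflow/llm" // illustrative import path
)

func main() {
	// NewClient reads gitconfig (API key, model, temperature) and selects a provider.
	client := llm.NewClient()

	text, err := client.Generate(context.Background(), client.GetModel(),
		"Write a one-line commit message for the staged changes.")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(text)
}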

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func GetConcurrency

func GetConcurrency() int

GetConcurrency returns the configured concurrency limit for parallel file analysis.
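
For illustration, a caller might use the limit to bound a worker pool when analyzing files in parallel. This is a sketch, not the package's internal behavior; files and analyzeFile are hypothetical placeholders, and the limit is assumed to be at least 1:

// Bound concurrent analyses to the configured limit.
sem := make(chan struct{}, llm.GetConcurrency())
var wg sync.WaitGroup
for _, f := range files {
	wg.Add(1)
	go func(path string) {
		defer wg.Done()
		sem <- struct{}{}        // acquire a slot
		defer func() { <-sem }() // release it
		analyzeFile(path)        // hypothetical per-file analysis
	}(f)
}
wg.Wait()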

func GetDiffContext

func GetDiffContext() int

GetDiffContext returns the configured diff context lines.
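
One plausible use of the value is as the unified-context size passed to git diff via os/exec (a sketch; the package may apply the setting differently internally):

// Fetch the staged diff with the configured number of context lines.
n := llm.GetDiffContext()
out, err := exec.Command("git", "diff", "--cached", fmt.Sprintf("-U%d", n)).Output()
if err != nil {
	log.Fatal(err)
}
diff := string(out)
_ = diff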

Types

type Client

type Client struct {
	// contains filtered or unexported fields
}

Client is an LLM API client supporting multiple providers.

func NewClient

func NewClient() *Client

NewClient creates a new LLM client from gitconfig.

Provider selection:

  • If an API key is set, uses the OpenAI-compatible API (OpenRouter by default)
  • Otherwise, uses local Ollama

API path resolution:

  1. User-defined llm-api-path takes highest priority (see the example after this list)
  2. Auto-detect from host for known providers
  3. Fall back to OpenAI-compatible path (/v1/chat/completions)
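
For example, step 1 can be exercised by pinning the path explicitly in ~/.gitconfig (the key value is a placeholder):

[gitflow]
    llm-api-key = sk-or-v1-xxxxx
    llm-api-path = /v1/chat/completions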

func (*Client) Generate

func (c *Client) Generate(ctx context.Context, model, prompt string, opts ...GenerateOptions) (string, error)

Generate calls the LLM API to generate text. It returns the generated text, or an error once retries are exhausted.
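
A sketch of a call with an explicit system prompt and temperature, assuming c is a *Client, ctx is a context.Context, and diff holds the text to summarize; the option values are illustrative:

opts := llm.GenerateOptions{
	System:      "You write concise, imperative commit messages.",
	Temperature: 0.3,
}
msg, err := c.Generate(ctx, c.GetModel(), "Summarize this diff:\n\n"+diff, opts)
if err != nil {
	return "", err // retries happen inside Generate; an error here means they were exhausted
}
return msg, nil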

func (*Client) GetCommitPrompt

func (c *Client) GetCommitPrompt(lang string) string

GetCommitPrompt returns the custom commit generation prompt for the specified language, or an empty string to use the default prompt.
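
The empty-string return is the signal to fall back to a built-in prompt; a typical caller-side pattern (defaultCommitPrompt is a hypothetical caller-owned default):

prompt := c.GetCommitPrompt(c.GetLang())
if prompt == "" {
	prompt = defaultCommitPrompt // hypothetical built-in default
}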

func (*Client) GetFilePrompt

func (c *Client) GetFilePrompt() string

GetFilePrompt returns the custom file analysis prompt, or an empty string to use the default prompt.

func (*Client) GetLang

func (c *Client) GetLang() string

GetLang returns the configured language.

func (*Client) GetModel

func (c *Client) GetModel() string

GetModel returns the configured model name.

type GenerateOptions

type GenerateOptions struct {
	System      string
	Temperature float64
}

GenerateOptions configures a generation request.

type Provider

type Provider string

Provider represents the LLM provider type.

const (
	ProviderOllama     Provider = "ollama"
	ProviderGroq       Provider = "groq"
	ProviderOpenRouter Provider = "openrouter"
	ProviderOpenAI     Provider = "openai"
)
