llm

package
v0.2.0
Published: Feb 19, 2026 License: MIT Imports: 12 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func AnthropicAPIURL

func AnthropicAPIURL() string

AnthropicAPIURL returns the current Anthropic API endpoint URL. Exposed for use by integration tests via httptest servers.

func BuildSystemPrompt

func BuildSystemPrompt(p *profile.Profile, strict bool) string

BuildSystemPrompt constructs the system prompt with optional profile rules and strict mode injection.

func BuildUserPrompt

func BuildUserPrompt(s *spec.Spec, contextFiles []ctx.ContextFile) string

BuildUserPrompt constructs the user prompt with the spec, optional context files, and the JSON schema example.

func OpenAIAPIURL added in v0.2.0

func OpenAIAPIURL() string

OpenAIAPIURL returns the current OpenAI API endpoint URL. Exposed for use by integration tests via httptest servers.

func SetAnthropicAPIURL

func SetAnthropicAPIURL(u string)

SetAnthropicAPIURL overrides the Anthropic API endpoint URL. Intended for use in tests only.

func SetOpenAIAPIURL added in v0.2.0

func SetOpenAIAPIURL(u string)

SetOpenAIAPIURL overrides the OpenAI API endpoint URL. Intended for use in tests only.

Types

type Provider

type Provider interface {
	Complete(ctx context.Context, req *Request) (*Response, error)
}

Provider is the interface for LLM completion backends.

func NewProvider

func NewProvider(providerModel string) (Provider, error)

NewProvider parses a "provider:model" string and returns the appropriate Provider. The API key is read from the environment at construction time and validated immediately. Example: "anthropic:claude-sonnet-4-6" or "openai:gpt-4o".

type Request

type Request struct {
	SystemPrompt string
	UserPrompt   string
	Temperature  float64
	MaxTokens    int
	// Model overrides the provider's configured model when non-empty.
	Model string
}

Request holds the parameters for an LLM completion call.

type Response

type Response struct {
	Content string
	Model   string // actual model used, echoed back for metadata reporting
}

Response holds the result of an LLM completion call.
