provider

package
v0.1.0
Published: Dec 14, 2025 License: MIT Imports: 4 Imported by: 0

Documentation

Overview

Package provider defines the interface for LLM providers.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func Available

func Available() []string

Available returns the names of all registered providers.

func IsRegistered

func IsRegistered(name string) bool

IsRegistered reports whether a provider with the given name is registered.

func Register

func Register(name string, factory func() (Provider, error))

Register adds a provider factory to the registry. This is typically called from a provider package's init() function.
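
The init()-registration pattern can be sketched in a single self-contained file. The Provider interface and registry below are simplified stand-ins mirroring this page's declarations (the real package's import path is not shown here), and the "mock" provider is hypothetical:

```go
package main

import (
	"fmt"
	"sort"
	"sync"
)

// Provider is a trimmed stand-in for the package's interface,
// reduced to Name() for brevity.
type Provider interface {
	Name() string
}

// registry mirrors the documented factory-based registration.
var (
	mu       sync.Mutex
	registry = map[string]func() (Provider, error){}
)

// Register adds a provider factory to the registry.
func Register(name string, factory func() (Provider, error)) {
	mu.Lock()
	defer mu.Unlock()
	registry[name] = factory
}

// Available returns the names of all registered providers, sorted.
func Available() []string {
	mu.Lock()
	defer mu.Unlock()
	names := make([]string, 0, len(registry))
	for n := range registry {
		names = append(names, n)
	}
	sort.Strings(names)
	return names
}

type mock struct{}

func (mock) Name() string { return "mock" }

// init registers the provider, as a real provider package would
// do so that importing it for side effects is enough.
func init() {
	Register("mock", func() (Provider, error) { return mock{}, nil })
}

func main() {
	fmt.Println(Available()) // [mock]
}
```

Because registration happens in init(), consumers typically import a provider package with a blank identifier purely for its side effect.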

Types

type FinishReason

type FinishReason string

FinishReason indicates why the model stopped generating.

const (
	FinishReasonStop      FinishReason = "stop"
	FinishReasonToolCalls FinishReason = "tool_calls"
	FinishReasonLength    FinishReason = "length"
)

type JSONSchema

type JSONSchema struct {
	Name   string          `json:"name"`
	Strict bool            `json:"strict"`
	Schema json.RawMessage `json:"schema"`
}

JSONSchema represents a JSON Schema for structured output.

type Message

type Message struct {
	Role      Role
	Content   string
	ToolCalls []ToolCall
	ToolID    string // When Role == RoleTool
}

Message represents a single message in the conversation.

type Provider

type Provider interface {
	// Name returns the provider identifier (e.g., "openai", "anthropic").
	Name() string

	// Call executes a non-streaming LLM request.
	Call(ctx context.Context, req *Request) (*Response, error)
}

Provider is the core abstraction for LLM providers. All provider implementations must satisfy this interface.

func Get

func Get(name string) (Provider, error)

Get retrieves a provider by name. It returns an error if the provider is not registered.

type Request

type Request struct {
	Model         string
	Messages      []Message
	Tools         []ToolDef
	Temperature   *float64
	MaxTokens     *int
	TopP          *float64
	TopK          *int
	Seed          *int
	StopSequences []string
	JSONSchema    *JSONSchema // For structured output
}

Request represents a provider-agnostic LLM request.
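
The optional fields are pointers so that a nil value means "use the provider's default" rather than zero. A small sketch of populating them, using a local stand-in for Request and a generic ptr helper (the helper is illustrative, not part of the package):

```go
package main

import "fmt"

// Request stand-in with the documented optional pointer fields.
type Request struct {
	Model       string
	Temperature *float64
	MaxTokens   *int
	TopP        *float64
}

// ptr takes the address of a literal so optional fields can be
// set inline; fields left nil fall back to provider defaults.
func ptr[T any](v T) *T { return &v }

func main() {
	req := &Request{
		Model:       "some-model", // hypothetical model name
		Temperature: ptr(0.2),
		MaxTokens:   ptr(1024),
		// TopP left nil: the provider's default applies.
	}
	fmt.Println(*req.Temperature, *req.MaxTokens) // 0.2 1024
}
```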

type Response

type Response struct {
	Content      string
	ToolCalls    []ToolCall
	FinishReason FinishReason
	Usage        Usage
}

Response contains the LLM's response.

type ResponseStream

type ResponseStream interface {
	// Next advances to the next chunk, returns false when done.
	Next() bool

	// Current returns the current chunk.
	Current() *StreamChunk

	// Err returns any error that occurred during streaming.
	Err() error

	// Close releases stream resources.
	Close() error

	// Accumulated returns the full response accumulated so far.
	Accumulated() *Response
}

ResponseStream represents a streaming response.
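
The intended consumption pattern is an iterator loop: Next, then Current, checking Err after the loop and deferring Close. A self-contained sketch with an in-memory stream standing in for a real provider's wire decoder:

```go
package main

import (
	"fmt"
	"strings"
)

// Stand-ins mirroring the documented streaming types.
type FinishReason string

type StreamChunk struct {
	Delta        string
	FinishReason FinishReason
}

type Response struct {
	Content      string
	FinishReason FinishReason
}

type ResponseStream interface {
	Next() bool
	Current() *StreamChunk
	Err() error
	Close() error
	Accumulated() *Response
}

// sliceStream replays pre-recorded chunks and accumulates them;
// a real provider would decode chunks from the network instead.
type sliceStream struct {
	chunks []StreamChunk
	i      int
	acc    strings.Builder
	finish FinishReason
}

func (s *sliceStream) Next() bool {
	if s.i >= len(s.chunks) {
		return false
	}
	c := s.chunks[s.i]
	s.acc.WriteString(c.Delta)
	if c.FinishReason != "" {
		s.finish = c.FinishReason
	}
	s.i++
	return true
}
func (s *sliceStream) Current() *StreamChunk { return &s.chunks[s.i-1] }
func (s *sliceStream) Err() error            { return nil }
func (s *sliceStream) Close() error          { return nil }
func (s *sliceStream) Accumulated() *Response {
	return &Response{Content: s.acc.String(), FinishReason: s.finish}
}

func main() {
	var stream ResponseStream = &sliceStream{chunks: []StreamChunk{
		{Delta: "Hel"}, {Delta: "lo"}, {Delta: "!", FinishReason: "stop"},
	}}
	defer stream.Close()

	for stream.Next() {
		fmt.Print(stream.Current().Delta) // print deltas as they arrive
	}
	fmt.Println()
	if err := stream.Err(); err != nil { // check only after the loop ends
		panic(err)
	}
	fmt.Println(stream.Accumulated().Content) // Hello!
}
```

Checking Err only after Next returns false distinguishes a normal end of stream from a mid-stream failure, mirroring bufio.Scanner's idiom.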

type Role

type Role string

Role represents the message sender.

const (
	RoleSystem    Role = "system"
	RoleUser      Role = "user"
	RoleAssistant Role = "assistant"
	RoleTool      Role = "tool"
)

type StreamChunk

type StreamChunk struct {
	Delta         string
	ToolCallDelta *ToolCallDelta
	FinishReason  FinishReason
}

StreamChunk represents a single streaming chunk.

type StreamingProvider

type StreamingProvider interface {
	Provider

	// CallStream executes a streaming LLM request.
	CallStream(ctx context.Context, req *Request) (ResponseStream, error)
}

StreamingProvider extends Provider with streaming capability.

type ToolCall

type ToolCall struct {
	ID        string
	Name      string
	Arguments string // JSON string
}

ToolCall represents a tool invocation requested by the model.

type ToolCallDelta

type ToolCallDelta struct {
	ID             string
	Name           string
	ArgumentsDelta string
}

ToolCallDelta represents incremental tool call data in streaming.

type ToolDef

type ToolDef struct {
	Name        string
	Description string
	Parameters  json.RawMessage // JSON Schema
}

ToolDef defines a tool the model can use.

type Usage

type Usage struct {
	PromptTokens     int
	CompletionTokens int
	TotalTokens      int
}

Usage contains token usage statistics.
