Documentation ¶
Overview ¶
Package provider defines the common interface, request/response types, and factory registry for LLM providers.
Index ¶
- func Available() []string
- func IsRegistered(name string) bool
- func Register(name string, factory func() (Provider, error))
- type FinishReason
- type JSONSchema
- type Message
- type Provider
- type Request
- type Response
- type ResponseStream
- type Role
- type StreamChunk
- type StreamingProvider
- type ToolCall
- type ToolCallDelta
- type ToolDef
- type Usage
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func Available ¶
func Available() []string
Available returns the names of all registered providers.

func IsRegistered ¶
func IsRegistered(name string) bool
IsRegistered reports whether a provider with the given name is registered.

func Register ¶
func Register(name string, factory func() (Provider, error))
Register associates a provider factory with the given name.
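A minimal sketch of the registry flow. The import path example.com/llm/provider and the mockProvider stub are illustrative, not part of this package:

package main

import (
	"fmt"

	"example.com/llm/provider" // hypothetical import path
)

// mockProvider satisfies Provider by embedding the interface; a real
// implementation would supply Name and Call (see type Provider below).
type mockProvider struct{ provider.Provider }

func main() {
	provider.Register("mock", func() (provider.Provider, error) {
		return &mockProvider{}, nil
	})
	fmt.Println(provider.IsRegistered("mock")) // true
	fmt.Println(provider.Available())          // [mock]
}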
Types ¶
type FinishReason ¶
type FinishReason string
FinishReason indicates why the model stopped generating.
const (
	FinishReasonStop      FinishReason = "stop"
	FinishReasonToolCalls FinishReason = "tool_calls"
	FinishReasonLength    FinishReason = "length"
)
type JSONSchema ¶
type JSONSchema struct {
Name string `json:"name"`
Strict bool `json:"strict"`
Schema json.RawMessage `json:"schema"`
}
JSONSchema represents a JSON Schema for structured output.
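For example, a request can be constrained to a fixed JSON shape like this; the schema name and body are illustrative, and Schema is an encoding/json RawMessage:

req.JSONSchema = &provider.JSONSchema{
	Name:   "weather_report",
	Strict: true,
	Schema: json.RawMessage(`{
		"type": "object",
		"properties": {
			"city":   {"type": "string"},
			"temp_c": {"type": "number"}
		},
		"required": ["city", "temp_c"]
	}`),
}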
type Message ¶
type Message struct {
Role Role
Content string
ToolCalls []ToolCall
ToolID string // When Role == RoleTool
}
Message represents a single message in the conversation.
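A typical tool-use exchange might look like the sketch below. Only RoleTool appears in this excerpt; RoleUser and the ToolID value are assumed for illustration:

msgs := []provider.Message{
	{Role: provider.RoleUser, Content: "What's the weather in Oslo?"},
	// The assistant's tool-call message from a previous Response is
	// appended here, followed by the tool's result:
	{Role: provider.RoleTool, ToolID: "call_123", Content: `{"temp_c": 7}`},
}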
type Provider ¶
type Provider interface {
// Name returns the provider identifier (e.g., "openai", "anthropic").
Name() string
// Call executes a non-streaming LLM request.
Call(ctx context.Context, req *Request) (*Response, error)
}
Provider is the core abstraction for LLM providers. All provider implementations must satisfy this interface.
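A minimal sketch of an implementation, assuming the context and errors imports; a real provider would translate the Request into an API call:

type echoProvider struct{}

func (echoProvider) Name() string { return "echo" }

func (echoProvider) Call(ctx context.Context, req *provider.Request) (*provider.Response, error) {
	if len(req.Messages) == 0 {
		return nil, errors.New("echo: request has no messages")
	}
	// A real provider would send req to the model API here.
	return &provider.Response{
		Content:      req.Messages[len(req.Messages)-1].Content,
		FinishReason: provider.FinishReasonStop,
	}, nil
}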
type Request ¶
type Request struct {
Model string
Messages []Message
Tools []ToolDef
Temperature *float64
MaxTokens *int
TopP *float64
TopK *int
Seed *int
StopSequences []string
JSONSchema *JSONSchema // For structured output
}
Request represents a provider-agnostic LLM request.
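The sampling fields are pointers so that nil means "use the provider's default". A sketch, reusing msgs from the Message example above (the model name is illustrative):

temp := 0.2
maxTokens := 1024
req := &provider.Request{
	Model:         "gpt-4o-mini",
	Messages:      msgs,
	Temperature:   &temp, // nil would leave the provider default
	MaxTokens:     &maxTokens,
	StopSequences: []string{"\n\n"},
}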
type Response ¶
type Response struct {
Content string
ToolCalls []ToolCall
FinishReason FinishReason
Usage Usage
}
Response contains the LLM's response.
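Callers typically branch on FinishReason. A sketch, assuming p is a Provider and req a *Request:

resp, err := p.Call(ctx, req)
if err != nil {
	return err
}
switch resp.FinishReason {
case provider.FinishReasonStop:
	fmt.Println(resp.Content)
case provider.FinishReasonToolCalls:
	// Execute each entry in resp.ToolCalls, append the results as
	// RoleTool messages, and call the provider again.
case provider.FinishReasonLength:
	// Output was truncated; consider raising MaxTokens.
}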
type ResponseStream ¶
type ResponseStream interface {
// Next advances to the next chunk; it returns false when the stream is done.
Next() bool
// Current returns the current chunk.
Current() *StreamChunk
// Err returns any error that occurred during streaming.
Err() error
// Close releases stream resources.
Close() error
// Accumulated returns the full response accumulated so far.
Accumulated() *Response
}
ResponseStream represents a streaming response.
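A sketch of the consumption loop, assuming sp is a StreamingProvider (defined below):

stream, err := sp.CallStream(ctx, req)
if err != nil {
	return err
}
defer stream.Close()

for stream.Next() {
	fmt.Print(stream.Current().Delta) // emit text as it arrives
}
if err := stream.Err(); err != nil {
	return err
}
resp := stream.Accumulated() // the complete Response once streaming ends
fmt.Println(resp.FinishReason)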
type StreamChunk ¶
type StreamChunk struct {
Delta string
ToolCallDelta *ToolCallDelta
FinishReason FinishReason
}
StreamChunk represents a single streaming chunk.
type StreamingProvider ¶
type StreamingProvider interface {
Provider
// CallStream executes a streaming LLM request.
CallStream(ctx context.Context, req *Request) (ResponseStream, error)
}
StreamingProvider extends Provider with streaming capability.
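Because streaming is optional, callers can assert for it and fall back to the blocking Call. A sketch:

func callPreferStream(ctx context.Context, p provider.Provider, req *provider.Request) (*provider.Response, error) {
	sp, ok := p.(provider.StreamingProvider)
	if !ok {
		return p.Call(ctx, req) // provider does not stream
	}
	stream, err := sp.CallStream(ctx, req)
	if err != nil {
		return nil, err
	}
	defer stream.Close()
	for stream.Next() {
		// Chunks could be forwarded to the caller here.
	}
	if err := stream.Err(); err != nil {
		return nil, err
	}
	return stream.Accumulated(), nil
}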
type ToolCallDelta ¶
ToolCallDelta represents incremental tool call data in streaming.