Documentation
Overview
Package llmprovider defines a unified interface for LLM providers.
Each provider (Gemini, Anthropic, OpenAI) has a native SDK implementation that supports provider-specific features like Gemini thought_signatures, Anthropic thinking blocks, and OpenAI encrypted reasoning content.
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type CallOption
type CallOption func(*CallOptions)
CallOption configures a GenerateContent call.
func WithMaxTokens
func WithMaxTokens(n int) CallOption
WithMaxTokens sets the maximum number of tokens the model may generate.
func WithStopWords
func WithStopWords(words []string) CallOption
WithStopWords sets sequences at which the model stops generating.
func WithTemperature
func WithTemperature(t float64) CallOption
WithTemperature sets the sampling temperature.
func WithTools
func WithTools(tools []Tool) CallOption
WithTools sets the tools the model may invoke.
func WithTopK
func WithTopK(k int) CallOption
WithTopK sets the top-k sampling parameter.
func WithTopP
func WithTopP(p float64) CallOption
WithTopP sets the nucleus (top-p) sampling parameter.
type CallOptions
type CallOptions struct {
MaxTokens int
Temperature float64
TopP float64
TopK int
StopWords []string
Tools []Tool
}
CallOptions holds all configurable parameters for a GenerateContent call.
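CallOption is the usual Go functional-options pattern: each With* constructor returns a closure that mutates a CallOptions value. A minimal sketch of how a provider implementation might fold the options, using simplified local copies of the types above (the applyOptions helper is hypothetical, not part of the package):

```go
package main

import "fmt"

// Simplified local stand-ins for the documented types (illustration only).
type FunctionDef struct{ Name string }

type Tool struct {
	Type     string
	Function *FunctionDef
}

type CallOptions struct {
	MaxTokens   int
	Temperature float64
	TopP        float64
	TopK        int
	StopWords   []string
	Tools       []Tool
}

type CallOption func(*CallOptions)

func WithMaxTokens(n int) CallOption       { return func(o *CallOptions) { o.MaxTokens = n } }
func WithTemperature(t float64) CallOption { return func(o *CallOptions) { o.Temperature = t } }

// applyOptions folds a list of CallOption funcs into a CallOptions value,
// the way a provider implementation typically would at the top of
// GenerateContent.
func applyOptions(opts ...CallOption) CallOptions {
	var o CallOptions
	for _, opt := range opts {
		opt(&o)
	}
	return o
}

func main() {
	o := applyOptions(WithMaxTokens(1024), WithTemperature(0.2))
	fmt.Printf("MaxTokens=%d Temperature=%.1f\n", o.MaxTokens, o.Temperature)
}
```

Unset fields keep their zero values, so providers can distinguish "not set" from an explicit value only where the zero value is not meaningful.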
type Choice
type Choice struct {
// Content is the text content of the response.
Content string
// StopReason is why the model stopped generating.
StopReason string
// ToolCalls requested by the model. These preserve ThoughtSignature
// so they can be echoed back in the next request.
ToolCalls []ToolCallPart
// Thinking contains reasoning/thinking content if the model produced any.
Thinking []ThinkingPart
// GenerationInfo holds arbitrary provider-specific metadata (token
// counts, safety ratings, etc.).
GenerationInfo map[string]any
}
Choice is a single response candidate.
type FunctionDef
FunctionDef describes a callable function.
type Message
Message is a single message in a conversation.
func TextMessage
TextMessage creates a Message with a single text part.
type Part
type Part interface {
// contains filtered or unexported methods
}
Part is a piece of content within a message. Concrete types: TextPart, ToolCallPart, ToolResultPart, ThinkingPart.
type Provider
type Provider interface {
GenerateContent(ctx context.Context, messages []Message, options ...CallOption) (*Response, error)
}
Provider is the core interface that all LLM provider clients implement.
type ThinkingPart
type ThinkingPart struct {
Text string
Signature string // Gemini thought_signature or Anthropic thinking signature
Encrypted string // OpenAI encrypted_content or Anthropic redacted_thinking data
}
ThinkingPart holds reasoning/thinking content from the model. Different providers represent this differently:
- Gemini: thought text + thought_signature
- Anthropic: thinking block with signature, or redacted_thinking
- OpenAI: encrypted reasoning content
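One consequence of this normalization is that only some parts carry displayable text: encrypted or redacted payloads must be preserved for subsequent requests but are not meant to be shown. A sketch of filtering with a local stand-in type (the visibleThinking helper is hypothetical):

```go
package main

import "fmt"

// Simplified local stand-in for the documented ThinkingPart (illustration only).
type ThinkingPart struct {
	Text      string // readable reasoning text (Gemini, Anthropic)
	Signature string // Gemini thought_signature / Anthropic thinking signature
	Encrypted string // OpenAI encrypted_content / Anthropic redacted_thinking
}

// visibleThinking collects human-readable reasoning text across providers,
// skipping parts that carry only opaque encrypted or redacted payloads
// (those still get echoed back to the API, just never displayed).
func visibleThinking(parts []ThinkingPart) []string {
	var out []string
	for _, p := range parts {
		if p.Text != "" {
			out = append(out, p.Text)
		}
	}
	return out
}

func main() {
	parts := []ThinkingPart{
		{Text: "Compare the two options first.", Signature: "sig-abc"}, // Gemini-style
		{Encrypted: "opaque-blob"},                                     // redacted/encrypted payload
	}
	fmt.Println(visibleThinking(parts))
}
```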
type Tool
type Tool struct {
Type string
Function *FunctionDef
}
Tool describes a tool the model can invoke.
type ToolCallPart
type ToolCallPart struct {
ID string
Name string
Arguments string // JSON string
// Thought indicates whether this part was produced during model thinking.
// Must be echoed back exactly as received from the API.
Thought bool
// ThoughtSignature is the opaque token Gemini 3.x attaches to function
// call parts. It must be echoed back in subsequent requests or the API
// returns a 400. Empty means no signature was provided.
ThoughtSignature string
}
ToolCallPart represents a model's request to call a tool.
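When building the follow-up request after executing a tool, the received part must be copied back unchanged. A sketch with a local stand-in type (echoToolCall is a hypothetical helper, not part of the package):

```go
package main

import "fmt"

// Simplified local stand-in for the documented ToolCallPart (illustration only).
type ToolCallPart struct {
	ID               string
	Name             string
	Arguments        string // JSON string
	Thought          bool
	ThoughtSignature string
}

// echoToolCall copies a received tool call verbatim into the part sent in
// the next request. The key rule from the docs: Thought and
// ThoughtSignature must be preserved exactly as received, or Gemini 3.x
// rejects the request with a 400.
func echoToolCall(call ToolCallPart) ToolCallPart {
	return ToolCallPart{
		ID:               call.ID,
		Name:             call.Name,
		Arguments:        call.Arguments,
		Thought:          call.Thought,          // echoed exactly as received
		ThoughtSignature: call.ThoughtSignature, // never regenerate or trim
	}
}

func main() {
	call := ToolCallPart{
		ID:               "call_1",
		Name:             "get_weather",
		Arguments:        `{"city":"Oslo"}`,
		ThoughtSignature: "opaque-token",
	}
	fmt.Println(echoToolCall(call).ThoughtSignature == call.ThoughtSignature)
}
```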
type ToolResultPart
ToolResultPart is the response from executing a tool.
Directories

| Path | Synopsis |
|---|---|
| anthropicprovider | Package anthropicprovider implements the llmprovider.Provider interface using the official Anthropic Go SDK (github.com/anthropics/anthropic-sdk-go). |
| gemini | Package gemini implements the llmprovider.Provider interface using the Google GenAI SDK (google.golang.org/genai). |
| openai | Package openai implements the llmprovider.Provider interface using the official OpenAI Go SDK (github.com/openai/openai-go). |