llmprovider

package
v0.38.0
Published: Mar 10, 2026 License: Apache-2.0 Imports: 1 Imported by: 0

Documentation

Overview

Package llmprovider defines a unified interface for LLM providers.

Each provider (Gemini, Anthropic, OpenAI) has a native SDK implementation that supports provider-specific features like Gemini thought_signatures, Anthropic thinking blocks, and OpenAI encrypted reasoning content.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type CallOption

type CallOption func(*CallOptions)

CallOption configures a GenerateContent call.

func WithMaxTokens

func WithMaxTokens(n int) CallOption

func WithStopWords

func WithStopWords(words []string) CallOption

func WithTemperature

func WithTemperature(t float64) CallOption

func WithTools

func WithTools(tools []Tool) CallOption

func WithTopK

func WithTopK(k int) CallOption

func WithTopP

func WithTopP(p float64) CallOption

type CallOptions

type CallOptions struct {
	MaxTokens   int
	Temperature float64
	TopP        float64
	TopK        int
	StopWords   []string
	Tools       []Tool
}

CallOptions holds all configurable parameters for a GenerateContent call.
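The With* constructors above follow the standard Go functional-options pattern. A minimal, self-contained sketch of how they compose (the re-declared types mirror the exported ones, and `apply` is a hypothetical helper — in the real package the options are folded inside GenerateContent):

```go
package main

import "fmt"

// Mirrors of the exported types, re-declared so this sketch compiles standalone.
type CallOptions struct {
	MaxTokens   int
	Temperature float64
	StopWords   []string
}

type CallOption func(*CallOptions)

func WithMaxTokens(n int) CallOption       { return func(o *CallOptions) { o.MaxTokens = n } }
func WithTemperature(t float64) CallOption { return func(o *CallOptions) { o.Temperature = t } }
func WithStopWords(w []string) CallOption  { return func(o *CallOptions) { o.StopWords = w } }

// apply folds the variadic options into a single CallOptions struct,
// the way a provider implementation typically would.
func apply(options ...CallOption) CallOptions {
	var opts CallOptions
	for _, opt := range options {
		opt(&opts)
	}
	return opts
}

func main() {
	opts := apply(WithMaxTokens(1024), WithTemperature(0.2), WithStopWords([]string{"END"}))
	fmt.Println(opts.MaxTokens, opts.Temperature, opts.StopWords)
}
```

Zero values (MaxTokens 0, Temperature 0) mean "not set", so providers should only forward fields the caller actually configured.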

type Choice

type Choice struct {
	// Content is the text content of the response.
	Content string

	// StopReason is why the model stopped generating.
	StopReason string

	// ToolCalls requested by the model. These preserve ThoughtSignature
	// so they can be echoed back in the next request.
	ToolCalls []ToolCallPart

	// Thinking contains reasoning/thinking content if the model produced any.
	Thinking []ThinkingPart

	// GenerationInfo holds arbitrary provider-specific metadata (token
	// counts, safety ratings, etc.).
	GenerationInfo map[string]any
}

Choice is a single response candidate.

type FunctionDef

type FunctionDef struct {
	Name        string
	Description string
	Parameters  any // JSON Schema
}

FunctionDef describes a callable function.

type Message

type Message struct {
	Role  Role
	Parts []Part
}

Message is a single message in a conversation.

func TextMessage

func TextMessage(role Role, text string) Message

TextMessage creates a Message with a single text part.
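A short sketch of building a conversation with TextMessage, using self-contained mirrors of the package types (the unexported `isPart` marker method is an assumption about how the Part interface is sealed):

```go
package main

import "fmt"

type Role string

const (
	RoleSystem Role = "system"
	RoleHuman  Role = "human"
)

// Part is sealed with an unexported marker method (assumed mechanism).
type Part interface{ isPart() }

type TextPart struct{ Text string }

func (TextPart) isPart() {}

type Message struct {
	Role  Role
	Parts []Part
}

// TextMessage creates a Message with a single text part.
func TextMessage(role Role, text string) Message {
	return Message{Role: role, Parts: []Part{TextPart{Text: text}}}
}

func main() {
	msgs := []Message{
		TextMessage(RoleSystem, "You are a helpful assistant."),
		TextMessage(RoleHuman, "What is the capital of France?"),
	}
	for _, m := range msgs {
		fmt.Println(m.Role, len(m.Parts))
	}
}
```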

type Part

type Part interface {
	// contains filtered or unexported methods
}

Part is a piece of content within a message. Concrete types: TextPart, ToolCallPart, ToolResultPart, ThinkingPart.

type Provider

type Provider interface {
	GenerateContent(ctx context.Context, messages []Message, options ...CallOption) (*Response, error)
}

Provider is the core interface that all LLM provider clients implement.

type Response

type Response struct {
	Choices []*Choice
	Usage   Usage
}

Response is the result of a GenerateContent call.

type Role

type Role string

Role identifies the sender of a message.

const (
	RoleSystem Role = "system"
	RoleHuman  Role = "human"
	RoleAI     Role = "ai"
	RoleTool   Role = "tool"
)

type TextPart

type TextPart struct {
	Text string
}

TextPart is plain text content.

type ThinkingPart

type ThinkingPart struct {
	Text      string
	Signature string // Gemini thought_signature or Anthropic thinking signature
	Encrypted string // OpenAI encrypted_content or Anthropic redacted_thinking data
}

ThinkingPart holds reasoning/thinking content from the model. Different providers represent this differently:

  • Gemini: thought text + thought_signature
  • Anthropic: thinking block with signature, or redacted_thinking
  • OpenAI: encrypted reasoning content

type Tool

type Tool struct {
	Type     string
	Function *FunctionDef
}

Tool describes a tool the model can invoke.

type ToolCallPart

type ToolCallPart struct {
	ID        string
	Name      string
	Arguments string // JSON string

	// Thought indicates whether this part was produced during model thinking.
	// Must be echoed back exactly as received from the API.
	Thought bool

	// ThoughtSignature is the opaque token Gemini 3.x attaches to function
	// call parts. It must be echoed back in subsequent requests or the API
	// returns a 400. An empty string means no signature was provided.
	ThoughtSignature string
}

ToolCallPart represents a model's request to call a tool.
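Since ThoughtSignature must round-trip unchanged, the tool-call loop should copy the model's ToolCallPart values verbatim into the next request. A sketch with self-contained type mirrors (`echoToolCalls` is a hypothetical helper):

```go
package main

import "fmt"

type Role string
type Part interface{ isPart() }

type ToolCallPart struct {
	ID               string
	Name             string
	Arguments        string // JSON string
	Thought          bool
	ThoughtSignature string
}

func (ToolCallPart) isPart() {}

type ToolResultPart struct {
	ToolCallID string
	Name       string
	Content    string
}

func (ToolResultPart) isPart() {}

type Message struct {
	Role  Role
	Parts []Part
}

// echoToolCalls builds the AI-role message for the next request by
// copying each tool call verbatim, so ThoughtSignature survives intact.
func echoToolCalls(calls []ToolCallPart) Message {
	msg := Message{Role: "ai"}
	for _, c := range calls {
		msg.Parts = append(msg.Parts, c) // do not rewrite any field
	}
	return msg
}

func main() {
	call := ToolCallPart{
		ID:               "call_1",
		Name:             "get_weather",
		Arguments:        `{"city":"Paris"}`,
		ThoughtSignature: "opaque-token",
	}
	next := []Message{
		echoToolCalls([]ToolCallPart{call}),
		{Role: "tool", Parts: []Part{ToolResultPart{ToolCallID: "call_1", Name: "get_weather", Content: `{"temp_c":18}`}}},
	}
	fmt.Println(next[0].Parts[0].(ToolCallPart).ThoughtSignature)
}
```

Dropping or rewriting the signature is exactly the failure mode the doc comment warns about: the follow-up request is rejected with a 400.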

type ToolResultPart

type ToolResultPart struct {
	ToolCallID string
	Name       string
	Content    string
}

ToolResultPart is the response from executing a tool.

type Usage

type Usage struct {
	InputTokens  int
	OutputTokens int
	TotalTokens  int
}

Usage tracks token usage metrics.
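Multi-turn callers typically accumulate Usage across calls for billing or budget checks. A small sketch (`addUsage` is a hypothetical helper, not part of the package):

```go
package main

import "fmt"

type Usage struct {
	InputTokens  int
	OutputTokens int
	TotalTokens  int
}

// addUsage sums token counts field-by-field, for accumulating usage
// across multiple GenerateContent calls.
func addUsage(a, b Usage) Usage {
	return Usage{
		InputTokens:  a.InputTokens + b.InputTokens,
		OutputTokens: a.OutputTokens + b.OutputTokens,
		TotalTokens:  a.TotalTokens + b.TotalTokens,
	}
}

func main() {
	total := Usage{}
	for _, u := range []Usage{{100, 40, 140}, {220, 60, 280}} {
		total = addUsage(total, u)
	}
	fmt.Println(total.InputTokens, total.OutputTokens, total.TotalTokens) // 320 100 420
}
```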

Directories

Path Synopsis
Package anthropicprovider implements the llmprovider.Provider interface using the official Anthropic Go SDK (github.com/anthropics/anthropic-sdk-go).
Package gemini implements the llmprovider.Provider interface using the Google GenAI SDK (google.golang.org/genai).
Package openai implements the llmprovider.Provider interface using the official OpenAI Go SDK (github.com/openai/openai-go).
