Documentation
Constants
const (
	Anthropic = "anthropic"
	OpenAI    = "openai"
	Ollama    = "ollama"
)
Supported provider type constants.
Variables
var ErrStreamingNotImplemented = errors.New("streaming not implemented for this provider")
ErrStreamingNotImplemented is returned by ParseStreamChunk when a provider does not yet support streaming parsing.
Functions
func SupportedProviders
func SupportedProviders() []string
SupportedProviders returns the list of all supported provider type names.
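A typical use for SupportedProviders is validating a provider name from configuration. The sketch below is self-contained for illustration: the constants and SupportedProviders are stand-in copies of the package's declarations, since the package's import path is not shown here.

```go
package main

import "fmt"

// Stand-in copies of the package's provider name constants; in real use
// these come from the providers package itself.
const (
	Anthropic = "anthropic"
	OpenAI    = "openai"
	Ollama    = "ollama"
)

// SupportedProviders mirrors the documented function: it returns the list
// of all supported provider type names.
func SupportedProviders() []string {
	return []string{Anthropic, OpenAI, Ollama}
}

func main() {
	// Check whether a configured provider name is supported.
	name := "openai"
	supported := false
	for _, p := range SupportedProviders() {
		if p == name {
			supported = true
			break
		}
	}
	fmt.Println(supported) // prints "true"
}
```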
Types
type Provider
type Provider interface {
	// Name returns the canonical provider name (e.g., "anthropic", "openai", "ollama")
	Name() string

	// DefaultStreaming reports whether this provider streams responses by default
	// when the "stream" field is omitted from the request. For example, Ollama
	// defaults to streaming when "stream" is not present in the request body.
	DefaultStreaming() bool

	// ParseRequest converts a provider-specific request into the internal format.
	// Returns an error if the payload cannot be parsed.
	ParseRequest(payload []byte) (*llm.ChatRequest, error)

	// ParseResponse converts a provider-specific response into the internal format.
	// Returns an error if the payload cannot be parsed.
	ParseResponse(payload []byte) (*llm.ChatResponse, error)

	// ParseStreamChunk converts a single streaming chunk into the internal format.
	// Returns ErrStreamingNotImplemented if the provider doesn't support streaming yet.
	// Returns (nil, nil) if the chunk should be skipped (e.g., keep-alive, comments).
	ParseStreamChunk(payload []byte) (*llm.StreamChunk, error)
}
Provider defines the interface for LLM API parsing. Each provider implementation knows how to parse its specific API format into the internal representation.