providers

package
v0.13.0
Published: Mar 8, 2026 License: MIT Imports: 4 Imported by: 0

Documentation

Overview

Package providers contains LLM provider implementations for Iris.

Each provider is implemented in its own subpackage (e.g., providers/openai, providers/anthropic). Providers implement the core.Provider interface.

Provider Interface

All providers must implement core.Provider:

type Provider interface {
    ID() string
    Models() []ModelInfo
    Supports(feature Feature) bool
    Chat(ctx context.Context, req *ChatRequest) (*ChatResponse, error)
    StreamChat(ctx context.Context, req *ChatRequest) (*ChatStream, error)
}

Concurrency

Providers SHOULD be safe for concurrent calls. If a provider cannot be concurrent-safe, it MUST document this limitation.

Streaming

StreamChat returns a *ChatStream (not a raw channel) to carry errors and final tool calls consistently. Providers MUST:

  • Close all channels (Ch, Err, Final) when finished
  • Terminate promptly on context cancellation
  • Send at most one error on Err
  • Send exactly one response on Final (or zero on setup failure)

Feature Detection

Use Supports() to check provider capabilities before making requests:

if p.Supports(core.FeatureToolCalling) {
    // Safe to use tools
}

Index

Constants

const (
	FeatureChat          = core.FeatureChat
	FeatureChatStreaming = core.FeatureChatStreaming
	FeatureToolCalling   = core.FeatureToolCalling
)

Re-export feature constants.

const (
	RoleSystem    = core.RoleSystem
	RoleUser      = core.RoleUser
	RoleAssistant = core.RoleAssistant
)

Re-export role constants.

Variables

var (
	ErrUnauthorized  = core.ErrUnauthorized
	ErrRateLimited   = core.ErrRateLimited
	ErrBadRequest    = core.ErrBadRequest
	ErrServer        = core.ErrServer
	ErrNetwork       = core.ErrNetwork
	ErrDecode        = core.ErrDecode
	ErrModelRequired = core.ErrModelRequired
	ErrNoMessages    = core.ErrNoMessages
)

Re-export sentinel errors.

Functions

func Create

func Create(name, apiKey string) (core.Provider, error)

Create creates a new provider instance by name with the given API key. Returns an error if the provider is not registered.

func IsRegistered

func IsRegistered(name string) bool

IsRegistered returns true if a provider with the given name is registered.

func List

func List() []string

List returns the names of all registered providers in sorted order.
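The registry behind Register, IsRegistered, and List behaves like a name-keyed map with sorted listing. A self-contained stand-in mirroring those documented semantics (the map-plus-RWMutex layout is an assumption, not the package's actual implementation):

```go
package main

import (
	"fmt"
	"sort"
	"sync"
)

// factory stands in for ProviderFactory.
type factory func(apiKey string) string

var (
	mu       sync.RWMutex
	registry = map[string]factory{}
)

func register(name string, f factory) {
	mu.Lock()
	defer mu.Unlock()
	registry[name] = f // same-name registration overwrites
}

func isRegistered(name string) bool {
	mu.RLock()
	defer mu.RUnlock()
	_, ok := registry[name]
	return ok
}

func list() []string {
	mu.RLock()
	defer mu.RUnlock()
	names := make([]string, 0, len(registry))
	for n := range registry {
		names = append(names, n)
	}
	sort.Strings(names)
	return names
}

func main() {
	register("openai", func(k string) string { return "openai" })
	register("anthropic", func(k string) string { return "anthropic" })
	fmt.Println(list())                 // names come back sorted
	fmt.Println(isRegistered("gemini")) // not registered
}
```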

func Register

func Register(name string, factory ProviderFactory)

Register adds a provider factory to the registry. It is typically called from a provider's init() function. If a provider with the same name is already registered, it will be overwritten.

Example usage in a provider package:

func init() {
    providers.Register("openai", func(apiKey string) core.Provider {
        return New(apiKey)
    })
}

Types

type ChatChunk

type ChatChunk = core.ChatChunk

ChatChunk represents an incremental streaming response.

type ChatRequest

type ChatRequest = core.ChatRequest

ChatRequest represents a request to a chat model.

type ChatResponse

type ChatResponse = core.ChatResponse

ChatResponse represents a response from a chat model.

type ChatStream

type ChatStream = core.ChatStream

ChatStream represents a streaming response from a provider.

type Feature

type Feature = core.Feature

Feature represents a capability that a provider may support.

type Message

type Message = core.Message

Message represents a single message in a conversation.

type ModelID

type ModelID = core.ModelID

ModelID is a string identifier for a model.

type ModelInfo

type ModelInfo = core.ModelInfo

ModelInfo describes a model available from a provider.

type Provider

type Provider = core.Provider

Provider is the interface that LLM providers must implement.

type ProviderError

type ProviderError = core.ProviderError

ProviderError represents an error returned by a provider.

type ProviderFactory

type ProviderFactory func(apiKey string) core.Provider

ProviderFactory creates a provider instance with the given API key. Some providers (like Ollama) may ignore the key parameter.

func Get

func Get(name string) ProviderFactory

Get retrieves a provider factory by name. Returns nil if the provider is not registered.
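Since a nil map lookup in Go returns the zero value, the documented nil-on-miss behavior falls out of a plain map read. A stand-in sketch of the caller-side nil check (registry contents here are hypothetical):

```go
package main

import "fmt"

// factory stands in for ProviderFactory; get mirrors Get's
// documented behavior of returning nil for unknown names.
type factory func(apiKey string) string

var registry = map[string]factory{
	"openai": func(k string) string { return "openai" },
}

func get(name string) factory { return registry[name] }

func main() {
	if f := get("missing"); f == nil {
		fmt.Println("provider not registered")
	}
}
```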

type Role

type Role = core.Role

Role represents a message participant role.

type TokenUsage

type TokenUsage = core.TokenUsage

TokenUsage tracks token consumption for a request.

type ToolCall

type ToolCall = core.ToolCall

ToolCall represents a tool invocation requested by the model.

Directories

Path                  Synopsis
anthropic             Package anthropic provides an Anthropic API provider implementation for Iris.
gemini                Package gemini provides a Google Gemini API provider implementation for Iris.
huggingface           Package huggingface provides an LLM provider implementation for Hugging Face Inference Providers.
internal/normalize    Package normalize provides shared provider error normalization helpers.
internal/toolcalls    Package toolcalls provides shared streaming tool-call assembly utilities.
ollama                Package ollama provides an Ollama provider implementation for the Iris SDK.
openai                Package openai provides an OpenAI API provider implementation for Iris.
perplexity            Package perplexity provides a Perplexity Search API provider implementation for Iris.
voyageai              Package voyageai provides a Voyage AI API provider implementation for Iris.
xai                   Package xai provides an xAI Grok API provider implementation for Iris.
zai                   Package zai provides a Z.ai GLM API provider implementation for Iris.
