provider

package
v0.59.0 Latest
Published: Feb 9, 2026 License: MIT Imports: 9 Imported by: 0

Documentation

Overview

Package provider defines the provider abstraction for different LLM backends and implements the concrete provider backends.
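The per-provider types below share a common set of methods (Name, GetEndpointURL, GetHeaders, RequiresTransformation, SupportsStreaming, TransformModelID). A minimal sketch of how a caller might drive that shared surface — the interface shape and the stub implementation here are assumptions inferred from the methods documented on each provider, not the package's actual declarations:

```go
package main

import (
	"fmt"
	"net/http"
)

// Provider is an assumed interface inferred from the methods every
// backend in this package implements; the real definition may differ.
type Provider interface {
	Name() string
	GetEndpointURL() string
	GetHeaders(apiKey string) http.Header
	RequiresTransformation() bool
	SupportsStreaming() bool
	TransformModelID(modelID string) string
}

// stubProvider is a hypothetical implementation used only to show
// how a caller would prepare a request through the interface.
type stubProvider struct{ base string }

func (s stubProvider) Name() string           { return "stub" }
func (s stubProvider) GetEndpointURL() string { return s.base + "/v1/chat/completions" }
func (s stubProvider) GetHeaders(apiKey string) http.Header {
	h := http.Header{}
	h.Set("Authorization", "Bearer "+apiKey)
	h.Set("Content-Type", "application/json")
	return h
}
func (s stubProvider) RequiresTransformation() bool      { return true }
func (s stubProvider) SupportsStreaming() bool           { return true }
func (s stubProvider) TransformModelID(id string) string { return id }

func main() {
	var p Provider = stubProvider{base: "https://example.invalid"}
	// A proxy would translate the request body first when
	// RequiresTransformation reports true, then POST to the endpoint
	// with the provider's headers.
	fmt.Println(p.Name(), p.GetEndpointURL(), p.RequiresTransformation())
}
```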

Index

Constants

const DefaultDeepSeekURL = "https://api.deepseek.com"

DefaultDeepSeekURL is the standard DeepSeek API endpoint.

const DefaultGeminiURL = "https://generativelanguage.googleapis.com/v1beta"

DefaultGeminiURL is the standard Google AI Studio API endpoint.

const DefaultGrokURL = "https://api.x.ai"

DefaultGrokURL is the standard xAI API endpoint.

const DefaultMiniMaxURL = "https://api.minimax.chat"

DefaultMiniMaxURL is the standard MiniMax API endpoint.

const DefaultOllamaURL = "http://localhost:11434"

DefaultOllamaURL is the standard Ollama server address.

const DefaultQwenURL = "https://dashscope.aliyuncs.com/compatible-mode"

DefaultQwenURL is the standard DashScope OpenAI-compatible API endpoint.

Variables

This section is empty.

Functions

func DeepSeekModelTiers added in v0.38.0

func DeepSeekModelTiers() map[string]string

DeepSeekModelTiers maps Claude tiers to DeepSeek models.

func DetectOllama added in v0.36.0

func DetectOllama() string

DetectOllama attempts to detect if Ollama is running on common ports. Returns the URL if found, empty string otherwise.
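Detection of this kind usually amounts to a short TCP dial against candidate ports. A sketch under that assumption (only the default port 11434 is probed here; any additional "common ports" the real function checks are unknown):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// detectOllama is a sketch of how DetectOllama might work: try a
// quick TCP dial against each candidate port and report the first
// one that accepts a connection.
func detectOllama() string {
	for _, port := range []int{11434} { // DefaultOllamaURL's port
		addr := fmt.Sprintf("127.0.0.1:%d", port)
		conn, err := net.DialTimeout("tcp", addr, 200*time.Millisecond)
		if err == nil {
			conn.Close()
			return fmt.Sprintf("http://localhost:%d", port)
		}
	}
	return "" // nothing listening on any candidate port
}

func main() {
	fmt.Printf("detected: %q\n", detectOllama())
}
```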

func GeminiModelTiers added in v0.37.0

func GeminiModelTiers() map[string]string

GeminiModelTiers maps Claude tiers to Gemini models.

func GrokModelTiers added in v0.54.0

func GrokModelTiers() map[string]string

GrokModelTiers maps Claude tiers to Grok models.

func IsDeepSeekAvailable added in v0.38.0

func IsDeepSeekAvailable(apiKey string) bool

IsDeepSeekAvailable checks if DeepSeek API is accessible with the given key.

func IsGeminiAvailable added in v0.37.0

func IsGeminiAvailable(apiKey string) bool

IsGeminiAvailable checks if Gemini API is accessible with the given key.

func IsGrokAvailable added in v0.54.0

func IsGrokAvailable(apiKey string) bool

IsGrokAvailable checks if Grok API is accessible with the given key.

func IsMiniMaxAvailable added in v0.54.0

func IsMiniMaxAvailable(apiKey string) bool

IsMiniMaxAvailable checks if MiniMax API is accessible with the given key.

func IsOllamaRunning added in v0.36.0

func IsOllamaRunning(baseURL string) bool

IsOllamaRunning checks if Ollama is running at the given URL.

func IsPortAvailable added in v0.36.0

func IsPortAvailable(port int) bool

IsPortAvailable checks if a port is available for binding.

func IsQwenAvailable added in v0.54.0

func IsQwenAvailable(apiKey string) bool

IsQwenAvailable checks if Qwen API is accessible with the given key.

func ListDeepSeekModels added in v0.38.0

func ListDeepSeekModels(apiKey string) ([]string, error)

ListDeepSeekModels fetches available models from the DeepSeek API.

func ListGeminiModels added in v0.37.0

func ListGeminiModels(apiKey string) ([]string, error)

ListGeminiModels fetches available models from the Gemini API.

func ListGrokModels added in v0.54.0

func ListGrokModels(apiKey string) ([]string, error)

ListGrokModels fetches available models from the Grok API.

func ListMiniMaxModels added in v0.54.0

func ListMiniMaxModels(apiKey string) ([]string, error)

ListMiniMaxModels fetches available models from the MiniMax API.

func ListOllamaModels added in v0.36.0

func ListOllamaModels(baseURL string) ([]string, error)

ListOllamaModels returns a list of available models from Ollama.

func ListQwenModels added in v0.54.0

func ListQwenModels(apiKey string) ([]string, error)

ListQwenModels fetches available models from the Qwen API.

func MiniMaxModelTiers added in v0.54.0

func MiniMaxModelTiers() map[string]string

MiniMaxModelTiers maps Claude tiers to MiniMax models.

func PullOllamaModel added in v0.36.0

func PullOllamaModel(baseURL, modelName string) error

PullOllamaModel pulls (downloads) a model to Ollama. This is useful for first-time setup.

func QwenModelTiers added in v0.54.0

func QwenModelTiers() map[string]string

QwenModelTiers maps Claude tiers to Qwen models.

func RecommendedDeepSeekModels added in v0.38.0

func RecommendedDeepSeekModels() map[string]string

RecommendedDeepSeekModels returns recommended models for different use cases.

func RecommendedGeminiModels added in v0.37.0

func RecommendedGeminiModels() map[string]string

RecommendedGeminiModels returns recommended models for different use cases.

func RecommendedGrokModels added in v0.54.0

func RecommendedGrokModels() map[string]string

RecommendedGrokModels returns recommended models for different use cases.

func RecommendedMiniMaxModels added in v0.54.0

func RecommendedMiniMaxModels() map[string]string

RecommendedMiniMaxModels returns recommended models for different use cases.

func RecommendedOllamaModels added in v0.36.0

func RecommendedOllamaModels() []string

RecommendedOllamaModels returns a list of recommended models for coding tasks.

func RecommendedQwenModels added in v0.54.0

func RecommendedQwenModels() map[string]string

RecommendedQwenModels returns recommended models for different use cases.

func WaitForDeepSeek added in v0.38.0

func WaitForDeepSeek(ctx context.Context, apiKey string, timeout time.Duration) error

WaitForDeepSeek waits for the DeepSeek API to become available.

func WaitForGemini added in v0.37.0

func WaitForGemini(ctx context.Context, apiKey string, timeout time.Duration) error

WaitForGemini waits for the Gemini API to become available.

func WaitForGrok added in v0.54.0

func WaitForGrok(ctx context.Context, apiKey string, timeout time.Duration) error

WaitForGrok waits for the Grok API to become available.

func WaitForMiniMax added in v0.54.0

func WaitForMiniMax(ctx context.Context, apiKey string, timeout time.Duration) error

WaitForMiniMax waits for the MiniMax API to become available.

func WaitForOllama added in v0.36.0

func WaitForOllama(ctx context.Context, baseURL string, timeout time.Duration) error

WaitForOllama waits for Ollama to become available with a timeout.

func WaitForQwen added in v0.54.0

func WaitForQwen(ctx context.Context, apiKey string, timeout time.Duration) error

WaitForQwen waits for the Qwen API to become available.

Types

type AnthropicProvider

type AnthropicProvider struct {
	BaseURL string
	// contains filtered or unexported fields
}

AnthropicProvider implements the Provider interface for direct Anthropic API access. This provider operates in passthrough mode - no protocol translation is needed.

func NewAnthropicProvider

func NewAnthropicProvider(baseURL string) *AnthropicProvider

NewAnthropicProvider creates a new Anthropic provider.

func NewAnthropicProviderWithKey

func NewAnthropicProviderWithKey(baseURL, apiKey string) *AnthropicProvider

NewAnthropicProviderWithKey creates a new Anthropic provider with an embedded API key. Used for multi-provider routing where each tier has its own credentials.

func (*AnthropicProvider) GetEndpointURL

func (p *AnthropicProvider) GetEndpointURL() string

GetEndpointURL returns the messages endpoint URL.

func (*AnthropicProvider) GetHeaders

func (p *AnthropicProvider) GetHeaders(apiKey string) http.Header

GetHeaders returns the HTTP headers for Anthropic API requests.
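The Anthropic API authenticates with an x-api-key header and requires an anthropic-version header, so the returned headers plausibly look like the sketch below (the exact version string used by this package is an assumption):

```go
package main

import (
	"fmt"
	"net/http"
)

// anthropicHeaders sketches what GetHeaders plausibly returns for
// direct Anthropic API access. The version value is illustrative.
func anthropicHeaders(apiKey string) http.Header {
	h := http.Header{}
	h.Set("x-api-key", apiKey)
	h.Set("anthropic-version", "2023-06-01")
	h.Set("Content-Type", "application/json")
	return h
}

func main() {
	h := anthropicHeaders("sk-ant-example")
	fmt.Println(h.Get("x-api-key"), h.Get("anthropic-version"))
}
```

Note the contrast with the OpenAI-compatible providers below, which use an Authorization: Bearer header instead.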

func (*AnthropicProvider) Name

func (p *AnthropicProvider) Name() string

Name returns the provider name.

func (*AnthropicProvider) RequiresTransformation

func (p *AnthropicProvider) RequiresTransformation() bool

RequiresTransformation returns false - Anthropic is passthrough mode. The incoming request is already in Anthropic format, so no translation is needed.

func (*AnthropicProvider) SupportsStreaming

func (p *AnthropicProvider) SupportsStreaming() bool

SupportsStreaming indicates that Anthropic supports SSE streaming.

func (*AnthropicProvider) TransformModelID

func (p *AnthropicProvider) TransformModelID(modelID string) string

TransformModelID passes through the model ID unchanged for Anthropic.

type AzureProvider

type AzureProvider struct {
	Endpoint       string
	DeploymentName string
	APIVersion     string
}

AzureProvider implements the Provider interface for Azure OpenAI.

func NewAzureProvider

func NewAzureProvider(endpoint, deploymentName, apiVersion string) *AzureProvider

NewAzureProvider creates a new Azure OpenAI provider.

func (*AzureProvider) GetEndpointURL

func (p *AzureProvider) GetEndpointURL() string

GetEndpointURL returns the chat completions endpoint URL for Azure.
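Azure OpenAI addresses models by deployment name in the URL path and requires an api-version query parameter, which explains why AzureProvider carries all three fields. A sketch of the URL this method presumably builds:

```go
package main

import "fmt"

// azureEndpointURL sketches the Azure OpenAI URL scheme: the
// deployment name, not the model ID, selects the model, and
// api-version is a required query parameter.
func azureEndpointURL(endpoint, deployment, apiVersion string) string {
	return fmt.Sprintf("%s/openai/deployments/%s/chat/completions?api-version=%s",
		endpoint, deployment, apiVersion)
}

func main() {
	fmt.Println(azureEndpointURL("https://myres.openai.azure.com", "gpt-4o", "2024-02-01"))
}
```

Azure also authenticates with an api-key header rather than a Bearer token, which is why AzureProvider needs its own GetHeaders.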

func (*AzureProvider) GetHeaders

func (p *AzureProvider) GetHeaders(apiKey string) http.Header

GetHeaders returns the HTTP headers for Azure OpenAI API requests.

func (*AzureProvider) Name

func (p *AzureProvider) Name() string

Name returns the provider name.

func (*AzureProvider) RequiresTransformation

func (p *AzureProvider) RequiresTransformation() bool

RequiresTransformation indicates that Azure needs Anthropic->OpenAI translation.

func (*AzureProvider) SupportsStreaming

func (p *AzureProvider) SupportsStreaming() bool

SupportsStreaming indicates that Azure supports SSE streaming.

func (*AzureProvider) TransformModelID

func (p *AzureProvider) TransformModelID(modelID string) string

TransformModelID returns the deployment name for Azure.

type CustomProvider

type CustomProvider struct {
	BaseURL string
	// contains filtered or unexported fields
}

CustomProvider implements the Provider interface for custom OpenAI-compatible endpoints.

func NewCustomProvider

func NewCustomProvider(baseURL string) *CustomProvider

NewCustomProvider creates a new custom provider.

func NewCustomProviderWithKey

func NewCustomProviderWithKey(baseURL, apiKey string) *CustomProvider

NewCustomProviderWithKey creates a new custom provider with an embedded API key. Used for multi-provider routing where each tier has its own credentials.

func (*CustomProvider) GetEndpointURL

func (p *CustomProvider) GetEndpointURL() string

GetEndpointURL returns the chat completions endpoint URL.

func (*CustomProvider) GetHeaders

func (p *CustomProvider) GetHeaders(apiKey string) http.Header

GetHeaders returns the HTTP headers for custom API requests.

func (*CustomProvider) Name

func (p *CustomProvider) Name() string

Name returns the provider name.

func (*CustomProvider) RequiresTransformation

func (p *CustomProvider) RequiresTransformation() bool

RequiresTransformation indicates that custom providers need Anthropic->OpenAI translation.

func (*CustomProvider) SupportsStreaming

func (p *CustomProvider) SupportsStreaming() bool

SupportsStreaming indicates that custom providers support SSE streaming.

func (*CustomProvider) TransformModelID

func (p *CustomProvider) TransformModelID(modelID string) string

TransformModelID returns the model ID as-is for custom providers.

type DeepSeekModel added in v0.38.0

type DeepSeekModel struct {
	ID      string `json:"id"`
	Object  string `json:"object"`
	Created int64  `json:"created"`
	OwnedBy string `json:"owned_by"`
}

DeepSeekModel represents a model from the DeepSeek API.

type DeepSeekModelsResponse added in v0.38.0

type DeepSeekModelsResponse struct {
	Object string          `json:"object"`
	Data   []DeepSeekModel `json:"data"`
}

DeepSeekModelsResponse is the response from the /v1/models endpoint.

type DeepSeekProvider added in v0.38.0

type DeepSeekProvider struct {
	BaseURL string
	// contains filtered or unexported fields
}

DeepSeekProvider implements the Provider interface for DeepSeek API. DeepSeek provides powerful coding and reasoning models via an OpenAI-compatible API.

func NewDeepSeekProvider added in v0.38.0

func NewDeepSeekProvider(apiKey string) *DeepSeekProvider

NewDeepSeekProvider creates a new DeepSeek provider with the default URL.

func NewDeepSeekProviderWithURL added in v0.38.0

func NewDeepSeekProviderWithURL(baseURL, apiKey string) *DeepSeekProvider

NewDeepSeekProviderWithURL creates a new DeepSeek provider with a custom URL. Useful for proxy configurations or self-hosted deployments.

func (*DeepSeekProvider) GetAPIKey added in v0.38.0

func (p *DeepSeekProvider) GetAPIKey() string

GetAPIKey returns the configured API key.

func (*DeepSeekProvider) GetEndpointURL added in v0.38.0

func (p *DeepSeekProvider) GetEndpointURL() string

GetEndpointURL returns the OpenAI-compatible chat completions endpoint URL. DeepSeek uses the standard /v1/chat/completions endpoint.

func (*DeepSeekProvider) GetHeaders added in v0.38.0

func (p *DeepSeekProvider) GetHeaders(apiKey string) http.Header

GetHeaders returns the HTTP headers for DeepSeek API requests. DeepSeek uses Bearer token authentication like OpenAI.

func (*DeepSeekProvider) IsAvailable added in v0.38.0

func (p *DeepSeekProvider) IsAvailable() bool

IsAvailable checks if the DeepSeek API is reachable.

func (*DeepSeekProvider) ListModels added in v0.38.0

func (p *DeepSeekProvider) ListModels() ([]string, error)

ListModels returns available DeepSeek models.

func (*DeepSeekProvider) Name added in v0.38.0

func (p *DeepSeekProvider) Name() string

Name returns the provider name.

func (*DeepSeekProvider) RequiresTransformation added in v0.38.0

func (p *DeepSeekProvider) RequiresTransformation() bool

RequiresTransformation indicates that DeepSeek needs Anthropic->OpenAI translation.

func (*DeepSeekProvider) SupportsStreaming added in v0.38.0

func (p *DeepSeekProvider) SupportsStreaming() bool

SupportsStreaming indicates that DeepSeek supports SSE streaming.

func (*DeepSeekProvider) TransformModelID added in v0.38.0

func (p *DeepSeekProvider) TransformModelID(modelID string) string

TransformModelID transforms a model ID for DeepSeek. Maps Claude model names to appropriate DeepSeek equivalents.
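A tier mapping like this is typically a substring match on the Claude model name. The sketch below illustrates the idea; the specific tier-to-model choices are assumptions (deepseek-chat and deepseek-reasoner are DeepSeek's publicly documented model IDs, but how this package assigns them to tiers is not shown here):

```go
package main

import (
	"fmt"
	"strings"
)

// transformModelID sketches the kind of mapping TransformModelID
// and DeepSeekModelTiers describe. The tier assignments here are
// assumptions, not the package's actual table.
func transformModelID(modelID string) string {
	switch {
	case strings.Contains(modelID, "opus"):
		return "deepseek-reasoner" // strongest tier -> reasoning model
	case strings.Contains(modelID, "haiku"), strings.Contains(modelID, "sonnet"):
		return "deepseek-chat"
	default:
		return modelID // pass unrecognized IDs through unchanged
	}
}

func main() {
	fmt.Println(transformModelID("claude-3-5-sonnet-20241022")) // deepseek-chat
	fmt.Println(transformModelID("claude-3-opus-20240229"))     // deepseek-reasoner
}
```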

type GeminiModel added in v0.37.0

type GeminiModel struct {
	Name                       string   `json:"name"`
	BaseModelID                string   `json:"baseModelId,omitempty"`
	Version                    string   `json:"version"`
	DisplayName                string   `json:"displayName"`
	Description                string   `json:"description"`
	InputTokenLimit            int      `json:"inputTokenLimit"`
	OutputTokenLimit           int      `json:"outputTokenLimit"`
	SupportedGenerationMethods []string `json:"supportedGenerationMethods"`
}

GeminiModel represents a model from the Gemini API.

type GeminiModelsResponse added in v0.37.0

type GeminiModelsResponse struct {
	Models []GeminiModel `json:"models"`
}

GeminiModelsResponse is the response from the /models endpoint.

type GeminiProvider added in v0.37.0

type GeminiProvider struct {
	BaseURL string
	// contains filtered or unexported fields
}

GeminiProvider implements the Provider interface for Google Gemini API. Google's Gemini models are accessed via the generativelanguage.googleapis.com API.

func NewGeminiProvider added in v0.37.0

func NewGeminiProvider(apiKey string) *GeminiProvider

NewGeminiProvider creates a new Gemini provider with the default URL.

func NewGeminiProviderWithURL added in v0.37.0

func NewGeminiProviderWithURL(baseURL, apiKey string) *GeminiProvider

NewGeminiProviderWithURL creates a new Gemini provider with a custom URL. Useful for Vertex AI or proxy configurations.

func (*GeminiProvider) GetAPIKey added in v0.37.0

func (p *GeminiProvider) GetAPIKey() string

GetAPIKey returns the configured API key.

func (*GeminiProvider) GetEndpointURL added in v0.37.0

func (p *GeminiProvider) GetEndpointURL() string

GetEndpointURL returns the OpenAI-compatible chat completions endpoint URL. Google provides an OpenAI-compatible endpoint at /v1beta/openai/chat/completions.

func (*GeminiProvider) GetEndpointURLWithKey added in v0.37.0

func (p *GeminiProvider) GetEndpointURLWithKey(model string) string

GetEndpointURLWithKey returns the endpoint URL with the API key as query param. This is used for the native Gemini endpoint.
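The native Gemini API takes the model in the URL path and the API key as a key query parameter, so a plausible sketch of this method (the generateContent verb shown here is the native non-streaming endpoint; the package's exact formatting is an assumption):

```go
package main

import (
	"fmt"
	"net/url"
)

// geminiEndpointWithKey sketches GetEndpointURLWithKey: the native
// Gemini endpoint carries the API key as a ?key= query parameter
// and the model name in the path.
func geminiEndpointWithKey(baseURL, model, apiKey string) string {
	return fmt.Sprintf("%s/models/%s:generateContent?key=%s",
		baseURL, model, url.QueryEscape(apiKey))
}

func main() {
	fmt.Println(geminiEndpointWithKey(
		"https://generativelanguage.googleapis.com/v1beta",
		"gemini-1.5-flash", "AIza-example"))
}
```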

func (*GeminiProvider) GetHeaders added in v0.37.0

func (p *GeminiProvider) GetHeaders(apiKey string) http.Header

GetHeaders returns the HTTP headers for Gemini API requests. Gemini uses an API key in the URL query parameter, not headers.

func (*GeminiProvider) GetStreamEndpointURL added in v0.37.0

func (p *GeminiProvider) GetStreamEndpointURL(model string) string

GetStreamEndpointURL returns the streaming endpoint URL.

func (*GeminiProvider) IsAvailable added in v0.37.0

func (p *GeminiProvider) IsAvailable() bool

IsAvailable checks if the Gemini API is reachable.

func (*GeminiProvider) ListModels added in v0.37.0

func (p *GeminiProvider) ListModels() ([]string, error)

ListModels returns available Gemini models.

func (*GeminiProvider) Name added in v0.37.0

func (p *GeminiProvider) Name() string

Name returns the provider name.

func (*GeminiProvider) RequiresTransformation added in v0.37.0

func (p *GeminiProvider) RequiresTransformation() bool

RequiresTransformation indicates that Gemini needs Anthropic->OpenAI translation: incoming requests are converted to OpenAI format, which Google's OpenAI-compatible endpoint then accepts directly.

func (*GeminiProvider) SupportsStreaming added in v0.37.0

func (p *GeminiProvider) SupportsStreaming() bool

SupportsStreaming indicates that Gemini supports SSE streaming.

func (*GeminiProvider) TransformModelID added in v0.37.0

func (p *GeminiProvider) TransformModelID(modelID string) string

TransformModelID transforms a model ID for Gemini. Maps Claude model names to appropriate Gemini equivalents.

type GrokModel added in v0.54.0

type GrokModel struct {
	ID      string `json:"id"`
	Object  string `json:"object"`
	Created int64  `json:"created"`
	OwnedBy string `json:"owned_by"`
}

GrokModel represents a model from the Grok API.

type GrokModelsResponse added in v0.54.0

type GrokModelsResponse struct {
	Object string      `json:"object"`
	Data   []GrokModel `json:"data"`
}

GrokModelsResponse is the response from the /v1/models endpoint.

type GrokProvider added in v0.54.0

type GrokProvider struct {
	BaseURL string
	// contains filtered or unexported fields
}

GrokProvider implements the Provider interface for xAI's Grok API. Grok provides powerful reasoning models via an OpenAI-compatible API.

func NewGrokProvider added in v0.54.0

func NewGrokProvider(apiKey string) *GrokProvider

NewGrokProvider creates a new Grok provider with the default URL.

func NewGrokProviderWithURL added in v0.54.0

func NewGrokProviderWithURL(baseURL, apiKey string) *GrokProvider

NewGrokProviderWithURL creates a new Grok provider with a custom URL. Useful for proxy configurations or self-hosted deployments.

func (*GrokProvider) GetAPIKey added in v0.54.0

func (p *GrokProvider) GetAPIKey() string

GetAPIKey returns the configured API key.

func (*GrokProvider) GetEndpointURL added in v0.54.0

func (p *GrokProvider) GetEndpointURL() string

GetEndpointURL returns the OpenAI-compatible chat completions endpoint URL. Grok uses the standard /v1/chat/completions endpoint.

func (*GrokProvider) GetHeaders added in v0.54.0

func (p *GrokProvider) GetHeaders(apiKey string) http.Header

GetHeaders returns the HTTP headers for Grok API requests. Grok uses Bearer token authentication like OpenAI.

func (*GrokProvider) IsAvailable added in v0.54.0

func (p *GrokProvider) IsAvailable() bool

IsAvailable checks if the Grok API is reachable.

func (*GrokProvider) ListModels added in v0.54.0

func (p *GrokProvider) ListModels() ([]string, error)

ListModels returns available Grok models.

func (*GrokProvider) Name added in v0.54.0

func (p *GrokProvider) Name() string

Name returns the provider name.

func (*GrokProvider) RequiresTransformation added in v0.54.0

func (p *GrokProvider) RequiresTransformation() bool

RequiresTransformation indicates that Grok needs Anthropic->OpenAI translation.

func (*GrokProvider) SupportsStreaming added in v0.54.0

func (p *GrokProvider) SupportsStreaming() bool

SupportsStreaming indicates that Grok supports SSE streaming.

func (*GrokProvider) TransformModelID added in v0.54.0

func (p *GrokProvider) TransformModelID(modelID string) string

TransformModelID transforms a model ID for Grok. Maps Claude model names to appropriate Grok equivalents.

type MiniMaxModel added in v0.54.0

type MiniMaxModel struct {
	ID      string `json:"id"`
	Object  string `json:"object"`
	Created int64  `json:"created"`
	OwnedBy string `json:"owned_by"`
}

MiniMaxModel represents a model from the MiniMax API.

type MiniMaxModelsResponse added in v0.54.0

type MiniMaxModelsResponse struct {
	Object string         `json:"object"`
	Data   []MiniMaxModel `json:"data"`
}

MiniMaxModelsResponse is the response from the /v1/models endpoint.

type MiniMaxProvider added in v0.54.0

type MiniMaxProvider struct {
	BaseURL string
	// contains filtered or unexported fields
}

MiniMaxProvider implements the Provider interface for MiniMax API. MiniMax provides powerful Chinese language models via an OpenAI-compatible API.

func NewMiniMaxProvider added in v0.54.0

func NewMiniMaxProvider(apiKey string) *MiniMaxProvider

NewMiniMaxProvider creates a new MiniMax provider with the default URL.

func NewMiniMaxProviderWithGroup added in v0.54.0

func NewMiniMaxProviderWithGroup(apiKey, groupID string) *MiniMaxProvider

NewMiniMaxProviderWithGroup creates a new MiniMax provider with a group ID. Some MiniMax endpoints require a group ID for authentication.

func NewMiniMaxProviderWithURL added in v0.54.0

func NewMiniMaxProviderWithURL(baseURL, apiKey, groupID string) *MiniMaxProvider

NewMiniMaxProviderWithURL creates a new MiniMax provider with a custom URL. Useful for proxy configurations or regional deployments.

func (*MiniMaxProvider) GetAPIKey added in v0.54.0

func (p *MiniMaxProvider) GetAPIKey() string

GetAPIKey returns the configured API key.

func (*MiniMaxProvider) GetEndpointURL added in v0.54.0

func (p *MiniMaxProvider) GetEndpointURL() string

GetEndpointURL returns the OpenAI-compatible chat completions endpoint URL. MiniMax uses the standard /v1/chat/completions endpoint.

func (*MiniMaxProvider) GetGroupID added in v0.54.0

func (p *MiniMaxProvider) GetGroupID() string

GetGroupID returns the configured group ID.

func (*MiniMaxProvider) GetHeaders added in v0.54.0

func (p *MiniMaxProvider) GetHeaders(apiKey string) http.Header

GetHeaders returns the HTTP headers for MiniMax API requests. MiniMax uses Bearer token authentication like OpenAI.

func (*MiniMaxProvider) IsAvailable added in v0.54.0

func (p *MiniMaxProvider) IsAvailable() bool

IsAvailable checks if the MiniMax API is reachable.

func (*MiniMaxProvider) ListModels added in v0.54.0

func (p *MiniMaxProvider) ListModels() ([]string, error)

ListModels returns available MiniMax models.

func (*MiniMaxProvider) Name added in v0.54.0

func (p *MiniMaxProvider) Name() string

Name returns the provider name.

func (*MiniMaxProvider) RequiresTransformation added in v0.54.0

func (p *MiniMaxProvider) RequiresTransformation() bool

RequiresTransformation indicates that MiniMax needs Anthropic->OpenAI translation.

func (*MiniMaxProvider) SupportsStreaming added in v0.54.0

func (p *MiniMaxProvider) SupportsStreaming() bool

SupportsStreaming indicates that MiniMax supports SSE streaming.

func (*MiniMaxProvider) TransformModelID added in v0.54.0

func (p *MiniMaxProvider) TransformModelID(modelID string) string

TransformModelID transforms a model ID for MiniMax. Maps Claude model names to appropriate MiniMax equivalents.

type OllamaModel added in v0.36.0

type OllamaModel struct {
	Name       string    `json:"name"`
	Model      string    `json:"model"`
	ModifiedAt time.Time `json:"modified_at"`
	Size       int64     `json:"size"`
	Digest     string    `json:"digest"`
	Details    struct {
		Format            string   `json:"format"`
		Family            string   `json:"family"`
		Families          []string `json:"families"`
		ParameterSize     string   `json:"parameter_size"`
		QuantizationLevel string   `json:"quantization_level"`
	} `json:"details"`
}

OllamaModel represents a model available in Ollama.

func GetOllamaModelInfo added in v0.36.0

func GetOllamaModelInfo(baseURL, modelName string) (*OllamaModel, error)

GetOllamaModelInfo returns detailed info about a specific model.

type OllamaProvider added in v0.36.0

type OllamaProvider struct {
	BaseURL string
	// contains filtered or unexported fields
}

OllamaProvider implements the Provider interface for Ollama local models. Ollama is an OpenAI-compatible local model server that requires no API key.

func NewOllamaProvider added in v0.36.0

func NewOllamaProvider(baseURL string) *OllamaProvider

NewOllamaProvider creates a new Ollama provider with the default URL.

func NewOllamaProviderWithKey added in v0.36.0

func NewOllamaProviderWithKey(baseURL, apiKey string) *OllamaProvider

NewOllamaProviderWithKey creates a new Ollama provider with an optional API key. Used for Ollama instances that require authentication.

func (*OllamaProvider) GetEndpointURL added in v0.36.0

func (p *OllamaProvider) GetEndpointURL() string

GetEndpointURL returns the OpenAI-compatible chat completions endpoint URL. Ollama exposes an OpenAI-compatible API at /v1/chat/completions.

func (*OllamaProvider) GetHeaders added in v0.36.0

func (p *OllamaProvider) GetHeaders(apiKey string) http.Header

GetHeaders returns the HTTP headers for Ollama API requests. Ollama typically doesn't require authentication for local use.

func (*OllamaProvider) IsRunning added in v0.36.0

func (p *OllamaProvider) IsRunning() bool

IsRunning checks if Ollama is running and accessible.

func (*OllamaProvider) ListModels added in v0.36.0

func (p *OllamaProvider) ListModels() ([]string, error)

ListModels returns a list of available models from the Ollama server.

func (*OllamaProvider) Name added in v0.36.0

func (p *OllamaProvider) Name() string

Name returns the provider name.

func (*OllamaProvider) RequiresTransformation added in v0.36.0

func (p *OllamaProvider) RequiresTransformation() bool

RequiresTransformation indicates that Ollama needs Anthropic->OpenAI translation.

func (*OllamaProvider) SupportsStreaming added in v0.36.0

func (p *OllamaProvider) SupportsStreaming() bool

SupportsStreaming indicates that Ollama supports SSE streaming.

func (*OllamaProvider) TransformModelID added in v0.36.0

func (p *OllamaProvider) TransformModelID(modelID string) string

TransformModelID returns the model ID as-is for Ollama. Ollama model names are used directly (e.g., "llama3.2", "codellama", "mistral").

type OllamaTagsResponse added in v0.36.0

type OllamaTagsResponse struct {
	Models []OllamaModel `json:"models"`
}

OllamaTagsResponse is the response from /api/tags endpoint.

type OpenAIModel added in v0.45.0

type OpenAIModel struct {
	ID      string `json:"id"`
	Object  string `json:"object"`
	Created int64  `json:"created"`
	OwnedBy string `json:"owned_by"`
}

OpenAIModel represents a model from OpenAI's models endpoint.

type OpenAIModelsResponse added in v0.45.0

type OpenAIModelsResponse struct {
	Data   []OpenAIModel `json:"data"`
	Object string        `json:"object"`
}

OpenAIModelsResponse is the response from /v1/models endpoint.

type OpenAIProvider

type OpenAIProvider struct {
	BaseURL string
	// contains filtered or unexported fields
}

OpenAIProvider implements the Provider interface for OpenAI.

func NewOpenAIProvider

func NewOpenAIProvider(baseURL string) *OpenAIProvider

NewOpenAIProvider creates a new OpenAI provider.

func NewOpenAIProviderWithKey

func NewOpenAIProviderWithKey(baseURL, apiKey string) *OpenAIProvider

NewOpenAIProviderWithKey creates a new OpenAI provider with an embedded API key. Used for multi-provider routing where each tier has its own credentials.

func (*OpenAIProvider) GetEndpointType added in v0.23.0

func (p *OpenAIProvider) GetEndpointType() translator.EndpointType

GetEndpointType returns the current endpoint type.

func (*OpenAIProvider) GetEndpointURL

func (p *OpenAIProvider) GetEndpointURL() string

GetEndpointURL returns the appropriate endpoint URL based on the target model.

func (*OpenAIProvider) GetEndpointURLForModel added in v0.23.0

func (p *OpenAIProvider) GetEndpointURLForModel(model string) string

GetEndpointURLForModel returns the endpoint URL for a specific model. This allows dynamic endpoint selection based on the model being used.

func (*OpenAIProvider) GetHeaders

func (p *OpenAIProvider) GetHeaders(apiKey string) http.Header

GetHeaders returns the HTTP headers for OpenAI API requests.

func (*OpenAIProvider) ListModels added in v0.45.0

func (p *OpenAIProvider) ListModels(apiKey string) ([]string, error)

ListModels returns available models from OpenAI's /v1/models endpoint. Requires a valid API key to be set.

func (*OpenAIProvider) Name

func (p *OpenAIProvider) Name() string

Name returns the provider name.

func (*OpenAIProvider) RequiresResponsesAPI added in v0.23.0

func (p *OpenAIProvider) RequiresResponsesAPI() bool

RequiresResponsesAPI checks if the current target model requires the Responses API.

func (*OpenAIProvider) RequiresTransformation

func (p *OpenAIProvider) RequiresTransformation() bool

RequiresTransformation indicates that OpenAI needs Anthropic->OpenAI translation.

func (*OpenAIProvider) SetTargetModel added in v0.23.0

func (p *OpenAIProvider) SetTargetModel(model string)

SetTargetModel sets the target model and updates the endpoint type accordingly. Call this before GetEndpointURL() to ensure the correct endpoint is used.
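The method names suggest that OpenAIProvider routes some models to the newer /v1/responses endpoint and the rest to /v1/chat/completions. A sketch of that selection; which model prefixes actually require the Responses API in this package is an assumption (illustrated here with "o1"):

```go
package main

import (
	"fmt"
	"strings"
)

// endpointForModel sketches the selection SetTargetModel drives:
// route certain models to /v1/responses, everything else to the
// classic chat completions endpoint. The "o1" prefix check is an
// illustrative assumption.
func endpointForModel(baseURL, model string) string {
	if strings.HasPrefix(model, "o1") {
		return baseURL + "/v1/responses"
	}
	return baseURL + "/v1/chat/completions"
}

func main() {
	fmt.Println(endpointForModel("https://api.openai.com", "gpt-4o"))
	fmt.Println(endpointForModel("https://api.openai.com", "o1-preview"))
}
```

This is why SetTargetModel must be called before GetEndpointURL: the endpoint depends on the model.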

func (*OpenAIProvider) SupportsStreaming

func (p *OpenAIProvider) SupportsStreaming() bool

SupportsStreaming indicates that OpenAI supports SSE streaming.

func (*OpenAIProvider) TransformModelID

func (p *OpenAIProvider) TransformModelID(modelID string) string

TransformModelID transforms the model ID for OpenAI.

type OpenRouterModel added in v0.45.0

type OpenRouterModel struct {
	ID               string                     `json:"id"`
	Name             string                     `json:"name"`
	Description      string                     `json:"description"`
	ContextLength    int                        `json:"context_length"`
	Pricing          OpenRouterModelPricing     `json:"pricing"`
	TopProvider      OpenRouterModelTopProvider `json:"top_provider"`
	PerRequestLimits *OpenRouterModelLimits     `json:"per_request_limits,omitempty"`
}

OpenRouterModel represents a model from OpenRouter's models endpoint.

type OpenRouterModelInfo added in v0.45.0

type OpenRouterModelInfo struct {
	ID            string
	Name          string
	Description   string
	ContextLength int
	InputPrice    float64 // Price per 1M tokens
	OutputPrice   float64 // Price per 1M tokens
	Provider      string  // Extracted from ID (e.g., "openai" from "openai/gpt-4o")
}

OpenRouterModelInfo contains detailed model information with pricing.

type OpenRouterModelLimits added in v0.50.8

type OpenRouterModelLimits struct {
	PromptTokens     int `json:"prompt_tokens"`
	CompletionTokens int `json:"completion_tokens"`
}

OpenRouterModelLimits contains per-request limits.

type OpenRouterModelPricing added in v0.50.8

type OpenRouterModelPricing struct {
	Prompt     string `json:"prompt"`     // Price per token for prompt
	Completion string `json:"completion"` // Price per token for completion
}

OpenRouterModelPricing contains pricing information for a model.
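OpenRouter reports prices as per-token decimal strings, while OpenRouterModelInfo exposes per-1M-token floats, so somewhere a conversion like this presumably happens (a sketch; the package's parsing and rounding behavior is an assumption):

```go
package main

import (
	"fmt"
	"strconv"
)

// perMillion converts OpenRouter's per-token string price
// (e.g. "0.000002") into the per-1M-token float that
// OpenRouterModelInfo exposes as InputPrice/OutputPrice.
func perMillion(perToken string) (float64, error) {
	p, err := strconv.ParseFloat(perToken, 64)
	if err != nil {
		return 0, err
	}
	return p * 1e6, nil
}

func main() {
	v, err := perMillion("0.000002") // $2 per 1M tokens
	fmt.Println(v, err)
}
```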

type OpenRouterModelTopProvider added in v0.50.8

type OpenRouterModelTopProvider struct {
	ContextLength  int  `json:"context_length"`
	MaxCompletions int  `json:"max_completion_tokens"`
	IsModerated    bool `json:"is_moderated"`
}

OpenRouterModelTopProvider contains top provider information.

type OpenRouterModelsResponse added in v0.45.0

type OpenRouterModelsResponse struct {
	Data []OpenRouterModel `json:"data"`
}

OpenRouterModelsResponse is the response from the /models endpoint.
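A sketch of decoding a /models payload into these structs, using trimmed copies of the types above and a hypothetical sample response:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Trimmed copies of the package's response structs; real responses
// carry more fields (description, top_provider, per_request_limits).
type OpenRouterModelPricing struct {
	Prompt     string `json:"prompt"`
	Completion string `json:"completion"`
}

type OpenRouterModel struct {
	ID            string                 `json:"id"`
	Name          string                 `json:"name"`
	ContextLength int                    `json:"context_length"`
	Pricing       OpenRouterModelPricing `json:"pricing"`
}

type OpenRouterModelsResponse struct {
	Data []OpenRouterModel `json:"data"`
}

// parseModels decodes a /models response body.
func parseModels(payload []byte) (OpenRouterModelsResponse, error) {
	var resp OpenRouterModelsResponse
	err := json.Unmarshal(payload, &resp)
	return resp, err
}

func main() {
	// Hypothetical, trimmed payload.
	payload := []byte(`{"data":[{"id":"openai/gpt-4o","name":"GPT-4o","context_length":128000,"pricing":{"prompt":"0.0000025","completion":"0.00001"}}]}`)
	resp, err := parseModels(payload)
	if err != nil {
		panic(err)
	}
	fmt.Println(resp.Data[0].ID, resp.Data[0].ContextLength) // openai/gpt-4o 128000
}
```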

type OpenRouterProvider

type OpenRouterProvider struct {
	BaseURL string
	// contains filtered or unexported fields
}

OpenRouterProvider implements the Provider interface for OpenRouter.

func NewOpenRouterProvider

func NewOpenRouterProvider(baseURL string) *OpenRouterProvider

NewOpenRouterProvider creates a new OpenRouter provider.

func NewOpenRouterProviderWithKey

func NewOpenRouterProviderWithKey(baseURL, apiKey string) *OpenRouterProvider

NewOpenRouterProviderWithKey creates a new OpenRouter provider with an embedded API key. Used for multi-provider routing where each tier has its own credentials.
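A hedged sketch of the multi-provider routing this enables, with a stand-in for the provider type (the tier names, base URL, and keys are hypothetical):

```go
package main

import "fmt"

// Stand-in for the package's OpenRouterProvider, reduced to the
// fields this sketch needs; apiKey mirrors the unexported field.
type OpenRouterProvider struct {
	BaseURL string
	apiKey  string
}

func NewOpenRouterProviderWithKey(baseURL, apiKey string) *OpenRouterProvider {
	return &OpenRouterProvider{BaseURL: baseURL, apiKey: apiKey}
}

func main() {
	// Hypothetical tier table: each tier routes through its own
	// provider instance carrying its own credentials.
	tiers := map[string]*OpenRouterProvider{
		"opus":   NewOpenRouterProviderWithKey("https://openrouter.example/api", "KEY_FOR_OPUS"),
		"sonnet": NewOpenRouterProviderWithKey("https://openrouter.example/api", "KEY_FOR_SONNET"),
	}
	fmt.Println(tiers["opus"].BaseURL)
}
```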

func (*OpenRouterProvider) GetChatModels added in v0.45.0

func (p *OpenRouterProvider) GetChatModels() ([]OpenRouterModelInfo, error)

GetChatModels returns only models suitable for chat completions. Filters out embedding, moderation, and non-chat models.
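A sketch of the kind of filter GetChatModels applies; the exact keyword list is an assumption, not the package's implementation:

```go
package main

import (
	"fmt"
	"strings"
)

// isChatModel drops IDs that look like embedding or moderation
// models. The keywords here are illustrative guesses.
func isChatModel(id string) bool {
	for _, kw := range []string{"embed", "moderation"} {
		if strings.Contains(id, kw) {
			return false
		}
	}
	return true
}

func main() {
	ids := []string{
		"openai/gpt-4o",
		"openai/text-embedding-3-small",
		"openai/omni-moderation-latest",
	}
	var chat []string
	for _, id := range ids {
		if isChatModel(id) {
			chat = append(chat, id)
		}
	}
	fmt.Println(chat) // [openai/gpt-4o]
}
```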

func (*OpenRouterProvider) GetEndpointURL

func (p *OpenRouterProvider) GetEndpointURL() string

GetEndpointURL returns the chat completions endpoint URL.

func (*OpenRouterProvider) GetHeaders

func (p *OpenRouterProvider) GetHeaders(apiKey string) http.Header

GetHeaders returns the HTTP headers for OpenRouter API requests.

func (*OpenRouterProvider) ListModels added in v0.45.0

func (p *OpenRouterProvider) ListModels() ([]string, error)

ListModels returns available models from OpenRouter's /models endpoint. This endpoint is publicly accessible without an API key.

func (*OpenRouterProvider) ListModelsByProvider added in v0.45.0

func (p *OpenRouterProvider) ListModelsByProvider(provider string) ([]OpenRouterModelInfo, error)

ListModelsByProvider returns models filtered by provider (e.g., "openai", "anthropic").

func (*OpenRouterProvider) ListModelsWithInfo added in v0.45.0

func (p *OpenRouterProvider) ListModelsWithInfo() ([]OpenRouterModelInfo, error)

ListModelsWithInfo returns models with full pricing and context info.

func (*OpenRouterProvider) Name

func (p *OpenRouterProvider) Name() string

Name returns the provider name.

func (*OpenRouterProvider) RequiresTransformation

func (p *OpenRouterProvider) RequiresTransformation() bool

RequiresTransformation indicates that OpenRouter needs Anthropic->OpenAI translation.

func (*OpenRouterProvider) SupportsStreaming

func (p *OpenRouterProvider) SupportsStreaming() bool

SupportsStreaming indicates that OpenRouter supports SSE streaming.

func (*OpenRouterProvider) TransformModelID

func (p *OpenRouterProvider) TransformModelID(modelID string) string

TransformModelID returns the model ID as-is for OpenRouter.

type Provider

type Provider interface {
	// Name returns the provider name.
	Name() string

	// GetHeaders returns the HTTP headers for API requests.
	GetHeaders(apiKey string) http.Header

	// GetEndpointURL returns the full URL for chat completions.
	GetEndpointURL() string

	// TransformModelID transforms a model ID for the provider.
	TransformModelID(modelID string) string

	// SupportsStreaming indicates if the provider supports SSE streaming.
	SupportsStreaming() bool

	// RequiresTransformation indicates if the provider needs Anthropic->OpenAI translation.
	RequiresTransformation() bool
}

Provider defines the interface for LLM provider backends.
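A minimal sketch of a custom backend satisfying this interface; the endpoint path and Bearer-token header scheme follow the OpenAI-compatible convention the other providers in this package use, and localProvider itself is hypothetical:

```go
package main

import (
	"fmt"
	"net/http"
)

// Provider mirrors the interface documented above.
type Provider interface {
	Name() string
	GetHeaders(apiKey string) http.Header
	GetEndpointURL() string
	TransformModelID(modelID string) string
	SupportsStreaming() bool
	RequiresTransformation() bool
}

// localProvider is a hypothetical OpenAI-compatible backend.
type localProvider struct{ baseURL string }

func (p *localProvider) Name() string { return "local" }

func (p *localProvider) GetHeaders(apiKey string) http.Header {
	h := http.Header{}
	h.Set("Authorization", "Bearer "+apiKey)
	h.Set("Content-Type", "application/json")
	return h
}

func (p *localProvider) GetEndpointURL() string {
	return p.baseURL + "/v1/chat/completions"
}

func (p *localProvider) TransformModelID(modelID string) string { return modelID }
func (p *localProvider) SupportsStreaming() bool                { return true }
func (p *localProvider) RequiresTransformation() bool           { return true }

func main() {
	var prov Provider = &localProvider{baseURL: "http://localhost:8080"}
	fmt.Println(prov.Name(), prov.GetEndpointURL())
	// → local http://localhost:8080/v1/chat/completions
}
```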

type QwenModel added in v0.54.0

type QwenModel struct {
	ID      string `json:"id"`
	Object  string `json:"object"`
	Created int64  `json:"created"`
	OwnedBy string `json:"owned_by"`
}

QwenModel represents a model from the Qwen API.

type QwenModelsResponse added in v0.54.0

type QwenModelsResponse struct {
	Object string      `json:"object"`
	Data   []QwenModel `json:"data"`
}

QwenModelsResponse is the response from the /v1/models endpoint.

type QwenProvider added in v0.54.0

type QwenProvider struct {
	BaseURL string
	// contains filtered or unexported fields
}

QwenProvider implements the Provider interface for Alibaba's Qwen API. Qwen serves multilingual models through DashScope's OpenAI-compatible API.

func NewQwenProvider added in v0.54.0

func NewQwenProvider(apiKey string) *QwenProvider

NewQwenProvider creates a new Qwen provider with the default URL.

func NewQwenProviderWithURL added in v0.54.0

func NewQwenProviderWithURL(baseURL, apiKey string) *QwenProvider

NewQwenProviderWithURL creates a new Qwen provider with a custom URL. Useful for proxy configurations or regional deployments.

func (*QwenProvider) GetAPIKey added in v0.54.0

func (p *QwenProvider) GetAPIKey() string

GetAPIKey returns the configured API key.

func (*QwenProvider) GetEndpointURL added in v0.54.0

func (p *QwenProvider) GetEndpointURL() string

GetEndpointURL returns the OpenAI-compatible chat completions endpoint URL. Qwen uses the standard /v1/chat/completions endpoint via DashScope.

func (*QwenProvider) GetHeaders added in v0.54.0

func (p *QwenProvider) GetHeaders(apiKey string) http.Header

GetHeaders returns the HTTP headers for Qwen API requests. Qwen uses Bearer token authentication like OpenAI.

func (*QwenProvider) IsAvailable added in v0.54.0

func (p *QwenProvider) IsAvailable() bool

IsAvailable checks if the Qwen API is reachable.

func (*QwenProvider) ListModels added in v0.54.0

func (p *QwenProvider) ListModels() ([]string, error)

ListModels returns available Qwen models.

func (*QwenProvider) Name added in v0.54.0

func (p *QwenProvider) Name() string

Name returns the provider name.

func (*QwenProvider) RequiresTransformation added in v0.54.0

func (p *QwenProvider) RequiresTransformation() bool

RequiresTransformation indicates that Qwen needs Anthropic->OpenAI translation.

func (*QwenProvider) SupportsStreaming added in v0.54.0

func (p *QwenProvider) SupportsStreaming() bool

SupportsStreaming indicates that Qwen supports SSE streaming.

func (*QwenProvider) TransformModelID added in v0.54.0

func (p *QwenProvider) TransformModelID(modelID string) string

TransformModelID transforms a model ID for Qwen. Maps Claude model names to appropriate Qwen equivalents.
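A hedged sketch of what such a tier mapping could look like; the package's actual table may differ, and the switch below is a guess using DashScope model-name conventions (qwen-max, qwen-plus, qwen-turbo):

```go
package main

import (
	"fmt"
	"strings"
)

// claudeToQwen maps Claude model names onto Qwen tiers by keyword.
// This mapping is illustrative, not the package's real table.
func claudeToQwen(modelID string) string {
	switch {
	case strings.Contains(modelID, "opus"):
		return "qwen-max"
	case strings.Contains(modelID, "haiku"):
		return "qwen-turbo"
	default:
		return "qwen-plus"
	}
}

func main() {
	fmt.Println(claudeToQwen("claude-3-opus-20240229")) // qwen-max
}
```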
